CI Verification methodology & preliminary results
lakshman@ou.edu

In short:
  • Find observed CI using radar echoes aloft
  • Compare to CI forecasts from UAH and UW
  • Find hits, misses, false alarms
  • Preliminary results
  • Discussion
1. How observed CI was determined

From radar data aloft

Observed CI
  • For verification purposes, need a “truth” field
    • Independent of the way in which CI is detected
    • Not tied to “objects”
  • Based on multi-radar reflectivity at -10C isotherm
    • Reflectivity aloft, associated with graupel formation
    • Good indication of convection
    • Less contaminated by clutter, biological echoes
      • The multi-radar reflectivity is QC’ed, but QC is not perfect
Reflectivity at -10C on 4/4/2011
  • Approx. 1 km resolution over the CONUS
Classifying CI
  • Define convection as:
    • Reflectivity at -10C exceeds 35 dBZ
  • New convection:
    • Was below 35 dBZ in the previous image
    • Images are 5 minutes apart
  • Done on a pixel-by-pixel basis
    • But allow for growth of ongoing convection (see the sketch below)
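
A minimal sketch of this pixel-by-pixel rule, assuming the -10C reflectivity grids are available as NumPy arrays five minutes apart (array and function names are illustrative, not from the presentation):

    import numpy as np

    DBZ_THRESHOLD = 35.0  # reflectivity at -10C that defines convection

    def candidate_new_convection(refl_prev, refl_curr):
        """Flag pixels that are convective now but were not 5 minutes ago.

        refl_prev, refl_curr: 2-D reflectivity (dBZ) at the -10C isotherm,
        on the same ~1 km grid, 5 minutes apart.
        """
        convective_now = refl_curr > DBZ_THRESHOLD
        convective_before = refl_prev > DBZ_THRESHOLD
        # New convection: exceeds 35 dBZ now but did not in the previous image.
        # Alignment of the previous image and the growth allowance are handled
        # by the methodology described below.
        return convective_now & ~convective_before
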
Model verification
  • The CI detection algorithm is now running in real time
    • Being used to verify NSSL-WRF model forecasts of CI
Aside: model verification
  • Probability of CI within one hour is very similar
    • But the time evolution is different
Methodology
  • Take the image at t0 and warp it to align with the image at t1
    • Warping limited to a 5-pixel movement
    • Determined by cross-correlation with a smoothness constraint imposed on it
    • 5 pixels in 5 min → 60 km/h maximum movement
  • Then, do a neighborhood search
    • A pixel above 35 dBZ with no pixel above 35 dBZ within 3 km in the aligned image is "New Convection" (see the sketch below)
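
A simplified sketch of the alignment and neighborhood search, assuming ~1 km pixels and a single global shift chosen by brute-force cross-correlation (the actual warping is local and smoothness-constrained; all names below are illustrative):

    import numpy as np
    from scipy import ndimage

    DBZ_THRESHOLD = 35.0
    MAX_SHIFT_PIX = 5   # 5 pixels in 5 min, i.e. about 60 km/h at ~1 km resolution
    ISOLATION_PIX = 3   # stand-in for the 3 km isolation distance

    def best_shift(prev, curr):
        """Brute-force search over shifts of up to +/- MAX_SHIFT_PIX pixels."""
        best, best_score = (0, 0), -np.inf
        for dy in range(-MAX_SHIFT_PIX, MAX_SHIFT_PIX + 1):
            for dx in range(-MAX_SHIFT_PIX, MAX_SHIFT_PIX + 1):
                shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
                score = np.nansum(shifted * curr)  # simple cross-correlation score
                if score > best_score:
                    best, best_score = (dy, dx), score
        return best

    def new_convection(prev, curr):
        """Pixels above 35 dBZ with no above-threshold pixel nearby in the
        aligned previous image."""
        dy, dx = best_shift(prev, curr)
        aligned_prev = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
        prev_conv = aligned_prev > DBZ_THRESHOLD
        # Dilate previous convection by the isolation radius so "nearby"
        # means within about ISOLATION_PIX km.
        near_prev_conv = ndimage.binary_dilation(prev_conv, iterations=ISOLATION_PIX)
        return (curr > DBZ_THRESHOLD) & ~near_prev_conv
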
Definition of Observed CI
  • Computed CI using 4 different distance thresholds:
    • 3 km (as described)
    • 5 km
    • 15 km
    • 25 km
  • The 15 km threshold means that a new CI pixel would have to be at least 15 km from existing convection to be considered new
    • In the HWT, this is what forecasters tended to like
    • This is what I will use for scoring
Significant cells?
  • One possible problem is that even a single pixel counts as CI
    • So, also tried looking for cells of at least 13 km^2 in area (see the sketch below)
  • This will be called ObservedCIv2
    • Tends to find only significant cells (or cells after they have grown a little bit)
    • Started doing this after some feedback on this point
      • Not available for all days
      • Can go back and recompute, but it doesn't seem to make much difference to the final scores
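
A sketch of the ObservedCIv2-style area filter, assuming ~1 km^2 pixels so that the pixel count of a connected cell approximates its area in km^2 (function name is illustrative):

    import numpy as np
    from scipy import ndimage

    MIN_AREA_KM2 = 13  # roughly 13 pixels on a ~1 km grid

    def significant_cells(new_conv_mask):
        """Keep only connected new-convection cells of at least ~13 km^2."""
        labels, nlab = ndimage.label(new_conv_mask)
        if nlab == 0:
            return np.zeros_like(new_conv_mask, dtype=bool)
        # Pixel count per labeled cell (~km^2 on this grid).
        areas = np.asarray(ndimage.sum(new_conv_mask, labels,
                                       index=np.arange(1, nlab + 1)))
        keep = np.arange(1, nlab + 1)[areas >= MIN_AREA_KM2]
        return np.isin(labels, keep)
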
2. Comparing Observed to Forecast

By finding distance between centroids

Computing distance
  • Take the ObservedCI, SatCast and UWCI grid points
    • Group contiguous pixels into objects
    • Find the centroid of each object (see the sketch below)
  • Use storm motion derived from radar echoes and the model 500 mb wind field
  • Compute the distance between each ObservedCI centroid and each forecast CI centroid
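
A minimal sketch of grouping CI pixels into objects and extracting centroids, using connected-component labeling (grid-to-latitude/longitude conversion is omitted; names are illustrative):

    from scipy import ndimage

    def ci_centroids(ci_mask):
        """Group contiguous CI pixels into objects and return their
        centroids as (row, col) grid coordinates."""
        labels, nlab = ndimage.label(ci_mask)
        if nlab == 0:
            return []
        return ndimage.center_of_mass(ci_mask, labels, index=range(1, nlab + 1))
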
Distance computation
  • Distance is computed as follows:
    • If the observed CI is outside the time window of the forecast CI (-15 to +45 min), then dist = MAXDIST
    • Project the forecast CI to the time of the observed CI
      • Using the storm motion field
    • Compute the Euclidean distance in lat-lon degrees
  • MAXDIST was set to 100 km
    • Pretty generous (see the sketch below)
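
A sketch of the pairwise distance rule under these assumptions: centroid times in minutes, storm motion as an east/north speed in km/h, and a flat-earth conversion from degrees to kilometers so the result can be compared against the 100 km MAXDIST (the presentation computes the Euclidean distance in lat-lon degrees; all names are illustrative):

    import math

    MAXDIST_KM = 100.0           # generous association threshold
    TIME_WINDOW_MIN = (-15, 45)  # allowed offset of observed CI from forecast CI

    def pair_distance(obs, fcst, storm_motion_kmh):
        """Distance (km) between an observed and a forecast CI centroid.

        obs, fcst: dicts with 'lat', 'lon' and 'time_min' keys (illustrative).
        storm_motion_kmh: (east, north) motion used to project the forecast
        centroid to the observation time.
        """
        dt_min = obs['time_min'] - fcst['time_min']
        if not (TIME_WINDOW_MIN[0] <= dt_min <= TIME_WINDOW_MIN[1]):
            return MAXDIST_KM  # outside the -15 to +45 min window
        # Project the forecast centroid along the storm motion to the
        # time of the observed CI.
        u, v = storm_motion_kmh
        coslat = math.cos(math.radians(fcst['lat']))
        proj_lon = fcst['lon'] + (u * dt_min / 60.0) / (111.0 * coslat)
        proj_lat = fcst['lat'] + (v * dt_min / 60.0) / 111.0
        # Flat-earth Euclidean distance, converted to km.
        dx = (obs['lon'] - proj_lon) * 111.0 * math.cos(math.radians(obs['lat']))
        dy = (obs['lat'] - proj_lat) * 111.0
        return math.hypot(dx, dy)
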
3. Scoring

Two ways: Hungarian match and neighborhood match

Scoring: Hungarian Match
  • Create a cost matrix of distances between each pair
    • Observed CI to forecast CI
  • Find the association of centroids that minimizes the global sum of distances
  • Any associated pair is a hit
  • Any unassociated observed CI is a miss
  • Any unassociated forecast CI is a false alarm (see the sketch below)
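
A minimal sketch of the Hungarian match, assuming a precomputed cost matrix of pairwise distances (with MAXDIST for impossible pairs) and using SciPy's linear-sum-assignment solver; the assumption that only pairs closer than MAXDIST count as hits follows the 100 km threshold on the summary slide:

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    MAXDIST_KM = 100.0

    def hungarian_score(cost):
        """cost[i, j]: distance (km) between observed centroid i and
        forecast centroid j."""
        obs_idx, fcst_idx = linear_sum_assignment(cost)  # minimize total distance
        # Associated pairs within MAXDIST are hits.
        hits = int((cost[obs_idx, fcst_idx] < MAXDIST_KM).sum())
        misses = cost.shape[0] - hits        # unassociated observed CI
        false_alarms = cost.shape[1] - hits  # unassociated forecast CI
        return hits, misses, false_alarms
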
Scoring: Neighborhood Match
  • Consider each observed CI
    • If there is any forecast CI within MAXDIST, then it is a hit
    • Otherwise, it is a miss
  • Consider each forecast CI
    • If there is no observed CI within MAXDIST, then it is a false alarm
  • More generous than the Hungarian Match
    • Since multiple forecasts can be verified by a single observation (see the sketch below)
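
A sketch of the neighborhood match on the same cost matrix; a single observed CI can verify several forecasts here, which is what makes this scoring more generous (names are illustrative):

    import numpy as np

    MAXDIST_KM = 100.0

    def neighborhood_score(cost):
        """cost[i, j]: distance (km) between observed centroid i and
        forecast centroid j."""
        close = cost < MAXDIST_KM
        obs_hit = close.any(axis=1)     # observed CI with at least one nearby forecast
        hits = int(obs_hit.sum())
        misses = int((~obs_hit).sum())  # observed CI with no nearby forecast
        false_alarms = int((~close.any(axis=0)).sum())  # forecasts with no nearby observation
        return hits, misses, false_alarms
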
Summary of numbers that matter
  • Observed CI:
    • 35 dBZ
    • 5 pixel warp in 5 minutes
    • 15 km (15 pixel) isolation for new CI
  • Significant cells area threshold (ObservedCIv2)
    • 13 km^2
  • Time Window:
    • -15 min to +45 min
  • Distance threshold:
    • Hits have to be within 100 km
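
The same numbers, collected into a single configuration for reference (a sketch; the variable names are illustrative, the values are from the slides):

    # Thresholds used throughout the verification.
    CI_VERIFICATION_CONFIG = {
        "dbz_threshold": 35.0,         # reflectivity at -10C defining convection
        "max_warp_pixels": 5,          # per 5-minute step (~60 km/h)
        "isolation_km": 15,            # distance from existing convection for new CI
        "min_cell_area_km2": 13,       # ObservedCIv2 significant-cell threshold
        "time_window_min": (-15, 45),  # observed CI relative to forecast CI
        "maxdist_km": 100.0,           # hits must be within this distance
    }
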
4. Preliminary results

Real-time images and daily scores

Real time
  • Can see ObservedCI, ObservedCIv2, UAH and UWCI algorithms at:
  • http://wdssii.nssl.noaa.gov/web/wdss2/products/radar/civer.shtml
Verification dataset
  • A dataset of centroids over the Spring Experiment is available at:
      • ftp://ftp.nssl.noaa.gov/users/lakshman/civerification.tgz
  • Contains:
    • All ObservedCI, SatCast and UWCI centroids
    • ObservedCIv2 from when we started creating it
    • Results of matching and skill scores by day
Example result for June 10, 2011
  • UAH and UWCI scores for this day (figures not reproduced in the transcript)
  • These scores are typical
Possible reason for low values
  • The cirrus mask could be a factor
  • Computing scores without taking the mask into account is problematic
    • Because the mask is so widespread, most radar-based CI happens under the mask