## Observing System Design and Targeted Observing


**Observing System Design and Targeted Observing**
Carolyn Reynolds, Naval Research Laboratory, Monterey, CA
• I. Observing system design
• Simple model results: effective observing networks; impact of a time-dependent basic state
• Considerations for operational systems
• II. Targeted observing
• Current techniques and programs
• Predicting data impact
• Issues: validation and sampling; nonlinearities; model error
Thanks to Craig Bishop, Rolf Langland, Nancy Baker

**Observing System Design: Idealized System**
For time-independent dynamics and observation operators M and H, the Kalman filter covariances become time-invariant. For Q = 0, Pf takes the form of an outer product of transformed eigenvectors of M; the eigenvalues of M and the projection of its eigenvectors onto R-1 determine the transform. Only the growing normal modes are required to represent the error covariances precisely, so the rank of the error covariance is much smaller than the degrees of freedom of M. This does not hold with Q ≠ 0, nor with time-evolving M.
Bishop, Reynolds, Tippett: JAS 2003

**Effective Observing Network: Simple Global Model (1449 degrees of freedom)**
• Given the current observing network, find the location of an additional column observation that minimizes trace(Pf)
• Repeat

**Effective Observing Networks: Simple Global Model**
The "effective" observing network produces forecast error variances several times smaller than other types of networks. The relative difference between observing networks decreases as the networks become more dense.

**Effective Observing Networks: Cost Function**
Minimizing trace(Pf) gives different locations for different forecast lengths. For 12-hour forecasts: 23 obs out of 150 at 500 mb. For 72-hour forecasts: 33 obs out of 150 at 500 mb.
Place observations at one level only: which level to pick? 200-mb obs (top) give lower trace(Pf) than 500-mb obs (middle). However, if the interest is in jet regions, 500-mb obs are better.
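The greedy "add the observation that minimizes trace(Pf), then repeat" procedure from the effective-network slides can be sketched with a scalar Kalman covariance update. Everything below (dimensions, the random prior covariance, the observation error variance) is an illustrative stand-in, not the 1449-degree-of-freedom model from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n-dimensional state with a random symmetric prior covariance.
n = 30
A = rng.standard_normal((n, n))
Pf = A @ A.T / n + 0.5 * np.eye(n)   # forecast (prior) error covariance
r = 0.2                               # assumed observation error variance

def update(P, i, r):
    """Kalman covariance update for one direct observation of state component i."""
    h = np.zeros(n); h[i] = 1.0
    k = P @ h / (h @ P @ h + r)       # Kalman gain for this scalar observation
    return P - np.outer(k, h @ P)

# Greedy selection: repeatedly add the observation that minimizes trace(Pa).
P = Pf.copy()
chosen = []
for _ in range(5):
    cand = [i for i in range(n) if i not in chosen]
    traces = [np.trace(update(P, i, r)) for i in cand]
    best = cand[int(np.argmin(traces))]
    chosen.append(best)
    P = update(P, best, r)

print(chosen, np.trace(Pf), np.trace(P))
```

Each added observation subtracts a positive-semidefinite term from the covariance, so the trace decreases at every step; the greedy order is what depends on the network already in place.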
How do you pick the forecast error component to minimize?

**Effective Observing Networks: Function of M**
Effective observing networks will change with the basic state, and the real atmospheric state changes quickly (zonal vs. blocked regimes). Adaptive observing: use the (time-dependent) dynamics to inform the configuration of the adaptive component of the observing network. What does the real observing network look like?

**FNMOC NOGAPS DATA ASSIMILATION: Radiosondes**
Radiosondes provide atmospheric profiles with high vertical resolution, but there are very few radiosondes over ocean basins (Arlene dropsondes in pink).

**Dynamic Amplification of Perturbations**
Forecasting is a global problem.

**FNMOC NOGAPS DATA ASSIMILATION: Buoy Coverage**
Surface observations over the ocean: buoys.

**FNMOC NOGAPS DATA ASSIMILATION: Ship/Coastal Coverage**
Surface observations over the ocean: ships.

**FNMOC NOGAPS DATA ASSIMILATION: Meteorological Data Collection and Reporting System**

**FNMOC NOGAPS DATA ASSIMILATION: International Aircraft Meteorological Data Report**

**FNMOC NOGAPS DATA ASSIMILATION: DMSP Special Sensor Microwave/Imager, Ocean Surface Winds**
Surface observations over the ocean: surface wind speeds from satellite.

**FNMOC NOGAPS DATA ASSIMILATION: CIMSS/Univ. of Wisconsin Feature-Tracked Winds Coverage**
Feature-tracked winds provide some upper-level data over oceans.

**FNMOC NOGAPS DATA ASSIMILATION: Advanced Microwave Sounding Unit, All Data**
Steve Swadley, NRL. Assimilation of satellite radiances can provide temperature and humidity profiles: 67,000 observations from one satellite, for one channel, in one 3-hour interval.

**FNMOC NOGAPS DATA ASSIMILATION: Advanced Microwave Sounding Unit, Data after Thinning and QC**
Steve Swadley, NRL. After data thinning and quality control, the count is reduced from 67,000 to 2,000.
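One simple way to do the kind of spatial thinning described above is grid-box thinning: keep at most one observation per latitude-longitude box. This is a generic sketch under assumed uniform random observation locations, not the actual NOGAPS/NAVDAS thinning algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical swath of 67,000 satellite observations (lat, lon in degrees).
lats = rng.uniform(-90, 90, 67000)
lons = rng.uniform(-180, 180, 67000)

def thin(lats, lons, box_deg=5.0):
    """Keep at most one observation per box_deg x box_deg grid box."""
    keys = (np.floor(lats / box_deg).astype(int),
            np.floor(lons / box_deg).astype(int))
    _, first = np.unique(np.stack(keys, axis=1), axis=0, return_index=True)
    return np.sort(first)             # indices of the retained observations

kept = thin(lats, lons)
print(len(kept))   # at most the number of 5x5-degree boxes, 36 * 72 = 2592
```

Varying `box_deg` trades data volume against coverage; the slide's 67,000-to-2,000 reduction corresponds to a coarser effective box than the 5 degrees assumed here.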
Is there a better way to perform spatial thinning?

**FUTURE DATA ASSIMILATION**
The amount of satellite data is increasing dramatically. The current satellite sounder (AMSU-A) has 8 channels currently assimilated; for a new hyperspectral satellite sounder, roughly 10-100 channels out of 8000 must be selected for assimilation.

**Selective Thinning of Data: OSEs, OSSEs, Observation Sensitivity**
• Observing System Experiments (OSEs): data-denial experiments. Can be very useful, but expensive.
• Observing System Simulation Experiments (OSSEs): run the analysis-forecast cycle with and without simulated observations. Tests hypothetical data, but needs accurate error statistics.
• Observation sensitivity: use the adjoints of the forecast model (NOGAPS adjoint) and the data assimilation system (NAVDAS adjoint) to find the sensitivity of forecast errors to observations. Efficient, but assumes a perfect model and linear error growth.

**Observation Sensitivity using the Adjoint of the DA System**
ef = <(xf - xt), C(xf - xt)> ; Jf = ef / 2
∂Jf/∂xf = C(xf - xt)
∂Jf/∂xa = LT ∂Jf/∂xf
∂Jf/∂y = [HPbHT + R]-1 HPb ∂Jf/∂xa
xa - xb = PbHT [HPbHT + R]-1 (y - Hxb)
The observation sensitivity ∂Jf/∂y is used to estimate how the forecast error changes when small perturbations are added to actual or hypothetical observations. Two gradient calculations can be used to estimate the impact of observations on the reduction in forecast error between the 48-h and 42-h forecasts.
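The adjoint chain above can be checked numerically on a small linear system. All the matrices below (propagator, observation operator, covariances, metric) are random stand-ins, not NOGAPS or NAVDAS components; the point is that the chained gradient ∂Jf/∂y predicts, to first order, how the forecast error norm responds to a perturbed observation.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 8, 4                               # state and observation dimensions

# Hypothetical linear pieces of the system:
L  = rng.standard_normal((n, n)) * 0.3 + np.eye(n)   # tangent-linear propagator
H  = rng.standard_normal((p, n))                     # observation operator
Pb = np.eye(n) * 0.5                                 # background error covariance
R  = np.eye(p) * 0.2                                 # observation error covariance
C  = np.eye(n)                                       # forecast-error metric

xb = rng.standard_normal(n)               # background state
xt = rng.standard_normal(n)               # "truth" at forecast time
y  = H @ xb + rng.standard_normal(p) * 0.1

# Analysis and forecast (Jf = ef / 2 with ef = <(xf - xt), C(xf - xt)>):
K  = Pb @ H.T @ np.linalg.inv(H @ Pb @ H.T + R)      # gain: xa - xb = K (y - H xb)
xa = xb + K @ (y - H @ xb)
xf = L @ xa

dJ_dxf = C @ (xf - xt)                    # dJf/dxf = C (xf - xt)
dJ_dxa = L.T @ dJ_dxf                     # dJf/dxa = L^T dJf/dxf
dJ_dy  = np.linalg.inv(H @ Pb @ H.T + R) @ H @ Pb @ dJ_dxa   # dJf/dy

# First-order estimate of the forecast-error change from perturbing one ob:
J  = lambda x: 0.5 * (x - xt) @ C @ (x - xt)
dy = np.zeros(p); dy[0] = 1e-4
J_pert = J(L @ (xb + K @ (y + dy - H @ xb)))
print(J_pert - J(xf), dJ_dy @ dy)         # the two agree to first order
```

Because the system here is linear and Jf quadratic, the only discrepancy between the two printed numbers is the second-order term in the perturbation size.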
Baker and Daley 2000; Langland and Baker 2004

**Data Selection (Intelligent Thinning of Satellite Data)**
[Figure: global forecast error reduction (J kg-1) per channel, 42-h forecast error minus 48-h forecast error, for NOAA-15 and NOAA-16 AMSU channels, December 2003.]
Efficient ways to estimate data impact are critical for intelligent selection of satellite data. Langland and Baker

**Observing System Design**
• An effective observing network design will be:
• A function of the metric
• A function of the dynamics: daily, regime (blocked vs. zonal), seasonal or interannual
• Efficient ways to select satellite data are needed

**Observing System Design and Targeted Observing**
• II. Targeted observing
• Key components of analysis error
• Current techniques and programs
• Predicting data impact
• Issues: validation and sampling; nonlinearities; model error

**Forecast Errors and Key Initial Perturbations: Singular Vectors and Pseudo-inverse Corrections**
M e0 = ef ; M = U D VT ; e0 = M-1 ef = V D-1 UT ef
Compose the pseudo-inverse of the 3 leading SVs to find the fast-growing component of the initial perturbation:
e03 = V3 D3-1 U3T ef = Σk=1,3 vk dk-1 <uk; ef>
Subtract e03 from the analysis, then run the nonlinear corrected forecast. Compare the nonlinear corrected forecast with the "linear" corrected forecast:
eflin = ef - Σk=1,3 uk <uk; ef>
Do SVs look like forecast errors? Can pseudo-inverse initial perturbations improve forecasts?

**Large Forecast Error: Expected Linear Correction vs. Actual Nonlinear Correction**
Small changes to the initial state can result in significant error reduction. What is the impact of additional observations in these key regions?

**Adaptive Observing: Simple-model Experiments**
Lorenz and Emanuel (1998): it is better to place supplemental observations where analysis errors are greatest, rather than where the forecast is most sensitive.
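The truncated pseudo-inverse correction above is a direct SVD computation, sketched here on a random linear propagator M (an illustrative stand-in for a tangent-linear NWP model). It shows the identity used on the slide: removing M e03 from the forecast error is the same as projecting out the span of the three leading left singular vectors.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20

# Hypothetical tangent-linear propagator M and a forecast error ef = M e0.
M  = rng.standard_normal((n, n))
e0 = rng.standard_normal(n) * 0.1         # unknown initial error
ef = M @ e0                               # forecast error it maps to

# SVD: M = U D V^T.  Keep the 3 leading singular vectors.
U, d, Vt = np.linalg.svd(M)
k = 3
# Pseudo-inverse estimate of the fast-growing part of the initial error:
e0_3 = Vt[:k].T @ ((U[:, :k].T @ ef) / d[:k])
# "Linear" corrected forecast error: remove the component of ef in span(u1..u3).
ef_lin = ef - U[:, :k] @ (U[:, :k].T @ ef)

print(np.linalg.norm(ef), np.linalg.norm(ef_lin),
      np.linalg.norm(ef - M @ e0_3))      # last two match in exact arithmetic
```

In the full nonlinear system the subtracted perturbation also alters the basic state, which is why the slides compare the nonlinear corrected forecast against this linear expectation.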
Hansen and Smith (2000): dynamical guidance is useful for adaptive observations as long as the linear assumption is valid. Morss et al. (2001): adaptive observations have a bigger impact in sparse networks than in dense networks. Ehrendorfer and Tribbia (1997): combine information about the dynamics and the initial errors via analysis error covariance singular vectors.

**Optimal Perturbation Growth: Day-to-Day Changes**
Singular vectors maximize the growth ratio <Lp0; B Lp0> / <p0; A p0>. [Figure: optimal perturbation growth for 10 Jan through 15 Jan.]

**Atmospheric Adaptive Observing Techniques**
• Ensemble Transform Kalman Filter (Bishop et al. 2001)
• Use ensembles to construct approximate error covariances
• Assess various adaptive observing configurations for the hypothetical reduction in forecast error variance
• Singular vectors
• Add additional observations to "sensitive" regions
• Sensitive to the metric: approximations to Pa-1
• Hessian SVs (ECMWF, Barkmeijer et al. 1998)
• Variance SVs (NRL, Gelaro et al. 2002)
• Analysis error covariance SVs (Hamill et al. 2003)
• Observation sensitivity (Baker and Daley 2000)
• Use the adjoint of the forecast model and data assimilation system to find the sensitivity of forecasts to changes in the observations.

**Use the Ensemble Transform Kalman Filter (ETKF) for a Quantitative Estimate of Forecast Error Variance Reduction**
The signal of observations associated with observation operator H at forecast time t is
xa(t) - xf(t) = L(t,ta) Pf HT (H Pf HT + R)-1 [y - H xf(ta)]
where xf is the forecast without the observations, xa the forecast with them, L(t,ta) the tangent linear propagator, Pf the first-guess error covariance matrix, H the observation operator, R the observation error covariance matrix, (H Pf HT + R) the innovation vector covariance matrix, and y - H xf(ta) the innovation vector.

**Ensemble Transform Kalman Filter**
• Signal covariance: S(t|H) = L(t,ta) Pf HT (H Pf HT + R)-1 H Pf L(t,ta)T
• Prediction error covariance: P(t|H) = L(t,ta) Pf L(t,ta)T + Q - S(t|H), where Q is the model error. Thus:
• SIGNAL VARIANCE = REDUCTION IN ERROR VARIANCE
Bishop et al. (2001)

**Adaptive Observing: ETKF for Winter Storm Reconnaissance**
The signal variance for flight track 23 is largest.
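The ETKF identity on the slide, signal variance equals reduction in error variance, can be verified directly on small random matrices. The propagator, covariances, and observation operator below are arbitrary stand-ins; only the algebraic relationship between S(t|H) and P(t|H) is the point.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 10, 3

Lp = rng.standard_normal((n, n)) * 0.4 + np.eye(n)  # tangent-linear propagator L(t,ta)
A  = rng.standard_normal((n, n))
Pf = A @ A.T / n + 0.1 * np.eye(n)                  # first-guess error covariance
H  = rng.standard_normal((p, n))                    # candidate observation operator
R  = np.eye(p) * 0.3                                # observation error covariance
Q  = np.eye(n) * 0.05                               # model error covariance

innov_inv = np.linalg.inv(H @ Pf @ H.T + R)         # inverse innovation covariance
S = Lp @ Pf @ H.T @ innov_inv @ H @ Pf @ Lp.T       # signal covariance S(t|H)
P_no  = Lp @ Pf @ Lp.T + Q                          # prediction error, no obs
P_obs = P_no - S                                    # prediction error, with obs

# The signal variance is exactly the reduction in prediction error variance,
# and the remaining covariance stays positive definite:
print(np.trace(S), np.trace(P_no) - np.trace(P_obs))
```

In practice the ETKF evaluates trace(S) for each candidate deployment (flight track) from ensemble-estimated covariances, so competing configurations can be ranked without rerunning the forecast system.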
The signal variance for flight track 46 is largest assuming the track 23 observations are already assimilated. Majumdar et al. 2002

**Observation Sensitivity using the Adjoint of the DA System**
Previous targets were based on forecast sensitivity to changes in the analysis (xa). Now the analysis sensitivity to changes in the observations (y) is also used, via the adjoints of NOGAPS (Navy Operational Global Atmospheric Prediction System) and NAVDAS (NRL Atmospheric Variational Data Assimilation System), together with the observation operator and the background and observation error covariances. This quantifies the expected impact of additional observations on forecast error variance and allows examination of hypothetical data distributions and observing platforms. Baker and Daley, 2000

**Adaptive Observation Products for THORPEX**
Variance singular vector target regions (top) usually match regions of high observation sensitivity (right).

**Atmospheric Adaptive Observing Programs**
• FASTEX (1997)
• NORPEX (1998)
• Winter Storm Reconnaissance (ongoing)
• Atlantic THORPEX Regional Campaign (2003): dropsondes, off-time radiosondes, additional commercial aircraft, etc.
• Hurricane Research Division (ongoing)
• DOTSTAR (Taiwan typhoon adaptive observing)

**Adaptive Observing Products for Tropical Cyclones**
Majumdar, Abrams, Bishop, Buizza, Peng, Reynolds

**Issues: Validation**
How do we assess the "optimality" of an adaptive deployment? Can we predict observation impact on forecast error variance? A large sample is needed for validation:
ef = M e0 ; M Pa MT = Pf
Langland et al. 1999

**Validation: Predicting Observation Impact with ETKF**
On average, larger ETKF-predicted signal variance does correspond to larger forecast signal. Would the spread-skill relationship improve with an ensemble transform data assimilation system instead of 3DVAR? Majumdar et al. 2002
**Validation: Predicting Observation Impact with ETKF**
[Figure: squared forecast signal vs. ETKF signal variance, from target time to verification time.]
Evaluate the ETKF signal variance prediction when using a data assimilation scheme with flow-dependent covariances. Bishop, Majumdar, and Toth

**Validation: Observation Sensitivity**
December 2003: impact of rawinsondes and dropsondes on 42-h forecast error. These results show the impact of the different soundings on forecast error reduction during the Atlantic THORPEX regional campaign. Rolf Langland

**Observation Impact: Atlantic THORPEX Regional Campaign**
Cumulative observation impact on 48-h forecast error (J kg-1) from observations assimilated in NAVDAS at 1800 UTC in the NA-TReC domain (10°N-70°N, 100°W-40°E) from 1 November to 31 December 2003. Rolf Langland

| Observation Type | Err. Red. (J kg-1) | % of total | # obs | Err. Red. per ob (10-5 J kg-1) |
| --- | --- | --- | --- | --- |
| Aircraft | -17.54 | 46.3% | 1,658,355 | -1.1 |
| AMSU-A | -5.86 | 15.5% | 739,547 | -0.8 |
| Geosat winds | -5.18 | 13.6% | 621,526 | -0.8 |
| Land-surface | -3.53 | 9.3% | 304,766 | -1.2 |
| Rawinsondes | -3.06 | 8.1% | 202,522 | -1.5 |
| Ship-surface | -2.04 | 5.4% | 98,796 | -2.1 |
| Dropsondes | -0.67 | 1.8% | 13,418 | -5.0 |
| Total | -37.88 | 100% | 3,638,930 | -1.0 |

**Validation: Consistency of Target Method and DA-Forecast System**
[Figure: 250-hPa U at 00Z 03 Feb with adaptive observations; 250-hPa U at 00Z 04 Feb with commercial aircraft observations.]
Ideally, observations would not just change the analysis but would also change the construction of the initial-time ensemble perturbations. Validation should encompass the impact on ensemble spread.

**Issues: Linearity Assumption**
Nonlinear evolution of positive and negative SV perturbations (solid) and the sum of the positive and negative perturbations (dashed), at 12 (red), 24 (blue), and 48 (brown) hours. Nonlinear perturbation growth is far more significant on smaller scales.
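The per-observation column in the NA-TReC impact table is just the total error reduction divided by the observation count. A quick check of that arithmetic, with the totals and counts transcribed from the table:

```python
# Per-observation impact: total error reduction (J/kg) divided by ob count,
# expressed in units of 1e-5 J/kg as in the table.
rows = {
    "Aircraft":     (-17.54, 1_658_355),
    "AMSU-A":       (-5.86,    739_547),
    "Geosat winds": (-5.18,    621_526),
    "Land-surface": (-3.53,    304_766),
    "Rawinsondes":  (-3.06,    202_522),
    "Ship-surface": (-2.04,     98_796),
    "Dropsondes":   (-0.67,     13_418),
}
per_ob = {name: tot / count * 1e5 for name, (tot, count) in rows.items()}
for name, v in sorted(per_ob.items(), key=lambda kv: kv[1]):
    print(f"{name:12s} {v:6.2f}")
```

The ranking makes the targeting point of the campaign: aircraft dominate the cumulative impact through sheer numbers, while dropsondes, the targeted platform, have by far the largest impact per observation.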
At short forecast times, nonlinearities are primarily due to diabatic processes.

**Issues: Linearity Assumption**
Relative nonlinearity index (Gilmour and Smith): ||δ+ + δ-|| / [0.5 (||δ+|| + ||δ-||)]
At 12 and 24 hours, larger perturbations have smaller relative nonlinearities, and the relative nonlinearities decrease between 12 and 24 hours.

**Issues: Linearity Assumption**
Nonlinearities increase with perturbation size for adiabatic perturbations; large nonlinearities are a function of diabatic processes.

**Issues: Model Error**
Model errors and model differences behave quite differently, especially in the tropics.

**Adaptive Observing and Observing System Design: Outstanding Issues**
• Quantitative estimates of observation impact on forecast error variance (are the methods consistent with the DA and ensemble forecasting systems?)
• Sampling issues
• Limitations of the linear assumption
• Limitations of the perfect-model assumption
• Efficient ways to selectively thin satellite data (will it matter?)

**Issues: Linearity Assumption**
Hurricane singular vectors: positive and negative perturbation 850-mb vorticity, linear vs. nonlinear perturbations. In the full nonlinear forecasts, the perturbations alter the basic state; the positive and negative perturbations are approximately symmetric but exhibit a phase shift.
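The Gilmour-and-Smith-style index used above measures relative nonlinearity from a twin pair of perturbations, plus and minus, evolved through the full model: for a perfectly linear model the evolved twins cancel and the index is zero. A minimal sketch, using a toy Lorenz-63 integration as a stand-in for the forecast model (the initial state, perturbation direction, and integration settings are all illustrative assumptions):

```python
import numpy as np

def model(x, steps=50, dt=0.01):
    """Toy nonlinear forecast model: forward-Euler Lorenz-63 for steps*dt time."""
    for _ in range(steps):
        dx = np.array([10.0 * (x[1] - x[0]),
                       x[0] * (28.0 - x[2]) - x[1],
                       x[0] * x[1] - (8.0 / 3.0) * x[2]])
        x = x + dt * dx
    return x

x0 = np.array([1.0, 1.0, 20.0])           # assumed basic state

def theta(eps):
    """Relative nonlinearity: ||d+ + d-|| / [0.5 (||d+|| + ||d-||)]."""
    p = np.array([eps, 0.0, 0.0])          # +/- twin perturbations
    dplus  = model(x0 + p) - model(x0)
    dminus = model(x0 - p) - model(x0)
    return np.linalg.norm(dplus + dminus) / (
        0.5 * (np.linalg.norm(dplus) + np.linalg.norm(dminus)))

print(theta(1e-4), theta(1e-1))            # index grows with perturbation size
```

For a tiny perturbation the twins nearly cancel and the index is close to zero; as the perturbation grows, second-order terms survive the cancellation and the index rises, which is how the linearity assumption is quantified in practice.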