
Presentation Transcript


  1. Overview of last section In fact, the data assimilation cycle can be thought of as having four subcomponents: 1 Quality control (data checking); 2 Objective analysis; 3 Initialization; 4 Short forecast to prepare the next background field. We will focus on 2 (objective analysis): SCM, OI, 3D-Var, 4D-Var, and the Kalman filter (KF).

  2. 5.2 Quality control of observations • The observational data we use contain several kinds of errors: • Instrument errors • Errors of representativeness • Errors of human origin • The use of an observation with a gross error can cause a disproportionately large error in the analysis, so there has been a tendency to use observations conservatively.

  3. Early quality control Early quality control systems were based on several checks performed in series (one after another) before the analysis, for example DiMego et al. (1985): • First, a gross check compared each observation with a climatological distribution to see whether it was within a reasonable range. • If it passed, a buddy check compared it with the average of nearby observations. A minimal sketch of these two checks follows.
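As a rough illustration, here is a minimal Python sketch of the two serial checks; the function names, the 4-sigma cutoff, and the tolerance are illustrative assumptions, not values from DiMego et al.

```python
import numpy as np

def gross_check(obs, clim_mean, clim_std, n_sigma=4.0):
    """Gross (climatological) check: accept an observation only if it
    lies within a plausible range around the climatological mean."""
    return abs(obs - clim_mean) <= n_sigma * clim_std

def buddy_check(obs, buddy_values, tol):
    """Buddy check: accept an observation only if it agrees with the
    average of nearby ('buddy') observations to within a tolerance."""
    return abs(obs - np.mean(buddy_values)) <= tol

# Illustrative use: a 35 degC reading where climatology is 10 +/- 5 degC
# fails the gross check and is never passed on to the buddy check.
obs = 35.0
if gross_check(obs, clim_mean=10.0, clim_std=5.0) and \
        buddy_check(obs, buddy_values=[9.5, 11.0, 10.2], tol=3.0):
    print("observation accepted")
else:
    print("observation rejected")
```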

  4. OI quality control • Quality control performed within the optimal interpolation (OI) analysis, as at NCEP.

  5. Quality control performed within 3D-Var and 4D-Var • As at ECMWF.

  6. 5.3 Empirical analysis schemes • 5.3.1 Successive corrections method (SCM) • The first analysis method used in 4DDA was based on an empirical approach known as the SCM, developed by Bergthorsson and Döös (1955) in Sweden and by Cressman (1959) of the US Weather Service. • First estimate (zeroth iteration): $f_i^0 = f_i^b$, where $f_i^b$ is the background field evaluated at the ith grid point and $f_i^0$ is the corresponding zeroth-iteration estimate.

  7. The following iterations are obtained by "successive corrections":

$f_i^{n+1} = f_i^n + \frac{\sum_{k=1}^{K_i^n} w_{ik}^n \left( f_k^o - f_k^n \right)}{\sum_{k=1}^{K_i^n} w_{ik}^n + \varepsilon^2}$

where $f_i^n$ is the nth iteration estimate at grid point i, $f_k^o$ is the kth observation surrounding grid point i, $f_k^n$ is the value of the nth field estimate evaluated at the observation point k, $\varepsilon^2$ is an estimate of the ratio of the observation error variance to the background error variance, and $K_i^n$ is the number of observations within a distance $R_n$ of grid point i.

  8. The weights $w_{ik}^n$ can be defined in different ways. Cressman (1959) defined the weights in the SCM as

$w_{ik}^n = \frac{R_n^2 - r_{ik}^2}{R_n^2 + r_{ik}^2}$ for $r_{ik}^2 \le R_n^2$, and $w_{ik}^n = 0$ for $r_{ik}^2 > R_n^2$,

where $r_{ik}$ is the distance between grid point i and observation point k, and $R_n$ is the radius of influence at iteration n.
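To make slides 7 and 8 concrete, here is a minimal Python sketch of one SCM pass with Cressman weights, assuming 2-D coordinates on a flat plane; the nearest-grid-point evaluation of $f_k^n$ is a simplifying assumption (real schemes interpolate), and all names are illustrative.

```python
import numpy as np

def cressman_weight(r2, R2):
    """Cressman (1959) weight: (R^2 - r^2)/(R^2 + r^2) inside the
    radius of influence, zero outside."""
    return np.where(r2 <= R2, (R2 - r2) / (R2 + r2), 0.0)

def scm_iteration(f_grid, grid_xy, f_obs, obs_xy, R, eps2=0.0):
    """One successive-corrections pass over all grid points.

    f_grid : current estimate at the grid points, shape (n_grid,)
    grid_xy: grid point coordinates, shape (n_grid, 2)
    f_obs  : observed values, shape (n_obs,)
    obs_xy : observation coordinates, shape (n_obs, 2)
    R      : radius of influence R_n for this iteration
    eps2   : ratio of observation to background error variance
    """
    # Evaluate the current estimate at the observation points; nearest
    # grid point is a crude stand-in for proper interpolation.
    d2 = ((obs_xy[:, None, :] - grid_xy[None, :, :]) ** 2).sum(axis=-1)
    innovations = f_obs - f_grid[np.argmin(d2, axis=1)]

    f_new = f_grid.copy()
    for i, p in enumerate(grid_xy):
        r2 = ((obs_xy - p) ** 2).sum(axis=1)
        w = cressman_weight(r2, R ** 2)
        if w.sum() > 0.0:  # at least one observation within distance R
            f_new[i] = f_grid[i] + (w @ innovations) / (w.sum() + eps2)
    return f_new

# Shrinking radii as on the next slide (upper-air choice), coordinates in km:
# for R in (1500.0, 900.0):
#     f_grid = scm_iteration(f_grid, grid_xy, f_obs, obs_xy, R, eps2=0.1)
```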

  9. In the 1980s, the Swedish operational system used R1 = 1500 km, R2 = 900 km for the upper-air analysis, and R1 = 1500 km, R2 = 1200 km, R3 = 750 km, R4 = 300 km for the surface pressure analysis. The reduction of the radius of influence results in a field that reflects the large scales after the first iteration and converges towards the smaller scales after the additional iterations.

  10. In the Cressman SCM, the coefficient $\varepsilon^2$ is assumed to be zero. This results in a "credulous" analysis that more faithfully reflects the observations, and for a very small radius of influence the analysis converges to the observation values if the observations are located at the grid points. If the data are noisy (if an observation has gross errors, or if it contains an unrepresentative sample of subgrid-scale variability), this can lead to "bull's eyes" (many isolines around an unrealistic grid-point value) in the analysis. Including $\varepsilon^2 > 0$ assumes that the observations have errors, and gives some weight to the background field.

  11. Barnes (1964, 1978) developed another empirical version of the SCM that has been widely used for analyses where there is no available background or first-guess field, such as analyses of radar data or other small-scale observations. The weights are given by

$w_{ik}^n = \exp\left( -\frac{r_{ik}^2}{2 d_n^2} \right)$

The radii of influence are changed by a constant factor $\gamma$ at each iteration, $d_{n+1}^2 = \gamma d_n^2$ with $0 < \gamma < 1$. If only a few iterations are performed, only the large scales are captured; more details of the observations are reproduced in the analysis as more iterations are performed.

  12. Although the SCM is empirical, it is simple and economical, and it provides reasonable analyses. Bratseth (1986) showed that if the weights are chosen appropriately, instead of using the empirical formulas presented above, the SCM can be made to converge to the statistical (optimal) interpolation (OI) solution.

  13. Homework • Compare the Cressman and Barnes weights $w_{ik}^n$ as functions of the distance $r_{ik}$, assuming a radius of influence $R_n$; a sketch to start from is given below.
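A minimal sketch to start from, assuming $R_n$ = 1000 km and a Barnes length scale $d_n = R_n / 2$ (both arbitrary illustrative choices):

```python
import numpy as np

R = 1000.0      # assumed radius of influence R_n, in km
d = 0.5 * R     # assumed Barnes length scale d_n
r = np.linspace(0.0, 1.5 * R, 7)

w_cressman = np.where(r**2 <= R**2, (R**2 - r**2) / (R**2 + r**2), 0.0)
w_barnes = np.exp(-r**2 / (2.0 * d**2))

for ri, wc, wb in zip(r, w_cressman, w_barnes):
    print(f"r = {ri:6.0f} km   Cressman = {wc:.3f}   Barnes = {wb:.3f}")
```

Note that the Cressman weight falls to zero exactly at $r = R_n$, while the Barnes (Gaussian) weight decays smoothly and never quite reaches zero.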

  14. 5.3.2 Nudging • Another empirical and fairly widely used method. • "Nudge": to push gently, usually with one's elbow, especially to get someone's attention. • This consists of adding to the prognostic equations a term that nudges the solution towards the observations (interpolated to the model grid).

  15. Example of nudging in a primitive equations model: a relaxation term is added to the tendency of a prognostic variable, e.g.

$\frac{\partial u}{\partial t} = F(u) + \frac{u_{obs} - u}{\tau}$

where $F(u)$ stands for the model dynamics. The relaxation time scale $\tau$ is chosen based on empirical considerations and may depend on the variable.
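A minimal sketch of this idea for a single scalar variable, assuming forward-Euler time stepping and a stand-in linear-damping tendency (all names and values are illustrative):

```python
def integrate_with_nudging(u0, u_obs, tau, dt, n_steps, tendency):
    """Forward-Euler integration of du/dt = tendency(u) + (u_obs - u)/tau.
    The second term relaxes ('nudges') the model state towards the
    observed value with relaxation time scale tau."""
    u = u0
    for _ in range(n_steps):
        u = u + dt * (tendency(u) + (u_obs - u) / tau)
    return u

# Toy 'dynamics': weak linear damping standing in for the model tendency.
u_final = integrate_with_nudging(u0=0.0, u_obs=5.0, tau=3600.0,
                                 dt=60.0, n_steps=300,
                                 tendency=lambda u: -1e-4 * u)
print(u_final)  # pulled most of the way towards the observed value 5.0
```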

  16. Weaknesses of nudging • If $\tau$ is very small, the solution converges towards the observations too fast, and the dynamics do not have enough time to adjust. • If $\tau$ is too large, the errors in the model can grow too much before the nudging becomes effective. • Although this method is not generally used for large-scale assimilation, some groups use it for assimilating small-scale observations (e.g. radar observations) for which no statistical interpolation scheme is available.

  17. 5.4 Introduction to least squares methods • In this section, we present methods based on statistical estimation theory. • Assimilation of meteorological or oceanographic observations can be described as the process through which all the available information is used in order to estimate as accurately as possible the state of the atmospheric or oceanic flow. The available information essentially consists of the observations proper and of the physical laws that govern the evolution of the flow. The latter are available in practice in the form of a numerical model. The existing assimilation algorithms can be described as either sequential or variational. (after Talagrand 1997)

  18. 5.4.1 Least squares method • The best estimate of the state of the atmosphere (the analysis) is obtained by combining prior information about the atmosphere (the background or first guess) with observations, but in order to combine them optimally we also need statistical information about the errors in these pieces of information. • We will give a "baby example".

  19. Given two independent observations $T_1$ and $T_2$ of the true temperature $T_t$, we know: $T_1 = T_t + \varepsilon_1$ and $T_2 = T_t + \varepsilon_2$. We do not know $\varepsilon_1$ and $\varepsilon_2$, which are the errors of the observations. If $E(\cdot)$ represents the expected value, i.e. the average that one would obtain if making many similar measurements, we also know that the instruments are unbiased, $E(\varepsilon_1) = E(\varepsilon_2) = 0$, that the error variances are known, $E(\varepsilon_1^2) = \sigma_1^2$ and $E(\varepsilon_2^2) = \sigma_2^2$, and that the errors of the two measurements are uncorrelated, $E(\varepsilon_1 \varepsilon_2) = 0$.

  20. Then, we try to estimate $T_t$ from a linear combination of the two observations, since they represent all the information that we have about the true value of T. Analysis temperature: $T_a = a_1 T_1 + a_2 T_2$. Requiring the analysis to be unbiased, $E(T_a) = E(T_t)$, means that $a_1 + a_2 = 1$. $T_a$ will be the best estimate of $T_t$ if the coefficients are chosen to minimize the mean squared error of $T_a$: $\sigma_a^2 = E\left[ (T_a - T_t)^2 \right]$.

  21. Substituting $a_2 = 1 - a_1$ gives $\sigma_a^2 = E\left[ \left( a_1 (T_1 - T_t) + (1 - a_1)(T_2 - T_t) \right)^2 \right] = a_1^2 \sigma_1^2 + (1 - a_1)^2 \sigma_2^2$. Minimizing $\sigma_a^2$ with respect to $a_1$, i.e. setting $\partial \sigma_a^2 / \partial a_1 = 0$, we then have $a_1 = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}$ and $a_2 = \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}$: each observation is weighted in inverse proportion to its error variance. A numerical check is sketched below.
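A quick Monte Carlo check of this result, assuming Gaussian errors and illustrative values $T_t = 20$, $\sigma_1 = 1$, $\sigma_2 = 2$; the sampled analysis error variance should match the theoretical $1/\sigma_a^2 = 1/\sigma_1^2 + 1/\sigma_2^2$ that follows from these weights:

```python
import numpy as np

rng = np.random.default_rng(0)
T_t, sigma1, sigma2 = 20.0, 1.0, 2.0   # true temperature and error std devs
N = 200_000                            # number of simulated measurement pairs

T1 = T_t + sigma1 * rng.standard_normal(N)   # unbiased, independent obs
T2 = T_t + sigma2 * rng.standard_normal(N)

# Optimal weights from the derivation above: each observation is
# weighted in inverse proportion to its error variance.
a1 = sigma2**2 / (sigma1**2 + sigma2**2)
a2 = sigma1**2 / (sigma1**2 + sigma2**2)
T_a = a1 * T1 + a2 * T2

print(np.var(T_a - T_t))                      # empirical analysis error variance
print(1.0 / (1.0/sigma1**2 + 1.0/sigma2**2))  # theoretical value: 0.8
```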
