
Generic Approaches to Model Validation Presented at Growth Model User’s Group August 10, 2005






Presentation Transcript


  1. Generic Approaches to Model Validation. Presented at the Growth Model User’s Group, August 10, 2005. David K. Walters

  2. Phases in a Modeling Project
  • Model Identification
  • Model Fitting with Data
  • Component Model – Equation Forms

  3. Where does validation fit in?
  • Ideal Case – an integrated component of the model development process, serving as a feedback mechanism
  • Reality –
  • Best Case – done once by the modeler using a subset of the modeling data (or other, related techniques); after that it is up to the user. Feedback depends on the persistence of the user and the receptiveness of the modeler. Not integrated.
  • Probable Case – done once by the modeler using a subset of the modeling data (or other, related techniques); after that it is up to the user. The modeler takes a new job and moves on.

  4. Of what benefit is validation?
  • Increased comfort – the user better understands the situations in which the model can be reliably applied and those in which it cannot
  • Model improvements – facilitates calibration
  • To make a model applicable to a new situation – different treatments/regions/situations, different “scale”
  • Over/under runs – utilization issues
  • To weight model output with other data for the purpose of decision making (weighting usually requires some estimate of variability)

  5. Validating the overall appropriateness
  • Is the model flexible enough to reproduce desired management alternatives?
  • Does it provide sufficient detail for decision-making?
  • How efficient is the model in meeting these goals?
  Model type and resolution: whole stand models, individual tree models (distance dependent or independent), process models.
  “Everything should be made as simple as possible, but not simpler.” --Albert Einstein

  6. Validating a Model – Check the Data
  Differences in data populations: spatial, cultural, temporal.
  Some research data may be collected with such a high degree of caution that the resulting models will tend to overestimate growth and yield.

  7. Validating the Component Models
  • Model component specification
  • Equation “forms” – reasonable, consistent with established theory and/or the user’s expectations
  • Statistical, or other, “fitting” of the component equations

  8. Validating the Implementation – the Computer Software
  • Software implementation
  • Bugs
  • Adequacy of outputs / interface
  • Efficiency

  9. A couple of random thoughts – what else might make a model invalid?
  • Homogeneity – very few models operationally project plots; most project stands. Stands are assumed to be homogeneous with respect to exogenous (predictor) variables, but in reality they are full of holes.

  10. …misuse
  • Matching data inputs with model specifications
  • Site index
  • DBH thresholds – all trees versus “merchantable” or other subsets
  • Others

  11. Statistics
  So, we reach the point of wishing to conduct a data-based validation of some kind. What do we compare? Real data vs. predicted data:
  • Tree variables – DBH, height, crown, volume
  • Stand variables – QMD, TPA, BA, volume
  If using volume when comparing multiple models, make sure the volume equations are identical.

  12. What statistics to use?
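The slide leaves the choice of statistics open. As a minimal sketch, the fit statistics most often reported in growth-and-yield validation are mean bias, mean absolute error, and root mean square error; the function and the example numbers below are illustrative assumptions, not from the original talk:

```python
import math

def validation_stats(observed, predicted):
    """Common validation statistics for paired observed/predicted values.

    Residuals follow the convention residual = observed - predicted,
    so a positive mean bias means the model under-predicts.
    """
    n = len(observed)
    residuals = [o - p for o, p in zip(observed, predicted)]
    bias = sum(residuals) / n                            # mean bias (signed)
    mae = sum(abs(r) for r in residuals) / n             # mean absolute error
    rmse = math.sqrt(sum(r * r for r in residuals) / n)  # root mean square error
    return {"bias": bias, "mae": mae, "rmse": rmse}

# Hypothetical observed vs. model-predicted basal area values
obs = [24.1, 30.5, 18.2, 27.9]
pred = [23.0, 31.2, 19.5, 26.4]
stats = validation_stats(obs, pred)
```

Bias alone can mask problems (large positive and negative errors cancel), which is why MAE or RMSE is usually reported alongside it.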

  13. The Overall Project – Two Approaches
  • Case 1 – we have repeat measurements (growth data):
  • Using the observed inputs, run the real data through the model. Look at time 2 (or 3, etc.) predicted versus real. Calculate statistics.
  • Case 2 – no repeat measurements:
  • Simulation study
  • Identify a matrix of input variables (site, density, stocking, etc.) that covers the range of interest
  • Run the model for each row of the input matrix
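The Case 2 simulation study can be sketched as follows. The grid levels and the `run_model` stub are illustrative assumptions; in a real study, `run_model` would be replaced by a call to the growth model under evaluation:

```python
from itertools import product

def run_model(site_index, density, stocking):
    """Stand-in for the growth model being validated (hypothetical).

    Returns a projected volume; a real study would invoke the actual simulator.
    """
    return 10.0 * site_index + 0.05 * density + 2.0 * stocking

# Factorial matrix of input variables covering the range of interest
site_levels = [60, 70, 80]          # site index
density_levels = [200, 400, 600]    # trees per acre
stocking_levels = [0.6, 0.8, 1.0]   # relative stocking

input_matrix = list(product(site_levels, density_levels, stocking_levels))

# Run the model for each row of the input matrix
results = [(row, run_model(*row)) for row in input_matrix]
```

With no observed data to compare against, the outputs from such a matrix are judged by the pattern checks on the next slide rather than by residual statistics.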

  14. Validation – Patterns and Trends
  In either case, you will want to look for trends – how the predictions (or the residuals, if you have real data) change. Examples:
  • MAI over time
  • TPA over time
  • Results vs. predictor variables (site, treatment, density)
  • How do long-term predictions compare to “laws” (self-thinning, etc.)?
  • How does the model prediction compare to other models?
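One simple way to look for such trends: group the residuals by a predictor variable and check whether mean bias drifts across groups. The grouping by site class and the residual values below are illustrative assumptions:

```python
from collections import defaultdict

def bias_by_group(residuals, groups):
    """Mean residual (bias) within each level of a grouping variable.

    A bias that trends with the predictor signals a systematic lack of fit.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for r, g in zip(residuals, groups):
        sums[g] += r
        counts[g] += 1
    return {g: sums[g] / counts[g] for g in sums}

# Hypothetical residuals (observed - predicted) tagged with site class
residuals = [1.2, 0.8, -0.1, 0.1, -1.5, -2.0]
site_class = ["low", "low", "medium", "medium", "high", "high"]
trend = bias_by_group(residuals, site_class)
# A shift from positive to negative bias across site classes would
# suggest the model over-predicts on the better sites.
```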

  15. In Summary
  • Identify the alternative “models”; establish a frame of reference.
  • Examine the big picture.
  • Look at the sample used in the model calibration, the presumed population, and a sample of “your” population.
  • Identify the key component models. Compare predictions with data – bias and accuracy. Examine these for trends against appropriate factors.
  • Look at the overall model output and the computer code. Are there errors? Evaluate output with data (volume per acre – aggregated variables).

  16. Final Points
  Remember, there is always an alternative model. When evaluating a model, give careful thought to the alternative. How well a model performs in relation to the alternative is generally the most relevant question. Validity is relative, as are other things.

  17. Questions?
  “All you need in this life is ignorance and confidence -- and then success is sure.” --Mark Twain
