Generic Approaches to Model Validation
Presented at Growth Model User’s Group
August 10, 2005

David K. Walters

Phases in a Modeling Project
  • Model Identification
  • Model Fitting with Data
  • Component Model – Equation Forms

Where does validation fit in?
  • Ideal case – as an integrated component of the model-development process, providing a feedback loop between validation results and model revision
  • Reality –
    • Best case – done once by the modeler using a subset of the modeling data (or other, related techniques), then left to the user. Feedback depends on the persistence of the user and the receptiveness of the modeler. Not integrated…
    • Probable case – done once by the modeler using a subset of the modeling data (or other, related techniques), then left to the user. The modeler takes a new job and moves on…
Of what benefit is validation?
  • Increased comfort
    • The user better understands the situations in which the model can be reliably applied and those in which it cannot.
  • Model improvement – facilitates calibration
    • To make a model applicable to a new situation
      • different treatments/regions/situations
      • different “scale”
    • Over/under-runs – utilization issues
  • To weight model output with other data for decision making (weighting usually requires some estimate of variability; see the sketch below)
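As a concrete illustration of that last point, here is a minimal sketch of inverse-variance weighting, one common way to combine a model prediction with an independent field estimate. All numbers are hypothetical illustrations, not values from the presentation.

```python
# Inverse-variance weighting: combine a model prediction with an
# independent field estimate; the more precise source gets more weight.
# All values are hypothetical illustrations.

model_pred, model_var = 185.0, 25.0**2   # e.g., volume/ac from the model
field_est,  field_var = 210.0, 15.0**2   # e.g., volume/ac from a cruise

w_model = 1.0 / model_var
w_field = 1.0 / field_var

combined = (w_model * model_pred + w_field * field_est) / (w_model + w_field)
combined_se = (1.0 / (w_model + w_field)) ** 0.5

print(f"combined estimate: {combined:.1f} (SE {combined_se:.1f})")
```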
Validating the overall appropriateness
  • Is the model flexible enough to reproduce desired management alternatives?
  • Does it provide sufficient detail for decision-making?
  • How efficient is the model in meeting these goals?

Model Type & Resolution
  • Whole stand models
  • Individual tree models
  • Process models
  • Distance dependent – distance independent

Everything should be made as simple as possible, but not simpler --Albert Einstein

Validating a Model – Check the Data

Differences in data populations may arise spatially, culturally, and temporally.

Some research data may be collected with such a high degree of caution that the resultant models will tend to overestimate growth and yield. A sketch of a simple population check follows.
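Before any statistical comparison, a quick check that your population resembles the modeling data can be coded in a few lines. A minimal sketch in Python; the data frames below are simulated stand-ins, and in practice you would load the model's calibration data and your own inventory.

```python
import numpy as np
import pandas as pd

# Compare distributions of key predictors in the modeling data against
# your own population. Both frames are simulated stand-ins.
rng = np.random.default_rng(1)
fit_data = pd.DataFrame({"site_index": rng.normal(85, 10, 300),
                         "ba_per_ac":  rng.normal(120, 30, 300)})
my_data  = pd.DataFrame({"site_index": rng.normal(65,  8, 100),
                         "ba_per_ac":  rng.normal(150, 25, 100)})

for var in fit_data.columns:
    f, m = fit_data[var], my_data[var]
    print(f"{var}: fit range [{f.min():.0f}, {f.max():.0f}], "
          f"yours [{m.min():.0f}, {m.max():.0f}]")
    # values of `yours` outside the fit range signal extrapolation risk
```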

Validating the Component Models
  • Model component specification
    • Equation “forms” – reasonable, consistent with established theory and/or the user’s expectations
    • Statistical, or other, “fitting” of the component equations (see the sketch below)
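As an illustration of checking a component equation, here is a minimal sketch that fits a Chapman-Richards height-diameter curve (one common form, chosen here purely for illustration) to simulated data and inspects whether the fitted asymptote and residual bias are reasonable.

```python
import numpy as np
from scipy.optimize import curve_fit

def ht_dbh(d, a, b, c):
    """Chapman-Richards form: total height (ft) from DBH (in)."""
    return 4.5 + a * (1.0 - np.exp(-b * d)) ** c

# Simulated measurements stand in for real fitting data.
rng = np.random.default_rng(0)
dbh = rng.uniform(2, 24, 200)
ht = ht_dbh(dbh, 90.0, 0.08, 1.2) + rng.normal(0, 4, dbh.size)

(a, b, c), _ = curve_fit(ht_dbh, dbh, ht, p0=(80.0, 0.1, 1.0))
resid = ht - ht_dbh(dbh, a, b, c)

print(f"asymptotic height = {4.5 + a:.1f} ft")  # biologically plausible?
print(f"mean residual = {resid.mean():.2f} ft") # near zero => unbiased fit
```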
Validating the Implementation – the Computer Software
  • Software implementation
    • Bugs
    • Adequacy of outputs / interface
    • Efficiency
A couple of random thoughts – what else might make a model invalid?
  • Homogeneity – very few models operationally project plots; most project stands. Stands are assumed to be homogeneous with respect to exogenous or predictor variables. But… they are really full of holes.

…misuse
  • Matching Data Inputs with Model Specifications
    • Site Index
    • DBH Thresholds – all versus “merchantable” or other subsets of trees
    • Others
Statistics
  • So, we get to the point of wishing to conduct a data-based validation of some kind.
    • What do we compare? Real data vs. predicted data
      • Tree variables – DBH, height, crown, volume
      • Stand variables – QMD, TPA, BA, volume

If using volume when comparing multiple models, make sure the volume equations are identical. A sketch of the standard comparison statistics follows.
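A minimal sketch of the usual bias and accuracy summaries for any observed-vs.-predicted pair; the function name and return keys are my own, not from the presentation.

```python
import numpy as np

def validation_stats(obs, pred):
    """Bias and accuracy summaries for observed vs. predicted values
    of any tree or stand variable (DBH, height, BA, volume, ...)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    resid = obs - pred
    return {
        "bias": resid.mean(),                          # mean error
        "bias_pct": 100 * resid.mean() / obs.mean(),   # relative bias
        "rmse": np.sqrt((resid ** 2).mean()),          # accuracy
        "mae": np.abs(resid).mean(),
    }

# e.g., validation_stats(observed_volume, predicted_volume)
```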

The Overall Project – Two Approaches
  • Case 1 – We have repeat measurements (growth data)
    • Using the observed inputs, run the real data through the model. Look at time 2 (or 3, etc.) predicted versus real. Calculate statistics.
  • Case 2 – No repeat measurements
    • Simulation study (see the sketch below)
      • Identify a matrix of input variables (site, density, stocking, etc.) that covers the range of interest
      • Run the model for each row of the input matrix
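A minimal sketch of the Case 2 setup: a factorial input matrix spanning the conditions of interest. `run_model` is a hypothetical placeholder for whatever growth model is being evaluated, not a real API.

```python
from itertools import product

# Factorial matrix of inputs covering the range of interest.
site_index = [60, 80, 100]    # site index, ft
density    = [150, 300, 450]  # trees per acre at establishment
start_age  = [10, 20]         # initial stand age, yr

input_matrix = list(product(site_index, density, start_age))

for si, tpa, age in input_matrix:
    print(f"simulate: SI={si}, TPA={tpa}, start age={age}")
    # results = run_model(site=si, tpa=tpa, start_age=age, horizon=50)
    # ...collect `results` for the trend checks described next
```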
Validation – Patterns and Trends
  • In either case, you will want to look for trends in how the predictions (or residuals, if you have real data) change. Examples:
    • MAI over time
    • TPA over time
    • Results vs. predictor variables (site, treatment, density)
    • How do long-term predictions compare to “laws” (self-thinning, etc.)? See the sketch below.
    • How do the model’s predictions compare to other models?
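One such “law” is Reineke’s self-thinning rule, SDI = TPA × (QMD/10)^1.605. A minimal sketch of the check, using a hypothetical predicted stand trajectory and a hypothetical maximum SDI:

```python
import numpy as np

# Hypothetical model-predicted stand trajectory over time.
qmd = np.array([ 6.0,  9.0, 12.0, 15.0])  # quadratic mean diameter, in
tpa = np.array([450., 380., 300., 240.])  # trees per acre

sdi = tpa * (qmd / 10.0) ** 1.605  # Reineke's stand density index
max_sdi = 450.0  # hypothetical species maximum; substitute your own

for q, s in zip(qmd, sdi):
    flag = "  <-- exceeds max SDI" if s > max_sdi else ""
    print(f"QMD {q:4.1f} in   SDI {s:5.0f}{flag}")
```

A long-term run whose SDI drifts above the species maximum is predicting stand densities that self-thinning should have eliminated.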
In Summary
  • Identify the alternative “models”; establish a frame of reference
  • Examine the big picture
  • Look at the sample used in the model calibration, the presumed population, and a sample of “your” population
  • Identify the key component models. Compare predictions with data – bias and accuracy. Examine these for trends against appropriate factors
  • Look at the overall model output and the computer code. Are there errors? Evaluate output with data (volume per acre – aggregated variables)
Final Points

Remember, there is always an alternative model. When evaluating a model, give careful thought to the alternative. How well a model performs in relation to the alternative is generally the most relevant question.

Validity is relative, as are other things.


Questions?

All you need in this life is ignorance and confidence -- and then success is sure. --Mark Twain