
Developing input-output simulation metamodels

Ken R. McNaught, M.F. Alam and T.J. Ringrose

Cranfield University

Defence Academy

Shrivenham UK

ORS Defence Study Group Presentation, 19/10/2005


What is a metamodel?
  • A model of a model
  • In particular, a simulation metamodel is a simpler auxiliary representation of a simulation model.
  • Particularly useful if the simulation model is complex and/or time-consuming to run.
What are metamodels used for?
  • Improved understanding
  • Speed-up, allowing more cases to be considered, i.e. wider exploration. Trade-off with model accuracy.
  • Sensitivity analysis and optimization

Types of metamodel
  • Regression models
  • Artificial neural networks
  • Kriging
  • Many others
Regression metamodels
  • The original metamodelling approach and still the most widespread.
  • See, for example:

Friedman (1996), The Simulation Metamodel, Kluwer, Norwell, Mass.

Kleijnen and van Groenendaal (1992), Simulation – A Statistical Perspective, Wiley, Chichester.

Regression Metamodels II
  • If a first-order metamodel is assumed, the response variable Y is modelled as

Y = β0 + β1X1 + β2X2 + … + βkXk + ε

where the Xi are input variables and ε is the error term.

  • Model parameters are estimated using least squares.
  • Procedure supported by many software packages.
Regression Metamodels III
  • A second-order regression metamodel also includes quadratic and two-variable interaction terms:

Y = β0 + Σi βiXi + Σi βiiXi² + Σi<j βijXiXj + ε
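The least-squares fit described above can be sketched in NumPy. The toy response function and the sampled data below are illustrative assumptions, not from the presentation:

```python
# Sketch: fitting a second-order regression metamodel by least squares.
# The "simulation" response and the sample sizes are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def build_design_matrix(X):
    """Columns: intercept, linear, quadratic and pairwise interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]               # beta_i  * X_i
    cols += [X[:, i] ** 2 for i in range(k)]          # beta_ii * X_i^2
    cols += [X[:, i] * X[:, j]                        # beta_ij * X_i X_j
             for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

# Toy "simulation" output with a known quadratic structure plus noise.
X = rng.uniform(-1, 1, size=(50, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.3 * X[:, 0] * X[:, 1]
y = y + rng.normal(scale=0.01, size=50)

A = build_design_matrix(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares estimates
y_hat = A @ beta                              # metamodel predictions
```

With two factors the coefficient vector has six entries, ordered here as intercept, two linear terms, two quadratic terms and one interaction term.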
Artificial Neural Networks I
  • Train ANN to ‘learn’ the input-output relationships of the simulation model.
  • Present the ANN with the chosen inputs and resultant simulation outputs from a set of training cases.
  • Multi-layer perceptron ANN ‘learns’ by applying back-propagation algorithm until its outputs converge on the known target values.
  • Over-fitting avoided by use of a validation set of cases not used directly in the model fitting.
  • Training stops when the ANN performance is comparable on both sets of cases.
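The training scheme described above (back-propagation with a held-out validation set used to stop before over-fitting) can be sketched in plain NumPy. The network size, learning rate, patience and toy target function are our own illustrative assumptions, not details from the study:

```python
# Sketch: one-hidden-layer MLP trained by back-propagation, with early
# stopping on a validation set. All hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def mlp_train(X, y, X_val, y_val, hidden=8, lr=0.05, epochs=2000, patience=50):
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=hidden);      b2 = 0.0
    best_mse, best_params, stale = np.inf, None, 0
    for _ in range(epochs):
        # Forward pass.
        h = np.tanh(X @ W1 + b1)              # hidden activations, (n, hidden)
        out = h @ W2 + b2                     # network output, (n,)
        # Back-propagate the mean-squared-error gradient.
        g_out = 2.0 * (out - y) / n
        g_h = np.outer(g_out, W2) * (1.0 - h ** 2)
        W2 -= lr * (h.T @ g_out); b2 -= lr * g_out.sum()
        W1 -= lr * (X.T @ g_h);   b1 -= lr * g_h.sum(axis=0)
        # Early stopping: keep the weights with the best validation error.
        val = np.tanh(X_val @ W1 + b1) @ W2 + b2
        mse = float(np.mean((val - y_val) ** 2))
        if mse < best_mse:
            best_mse, best_params, stale = mse, (W1.copy(), b1.copy(), W2.copy(), b2), 0
        else:
            stale += 1
            if stale >= patience:             # validation error stopped improving
                break
    return best_mse, best_params

# Toy stand-in for the simulation model's input-output relationship.
def target(X):
    return 0.5 * X[:, 0] + np.sin(2.0 * X[:, 1])

X_tr = rng.uniform(-1, 1, size=(80, 2)); y_tr = target(X_tr)
X_va = rng.uniform(-1, 1, size=(40, 2)); y_va = target(X_va)
val_mse, params = mlp_train(X_tr, y_tr, X_va, y_va)
```

The validation cases influence only when training stops, never the gradient updates themselves, which is what keeps them useful as an over-fitting check.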
ANNs II
  • Study by Hurrion and Birgil (1999), reported in JORS 50, pp 1018-1033.
  • Developed ANN and regression metamodels of two manufacturing systems.
  • Demonstrated that selecting training cases by simple random sampling of input parameters can lead to better ANN metamodels than standard factorial designs.
Experimental Design
  • Experimental design is an important step in metamodelling.
  • Concerned with deciding how the training cases will be selected – what pattern?
  • The input variables are called factors and the output variables are called responses.
  • The value assigned to a factor during a particular run is called its level.
Some Possible Experimental Designs
  • Full factorial
  • Fractional factorial
  • Central composite
  • Simple random sample
  • Latin hypercube sample (LHS)
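To see why the choice of design matters, note that a full factorial design enumerates every combination of factor levels, so its size grows exponentially with the number of factors. A sketch, using an illustrative three-factor subset (the factor names and levels are our own example, not the study's design):

```python
# Sketch: a full factorial design takes every combination of levels,
# so run counts explode as factors are added. Factors here are illustrative.
from itertools import product

levels = {
    "tanks_c_sqn": [6, 8, 10, 12, 14, 16],
    "p_surprise":  [0.0, 0.2, 0.4, 0.6, 0.8, 1.0],
    "p_shock":     [0.0, 0.2, 0.4, 0.6, 0.8, 1.0],
}

design = list(product(*levels.values()))  # one tuple per design point
n_runs = len(design)                      # 6 * 6 * 6 = 216 runs
```

With 13 factors at 6 levels each, a full factorial would need 6^13 (around 13 billion) runs, which is why fractional, composite and space-filling designs such as the Latin hypercube are attractive.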
Initial Comparison of Experimental Designs
  • Previous work reported in Simulation Modelling Practice and Theory 12, pp 559-578
  • Involved comparisons of full factorial, fractional factorial, central composite, random sampling and Latin Hypercube designs
  • ANN metamodels developed from a simple deterministic System Dynamics -based combat model
  • Latin Hypercube design performed best
Latin Hypercube Sampling
  • Suggested by McKay et al (1979) in Technometrics 21, pp 239-245.
  • Basic idea is to stratify the range of each factor into the same number of non-overlapping intervals. This number is the same as the number of design points, i.e. the sample size, n.
  • Then a representative value (level) is generated for each interval of each factor by uniform random sampling.
  • Finally, each design point is generated by randomly sampling without replacement a level of each factor. This process is repeated until all design points are generated.
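The three steps above can be sketched directly in NumPy; the bounds passed in at the end are illustrative:

```python
# Minimal Latin hypercube sampler: stratify each factor's range into n
# intervals, draw one value uniformly within each interval, then randomly
# permute each factor's levels (sampling without replacement) to pair them
# into design points.
import numpy as np

def latin_hypercube(n, bounds, rng=None):
    """n design points for len(bounds) factors; bounds = [(lo, hi), ...]."""
    rng = np.random.default_rng(rng)
    k = len(bounds)
    design = np.empty((n, k))
    for j, (lo, hi) in enumerate(bounds):
        edges = np.linspace(lo, hi, n + 1)
        # One uniform draw per interval gives n stratified levels...
        levels = rng.uniform(edges[:-1], edges[1:])
        # ...and a random permutation pairs them with the other factors.
        design[:, j] = rng.permutation(levels)
    return design

pts = latin_hypercube(5, [(0.0, 1.0), (10.0, 20.0)], rng=0)
```

By construction, every one of the n intervals of every factor contains exactly one design point, which is what gives the LHS its space-filling property.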
A simple random sample

Here we have 2 factors, X and Y, and 5 design points sampled from within the design space.

A Latin hypercube sample

Here X and Y have each been split into 5 intervals and each design point chosen by randomly pairing intervals.

And another one…

Here we have another possible LHS replicate.

Screening Designs

The aim of a screening design is to reduce the number of factors included in the main experiment, i.e. to identify and screen out those which are relatively unimportant.

Unimportant in this context means a variable which has little impact on the response variable of interest.

Some Screening Designs
  • Cotter’s method
  • Sequential bifurcation
  • Trocine and Malone’s method
  • Morris’ method
Morris’ Method
  • See Morris (1991), Technometrics 33, pp 161-174.
  • The idea behind this method is to identify factors with a large main effect, factors with a non-linear or interaction effect, and those with neither.
  • The latter can then be screened out.
  • It involves estimating means and variances of response variable differences for each possible factor.
Outline of Morris’ Method

Starting from a randomly chosen base point x in the design space, a factor i is selected randomly and increased or decreased by Δ, and an elementary effect of that factor is calculated as

EEi = [y(x + Δei) − y(x)] / Δ

where ei is the unit vector for factor i and y(·) is the simulation response.

Let x′ = x + Δei be the new point in the design space defined above. Now, a different factor, j, is selected and a new point x′′ = x′ + Δej is defined, giving EEj = [y(x′′) − y(x′)] / Δ.

This continues until an elementary effect has been calculated for each factor. Then a new base point is chosen and the whole process is repeated r times.

Hence, for each factor there are r elementary effects calculated. Their means and variances indicate which factors are important.
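The procedure above can be sketched as follows. For simplicity this sketch only ever increases a factor (Morris' original method allows decreases too), and the toy response function is an illustrative stand-in for the simulation model, not SIMBAT:

```python
# Sketch of Morris' screening method: r random one-factor-at-a-time
# trajectories, giving r elementary effects per factor; their means and
# variances indicate importance. Parameters and test function illustrative.
import numpy as np

def morris_effects(f, k, r=20, delta=0.25, rng=None):
    rng = np.random.default_rng(rng)
    effects = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)  # random base point
        y = f(x)
        for i in rng.permutation(k):               # one factor at a time
            x_new = x.copy()
            x_new[i] += delta                      # simplification: increase only
            y_new = f(x_new)
            effects[t, i] = (y_new - y) / delta    # elementary effect of factor i
            x, y = x_new, y_new                    # continue the trajectory
    return effects.mean(axis=0), effects.var(axis=0)

# Toy response: factor 0 has a strong linear effect, factor 1 a non-linear
# effect, factor 2 almost none (so it would be screened out).
def toy(x):
    return 5.0 * x[0] + 4.0 * x[1] ** 2 + 0.01 * x[2]

mu, var = morris_effects(toy, k=3, rng=0)
```

A large mean flags a strong main effect; a large variance flags a non-linear or interaction effect; factors small on both counts are candidates for screening out.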

Outline of Approach Taken in Developing Metamodel of SIMBAT
  • Initial pre-screening of factors based on intended use of metamodel.
  • Elicitation of bounds for remaining factors based on intended use of metamodel.
  • Formal screening of remaining factors using Morris’ method.
  • Use of Latin hypercube sampling to construct training set for ANN.
  • Development and testing of ANN metamodel.
The Scenario
  • The scenario is a meeting engagement between two forces, denoted by Red and Blue, each of approximately battalion size.
  • The scenario involves a large number of both direct/indirect fire parameters and human factors.
  • Within SIMBAT, a unit is the main element of a particular scenario. A unit is composed of a number of components, which are the lowest level of representation in SIMBAT and correspond to the direct-fire weapon platforms.
  • A component usually represents a vehicle or a grouping of infantry. When a unit has no components left, it is considered ‘dead’ and therefore disappears from the scenario.
Factors Remaining After Pre-Screening

These related to the number of components within various units on either side, and human factors such as probability of shock, probability of surprise and unit participation. Some of these were grouped, e.g. the number of components in C and D tank squadrons (both part of the same battlegroup).

Factor Levels
  • For the screening design, we employed 6 levels of each factor.
  • For example, the number of tanks in C or D Squadron could be 6, 8, 10, 12, 14 or 16.
  • The number of components in the Recce Company could be 6, 10, 14, 18, 22 or 26.
  • The probability of surprise could be 0, 0.2, 0.4, 0.6, 0.8, or 1.
Most Important Factors
  • Factor 2: Number of tanks in friendly C or D Squadron
  • Factor 8: Number of tanks in OPFOR companies
  • Factor 10: Number of components in OPFOR Recce Company
  • Factor 13: Unit participation
Middle-Ranking Factors
  • Factor 11: Probability of shock
  • Factor 6: Size of Anti-Tank Section
  • Factor 12: Probability of surprise
  • Factor 5: Number of components in friendly Recce Troop
Low-Ranking Factors

These related to the number of components of dismounted infantry on each side and the number of components in each side’s HQ. Clearly, the rank ordering might have been different in another scenario.

Developing simulation metamodels
  • Simulation metamodels were developed using the 8 important factors identified by Morris’ screening design.
  • Main response of interest was Loss Exchange Ratio,

LER=Red Casualties/Blue Casualties

  • Multi-layer perceptron ANNs were considered as a potential metamodelling method for the underlying simulation.
  • A modified Latin Hypercube design (LHD) was employed to generate the required design points to develop candidate metamodels
  • 3 different approaches for ANN metamodels were considered
Approaches Employed
  • Approach 1: LER used as the target variable for the ANN
  • Approach 2: Blue casualties and Red casualties used as the target variables in two separate ANNs
  • Approach 3: Blue casualties and Red casualties used as target variables in the same ANN
  • LER arrived at indirectly for the 2nd and 3rd approaches.
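For approaches 2 and 3, the indirect calculation simply forms the ratio of the two predicted casualty figures. A minimal sketch; the guard against zero predicted Blue casualties is our own assumption, not a detail from the talk:

```python
# Sketch: LER computed indirectly from separate casualty predictions.
# The eps guard against division by zero is an illustrative assumption.
def loss_exchange_ratio(red_casualties, blue_casualties, eps=1e-6):
    return red_casualties / max(blue_casualties, eps)

ler = loss_exchange_ratio(red_casualties=30.0, blue_casualties=12.0)
```

One practical reason to predict the two casualty figures rather than the ratio directly is that the ratio becomes unstable whenever the denominator is near zero, which the guard above papers over.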
Approaches (cont’d)
  • In each approach, 200 simulation configurations (design points) were used, based on LHS design.
  • A further 200 configurations were randomly sampled - 100 used as a validation set and 100 used as an independent test set.
  • Each configuration was replicated 40 times.
  • Separate ANNs were developed for each approach using the mean results and the full results across all replications (referred to as mean and replication networks).
  • Replication networks usually outperform mean networks.
  • From earlier work, ANNs employing training data generated from the Latin Hypercube design appear to outperform ANNs employing other designs.

Conclusions
  • Separate ANNs are not necessary to predict Blue and Red casualties.
  • Metamodels have a role to play as simulations become increasingly complex.
  • Regression models, ANNs and other approaches can be used for this purpose – regression is more appropriate if explanation is the goal; ANNs may be more appropriate if prediction is the goal.
  • Experimental design is crucial in metamodel development. Different designs may be required for different approaches.
  • Factor screening may also be required as part of the overall design process if many potential factors are present in the simulation.