
STEADY-STATE SYSTEM SIMULATION(2)



  1. STEADY-STATE SYSTEM SIMULATION(2)

  2. REVIEW OF THE BASICS • Initial Transient a.k.a. Warm-Up Period

  3. PROCEDURE
  • Y(i,j) is the ith sample of the jth replication
  • Form a confidence interval on {Y(i,*)}, the average across replications at each index i
  • Eyeball where the drift has died out; call that index j*
  • Make 3j* the truncation point
  • Restart the system for ONE LONG RUN
  • {Y(i), i > 3j*} is a set of autocorrelated, identically distributed data (see the sketch below)
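A minimal sketch of this truncation procedure, assuming the replications sit in a NumPy array Y with Y[j, i] the ith sample of the jth replication (the array layout and function name are mine, not from the slides):

```python
import numpy as np
from scipy import stats

def warmup_profile(Y, conf=0.95):
    """Across-replication means and CI half-widths at each sample
    index i; plot these and eyeball where the drift dies out (j*)."""
    n_reps = Y.shape[0]
    means = Y.mean(axis=0)                          # the {Y(i,*)}
    sems = Y.std(axis=0, ddof=1) / np.sqrt(n_reps)
    t = stats.t.ppf(0.5 + conf / 2, df=n_reps - 1)
    return means, t * sems

# After eyeballing j* from a plot of `means`, truncate one long run:
# steady = one_long_run[3 * j_star:]   (one_long_run is hypothetical)
```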

  4. DEAL WITH AUTOCORRELATION • Batch Means • Regenerative Method • Jackknife • Time Series

  5. BEFORE WE BEGIN
  • E[X-bar] is an integral against the joint distribution function of the whole set of n samples!
  • That joint density captures all of the correlation in the X's.

  6. WHAT DOES THAT MEAN?
  • The summation and the integral are interchanged
  • The joint density function reduces to the marginal distribution for each Xi (the correlations are "marginaled out")
  • The mean X-bar is unbiased, even when the data are correlated
  • Unfortunately, for s², the squaring prevents a similar reduction, and the naive s² calculation gives a biased estimate
  • s² underestimates σ² when the autocorrelation is positive (worked out below)
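To pin down slide 6's claims: for identically distributed Xi with mean μ, variance σ², and lag-k autocorrelation ρk, a standard calculation (worked here, not shown on the slides) gives:

```latex
% The mean is unbiased regardless of correlation:
\mathbb{E}[\bar{X}] = \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}[X_i] = \mu .
% The naive sample variance is not:
\mathbb{E}[s^2] = \sigma^2\,\frac{n - c}{n - 1},
\qquad c = 1 + 2\sum_{k=1}^{n-1}\Bigl(1 - \frac{k}{n}\Bigr)\rho_k ,
% so positive autocorrelation (c > 1) pulls E[s^2] below sigma^2.
```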

  7. BATCH MEANS
  • {Y(i), 0 < i <= n} is the data (the first 3j* samples already removed)
  • Adjacent BATCHES of size b are formed and the batch average of each is calculated
  • (Under regularity conditions) as the batches become large, all correlation between them disappears
  • Treat the batch means as iid (a sketch follows)
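A compact sketch of the batch-means interval, assuming the truncated data sit in a 1-D array (the names are mine):

```python
import numpy as np
from scipy import stats

def batch_means_ci(Y, b, conf=0.95):
    """Batch-means CI for the steady-state mean.
    Y: 1-D array of post-warm-up samples; b: batch size."""
    k = len(Y) // b                                 # complete batches only
    batches = Y[:k * b].reshape(k, b).mean(axis=1)  # adjacent batch means
    m = batches.mean()
    half = (stats.t.ppf(0.5 + conf / 2, df=k - 1)
            * batches.std(ddof=1) / np.sqrt(k))     # treat batch means as iid
    return m, (m - half, m + half)
```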

  8. REGENERATIVE METHOD
  • Suppose we can define events {T1, T2, ...} at which we know the system is memoryless (by system dynamics):
  • arrival to an empty/idle system
  • all "clocks" are exponentially distributed
  • a discrete event involving a geometric trial
  • Samples taken between the Ti's are independent

  9. BUSY PERIOD EXAMPLE
  [Figure: three busy periods with accumulated queueing times Q1 = 8, Q2 = 1/3, Q3 = 3]
  What is the accumulation rate of queueing time for this system?

  10. SAMPLES
  • At an arrival to an empty queue...
  • the inter-arrival process is sampled
  • the service time of the entering customer is sampled
  • no other activities are happening, no pending events
  • From the picture, our samples are 8/3, (1/3)/1, and 3/2 (see the check below)
  • which we can treat as iid
  • note this is NOT Q-bar/(inter-B)-bar
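A tiny numeric check of the busy-period example, treating the per-cycle rates as iid; the cycle lengths 3, 1, 2 are inferred from the ratios quoted on slide 10, not stated explicitly in the transcript:

```python
import numpy as np
from scipy import stats

Q = np.array([8.0, 1/3, 3.0])   # queueing time accumulated in each cycle
B = np.array([3.0, 1.0, 2.0])   # cycle lengths (inferred from the ratios)

rates = Q / B                   # per-cycle rates: 8/3, (1/3)/1, 3/2
m = rates.mean()                # NOT the same as Q.mean() / B.mean()
half = (stats.t.ppf(0.975, df=len(rates) - 1)
        * rates.std(ddof=1) / np.sqrt(len(rates)))
print(f"rate ~ {m:.3f} +/- {half:.3f}")
```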

  11. JACKKNIFE ESTIMATORS
  • Q-bar/(inter-B)-bar is a biased, consistent estimator
  • Its expected value is not E[Q/(inter-B)]
  • As the sample gets large, the bias diminishes to 0
  • The bias comes from the dependency of each Qi on its accompanying inter-Bi
  • We care because we want to relax the "memorylessness" property and use a ratio of mean estimates

  12. JACKKNIFE
  • Form the overall estimate f = Q-bar/(inter-B)-bar and the leave-one-out estimates f(g) (omit cycle g); both are biased, consistent estimates
  • The pseudovalues fg = n·f - (n-1)·f(g) largely cancel the bias, so the remaining bias is small
  • Build an iid confidence interval from {fg, g = 1, 2, ..., n} (sketched below)
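A sketch of the jackknife pseudovalue interval for the ratio estimator, under the standard pseudovalue construction assumed above (function name mine):

```python
import numpy as np
from scipy import stats

def jackknife_ratio_ci(Q, B, conf=0.95):
    """Jackknife pseudovalue CI for mean(Q)/mean(B);
    reduces the small-sample bias of the plain ratio of means."""
    n = len(Q)
    f = Q.mean() / B.mean()                          # biased, consistent
    # leave-one-out ratio estimates f(g)
    loo = np.array([(Q.sum() - Q[g]) / (B.sum() - B[g]) for g in range(n)])
    pseudo = n * f - (n - 1) * loo                   # the fg pseudovalues
    m = pseudo.mean()
    half = (stats.t.ppf(0.5 + conf / 2, df=n - 1)
            * pseudo.std(ddof=1) / np.sqrt(n))       # treat the fg as iid
    return m, (m - half, m + half)
```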

  13. TIME SERIES
  • Also called the Autoregressive Approach
  • Uses estimates of the autocorrelation coefficients to create an iid sample with a known relationship to μ and σ
  • The best-studied of these approaches in the statistics community

  14. MECHANICS OF AUTOREGRESSIVE APPROACH
  • Assume Y's autocorrelation vanishes after lag p
  • Create the sample X's from the Y's using coefficients b (see the reconstruction below)
  • The b's are chosen so that the X's have no autocorrelation

  15. RESULT
  • Let R-hat(i) be the sample autocorrelation at lag i
  • Assume WLOG that b0 = 1
  • Then the b's solve the system of p equations below:
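The equations on slides 14 and 15 were images and did not survive the transcript; a reconstruction consistent with the surrounding text is the classical whitening setup:

```latex
% Whiten the Y's into (approximately) uncorrelated X's:
X_t = \sum_{j=0}^{p} b_j \, Y_{t-j}, \qquad b_0 = 1 .
% Requiring zero autocorrelation in the X's at lags s = 1..p
% gives a Yule-Walker-type system in the sample autocorrelations:
\sum_{j=0}^{p} b_j \, \hat{R}_{\lvert s - j \rvert} = 0, \qquad s = 1,\dots,p .
```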

  16. IN THE LIMIT...
  • Writing X-bar = b·Y-bar + JUNK, where the JUNK is a term vanishing as n gets large, and where b is the sum of the b's
  • Since the X's are uncorrelated, Var(X-bar) = σX²/n, SO...

  17. Var(Y-bar) ≈ σX²/(n·b²), which is the variance we need for our estimate of μ

  18. RECIPE
  • Sample Y's from the system; calculate Y-bar
  • Feel out how large p needs to be
  • Estimate the R's, s = 1, 2, ..., p
  • Solve the system to get the b's; sum them to get b
  • Create the sample of X's and estimate σX with sX
  • Create the confidence interval for μ (an end-to-end sketch follows)
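An end-to-end sketch of the recipe under the reconstruction above; the degrees of freedom and the lack of small-sample corrections are my simplifications, not the slides' prescription:

```python
import numpy as np
from scipy import stats

def ar_ci(Y, p, conf=0.95):
    """Time-series (autoregressive) CI for the steady-state mean.
    p is the assumed cutoff lag of Y's autocorrelation."""
    Y = np.asarray(Y, dtype=float)
    n = len(Y)
    Yc = Y - Y.mean()
    # sample autocorrelations R_hat(0..p)
    R = np.array([Yc[: n - s] @ Yc[s:] for s in range(p + 1)]) / (Yc @ Yc)
    # solve sum_j b_j R(|s-j|) = 0 for s = 1..p, with b_0 = 1
    A = np.array([[R[abs(s - j)] for j in range(1, p + 1)]
                  for s in range(1, p + 1)])
    b = np.concatenate(([1.0], np.linalg.solve(A, -R[1 : p + 1])))
    # whitened series X_t = sum_j b_j Y_{t-j}
    X = np.convolve(Y, b, mode="valid")
    half = (stats.t.ppf(0.5 + conf / 2, df=len(X) - 1)
            * np.sqrt(X.var(ddof=1) / (n * b.sum() ** 2)))  # sX^2/(n b^2)
    return Y.mean(), (Y.mean() - half, Y.mean() + half)
```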

  19. DEAL WITH AUTOCORRELATION • Batch Means • Regenerative Method • Jackknife • Time Series
