
C12: The Poisson process


Presentation Transcript


  1. MATH 3033, based on Dekking et al., A Modern Introduction to Probability and Statistics, 2007. Slides by Gautam Shankar, format by Tim Birbeck, instructor Longin Jan Latecki. C12: The Poisson process

  2. 12.1 – Random Points
  • Poisson process model: it often applies in situations where there is a very large population, and each member of the population has a very small probability of producing a point of the process.
  • Examples of random points: arrival times of email messages at a server, the times at which asteroids hit the earth, arrival times of radioactive particles at a Geiger counter, times at which your computer crashes, the times at which electronic components fail, and arrival times of people at a pump in an oasis.
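A minimal simulation sketch of the "large population, small probability" idea (not from the slides; the population size, probability, and use of numpy are illustrative assumptions): each of n members independently produces a point with a tiny probability p, and the total number of points behaves like a Poisson variable with mean n·p.

```python
# Sketch: many members, each with a tiny probability of producing a point.
# The total count per trial is Bin(n, p), which for large n and small p
# has mean and variance both close to n*p, as a Poisson(n*p) variable would.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100_000, 2e-5          # large population, small per-member probability
trials = 10_000               # number of simulated populations

counts = rng.binomial(n, p, size=trials)   # total points produced per trial
print("simulated mean:", counts.mean(), " vs n*p =", n * p)
print("simulated var :", counts.var(), " vs n*p =", n * p)
```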

  3. 12.2 – Taking a closer look at random arrivals
  • Example: telephone call arrival times. Calls arrive at random times X1, X2, X3, …
  • Homogeneity (also called weak stationarity): the rate λ at which arrivals occur is constant over time; in a subinterval of length u, the expected number of telephone calls is λu.
  • Independence: the numbers of arrivals in disjoint time intervals are independent random variables.
  • N(I) = total number of calls in an interval I; we abbreviate N([0,t]) by Nt.
  • E[Nt] = λt.
  • Divide the interval [0,t] into n intervals, each of length t/n.
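A small sketch checking the two modelling assumptions on simulated call arrivals (the rate, horizon, and intervals are assumed values, not from the slides): the mean count in an interval of length u should be about λu, and counts in disjoint intervals should be uncorrelated, consistent with independence.

```python
# Simulate arrivals on [0, t]: a Poisson number of points placed uniformly,
# then count arrivals in two disjoint intervals and compare with lam * length.
import numpy as np

rng = np.random.default_rng(1)
lam, t, trials = 2.0, 10.0, 20_000

n_arrivals = rng.poisson(lam * t, size=trials)
counts_A = np.empty(trials)   # arrivals in [0, 3)
counts_B = np.empty(trials)   # arrivals in [5, 9)
for i, n in enumerate(n_arrivals):
    times = rng.uniform(0, t, size=n)
    counts_A[i] = np.sum((times >= 0) & (times < 3))
    counts_B[i] = np.sum((times >= 5) & (times < 9))

print("mean in [0,3):", counts_A.mean(), " vs lam*3 =", lam * 3)
print("mean in [5,9):", counts_B.mean(), " vs lam*4 =", lam * 4)
print("correlation  :", np.corrcoef(counts_A, counts_B)[0, 1])  # close to 0
```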

  4. 12.2 – Taking a closer look at random arrivals
  • When n is large enough, every interval Ij,n = ((j−1)t/n, jt/n] contains either 0 or 1 arrival.
  • For such a large n (n > λt), let Rj = number of arrivals in the time interval Ij,n; then Rj has a Ber(pj) distribution for some pj.
  • Recall (for a Bernoulli random variable): E[Rj] = 0 · (1 − pj) + 1 · pj = pj.
  • By the homogeneity assumption (see previous slide), for each j: pj = λ · length of Ij,n = λt/n.
  • Total number of calls: Nt = R1 + R2 + … + Rn.
  • By the independence assumption (see previous slide), the Rj are independent random variables, so Nt has a Bin(n, p) distribution with p = λt/n.
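A numerical sketch of the limit behind this slide, using assumed values of λ and t and scipy for the probability mass functions: as n grows, the Bin(n, λt/n) probabilities approach the Pois(λt) probabilities.

```python
# Compare Bin(n, lam*t/n) and Pois(lam*t) probabilities over a range of k
# as n increases; the maximum gap shrinks toward 0.
from scipy.stats import binom, poisson

lam, t = 2.0, 5.0
mu = lam * t
ks = range(6, 15)

for n in (20, 100, 1000, 10_000):
    p = lam * t / n
    max_gap = max(abs(binom.pmf(k, n, p) - poisson.pmf(k, mu)) for k in ks)
    print(f"n = {n:6d}: max |Bin - Pois| over k=6..14 is {max_gap:.6f}")
```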

  5. 12.2 – Taking a closer look at random arrivals
  Definition: a discrete random variable X has a Poisson distribution with parameter µ, where µ > 0, if its probability mass function p is given by
  p(k) = P(X = k) = (µ^k / k!) · e^(−µ)   for k = 0, 1, 2, …
  We denote this distribution by Pois(µ).
  The expectation and variance of a Poisson distribution: let X have a Poisson distribution with parameter µ; then E[X] = µ and Var(X) = µ.
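A quick check of the stated facts for an illustrative value of µ (the value 4 and the truncation at k = 59 are assumptions for the sketch): the pmf sums to 1, and both the expectation and the variance come out equal to µ.

```python
# Evaluate the Pois(mu) pmf p(k) = mu**k * exp(-mu) / k! on k = 0..59
# (the tail beyond that is negligible for mu = 4) and check sum, mean, variance.
import math

mu = 4.0
pmf = [mu**k * math.exp(-mu) / math.factorial(k) for k in range(60)]

total = sum(pmf)
mean = sum(k * p for k, p in enumerate(pmf))
var = sum((k - mean) ** 2 * p for k, p in enumerate(pmf))
print(f"sum of pmf ≈ {total:.6f}, E[X] ≈ {mean:.6f}, Var(X) ≈ {var:.6f}")  # ≈ 1, mu, mu
```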

  6. 12.3 – The one-dimensional Poisson process
  Interarrival times: the differences Ti = Xi − Xi−1 are called interarrival times.
  Since T1 > t exactly when there are no arrivals in [0, t], we have P(T1 > t) = P(Nt = 0) = e^(−λt).
  This implies that P(T1 ≤ t) = 1 − e^(−λt). Therefore T1 has an exponential distribution with parameter λ.
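A simulation sketch of the step used on this slide (the rate λ, the time t, and the trial count are assumed values): T1 > t exactly when there are no arrivals in [0, t], so the empirical frequency of {Nt = 0} should match e^(−λt).

```python
# Draw the number of arrivals Nt ~ Pois(lam*t) many times and compare the
# fraction of runs with no arrivals to the exponential tail exp(-lam*t).
import numpy as np

rng = np.random.default_rng(2)
lam, t, trials = 1.5, 0.8, 200_000

n_t = rng.poisson(lam * t, size=trials)   # arrivals in [0, t] per trial

print("empirical P(Nt = 0):", np.mean(n_t == 0))
print("exp(-lam*t)        :", np.exp(-lam * t))
```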

  7. 12.3 – The one-dimensional Poisson process
  T1 and T2 are independent, and T2 also has an Exp(λ) distribution.
  Definition: the one-dimensional Poisson process with intensity λ is a sequence X1, X2, X3, … of random variables having the property that the interarrival times X1, X2 − X1, X3 − X2, … are independent random variables, each with an Exp(λ) distribution.
  Nt is equal to the number of Xi that are smaller than (or equal to) t.
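A construction sketch matching this definition (the rate, horizon, and cap on the number of arrivals are assumptions chosen for the example): draw independent Exp(λ) interarrival times, take cumulative sums to get the arrival times X1, X2, …, and let Nt be the number of arrival times that are ≤ t. Consistent with the earlier slides, its mean and variance should both be close to λt.

```python
# Build the process from Exp(lam) interarrival times and count arrivals up to t.
import numpy as np

rng = np.random.default_rng(3)
lam, t, trials = 0.7, 12.0, 20_000
max_arrivals = 200   # comfortably more arrivals than ever occur in [0, t] here

inter = rng.exponential(scale=1.0 / lam, size=(trials, max_arrivals))
arrival_times = np.cumsum(inter, axis=1)     # X1, X2, X3, ... per trial
n_t = np.sum(arrival_times <= t, axis=1)     # Nt for each trial

print("mean of Nt:", n_t.mean(), " vs lam*t =", lam * t)
print("var  of Nt:", n_t.var(), "  vs lam*t =", lam * t)
```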
