
Some standard univariate probability distributions




Presentation Transcript


  1. Some standard univariate probability distributions
  • Characteristic function, moment generating function, cumulant generating function
  • Discrete distributions
  • Continuous distributions
  • Some distributions associated with the normal
  • References

  2. Characteristic function, moment generating function, cumulant generating function The characteristic function is defined as the expectation of e^(itX): φ(t) = E[e^(itX)]. The moment generating function is defined as the expectation of e^(tX): M(t) = E[e^(tX)]. Moments can be calculated in the following way: take the n-th derivative of M(t) and evaluate it at t = 0, which gives E[X^n] = M^(n)(0). The cumulant generating function is defined as the logarithm of the characteristic function: K(t) = log φ(t).
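A minimal sketch of the moment calculation described above, differentiating a moment generating function at t = 0 with sympy (the normal MGF used here is just a convenient example):

```python
# Sketch: recover moments by differentiating a moment generating function at t = 0.
# Uses sympy; any CAS or hand differentiation works the same way.
import sympy as sp

t = sp.symbols('t')
mu, sigma = sp.symbols('mu sigma', positive=True)

# MGF of a normal random variable: M(t) = exp(mu*t + sigma^2 * t^2 / 2)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

first_moment = sp.diff(M, t, 1).subs(t, 0)   # E[X]   = mu
second_moment = sp.diff(M, t, 2).subs(t, 0)  # E[X^2] = mu^2 + sigma^2

print(sp.simplify(first_moment), sp.simplify(second_moment))
```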

  3. Discrete distributions: Binomial Let us assume that we carry out an experiment whose result can be "success" or "failure". The probability of "success" in one experiment is p, so the probability of failure is q = 1-p. We repeat the experiment n times. The distribution of the number of successes k is binomial: P(X=k) = C(n,k) p^k q^(n-k), k = 0, 1, ..., n. Characteristic function: φ(t) = (q + p e^(it))^n. Moment generating function: M(t) = (q + p e^t)^n. Find the first and the second moments.
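As a quick check of the formulas above, the following sketch evaluates the binomial pmf with scipy.stats and recovers the first two moments numerically (the values n = 10, p = 0.3 are illustrative assumptions):

```python
# Sketch: binomial pmf and its first two moments, checked numerically with scipy.stats.
import numpy as np
from scipy.stats import binom

n, p = 10, 0.3                    # illustrative values
k = np.arange(n + 1)
pmf = binom.pmf(k, n, p)          # C(n, k) p^k (1-p)^(n-k)

mean = np.sum(k * pmf)            # should equal n*p
second = np.sum(k**2 * pmf)       # should equal n*p*(1-p) + (n*p)^2
print(mean, n * p)
print(second, n * p * (1 - p) + (n * p)**2)
```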

  4. Discrete distributions: Poisson When the number of trials n is large, the probability of success p is small, and np is finite and tends to λ, the binomial distribution converges to the Poisson distribution: P(X=k) = e^(-λ) λ^k / k!, k = 0, 1, 2, ... The Poisson distribution is used to describe events that occur rarely (rare events) in a short time period. It is used in counting statistics to describe the number of registered photons. The characteristic function is: φ(t) = exp(λ(e^(it) - 1)). What is the first moment?
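The convergence of the binomial to the Poisson can be seen numerically; a small sketch, assuming scipy is available and using an illustrative λ = 2:

```python
# Sketch: binomial pmf converges to the Poisson pmf when n is large, p small and n*p = lambda.
import numpy as np
from scipy.stats import binom, poisson

lam = 2.0                                    # illustrative value
for n in (10, 100, 10000):
    p = lam / n
    k = np.arange(15)
    max_diff = np.max(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)))
    print(n, max_diff)                       # the difference shrinks as n grows
```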

  5. Discrete distributions: Negative Binomial Consider an experiment where the probability of "success" is p and the probability of failure is q = 1-p. We carry out the experiment until the k-th success and want to find the probability of j failures. (This is called sequential sampling: sampling is carried out until a stopping rule - k successes - is satisfied.) If we have j failures then the number of trials is k+j and the last trial was a success. Then the probability of j failures is: P(j) = C(k+j-1, j) p^k q^j, j = 0, 1, 2, ... It is called negative binomial because the coefficients have the form of the negative binomial series: p^(-k) = (1-q)^(-k). The characteristic function is: φ(t) = p^k (1 - q e^(it))^(-k). What is the moment generating function? What is the first moment?
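A short sketch comparing the formula above with scipy.stats.nbinom, which uses the same "number of failures before the k-th success" convention (k = 3, p = 0.4 are illustrative assumptions):

```python
# Sketch: probability of j failures before the k-th success, compared against scipy.stats.nbinom.
from math import comb
from scipy.stats import nbinom

k, p = 3, 0.4                                # illustrative values
q = 1 - p
for j in range(6):
    direct = comb(k + j - 1, j) * p**k * q**j
    print(j, direct, nbinom.pmf(j, k, p))    # the two columns agree
```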

  6. Continuous distributions: uniform The simplest continuous distribution is the uniform distribution on an interval [a, b], with density: f(x) = 1/(b-a) for a ≤ x ≤ b and 0 otherwise. The cumulative distribution function is: F(x) = (x-a)/(b-a) for a ≤ x ≤ b (0 below a, 1 above b). Moments and other properties are calculated easily.
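For example, the mean and variance follow directly from the density; a minimal sketch with illustrative endpoints a = 2, b = 5:

```python
# Sketch: moments of a uniform(a, b) variable by direct integration of the density 1/(b-a).
from scipy.integrate import quad

a, b = 2.0, 5.0                                             # illustrative values
density = lambda x: 1.0 / (b - a)

mean, _ = quad(lambda x: x * density(x), a, b)              # (a + b)/2
var, _ = quad(lambda x: (x - mean)**2 * density(x), a, b)   # (b - a)^2 / 12
print(mean, var)
```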

  7. Continuous distributions: exponential The density of the exponential distribution has the form: f(x) = λ e^(-λx), x ≥ 0. This distribution has two origins.
  • Maximum entropy. If we know that a random variable is non-negative and we know its first moment, 1/λ, then the maximum entropy distribution has the exponential form.
  • Poisson-type random processes. If the probability distribution of the number of events j(t) occurring during the time interval [0; t) is Poisson with mean value λt, then the time elapsing until the first event has the exponential distribution (see the sketch after this list). Let T_r denote the time elapsed until the r-th event; then P(T_r > t) = Σ_{j=0}^{r-1} e^(-λt) (λt)^j / j!. Putting r = 1 we get P(T_1 > t) = e^(-λt). Taking into account that P(T_1 > t) = 1 - F_1(t) and differentiating with respect to t, we arrive at the exponential density.
  This distribution, together with the Poisson, is widely used in reliability studies, life testing etc.
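A small simulation sketch of the Poisson-process origin: "no event in [0; t)" is the same event as "T1 > t", so its relative frequency should match e^(-λt) (λ, t and the simulation size are illustrative assumptions):

```python
# Sketch: the waiting time to the first event of a rate-lambda Poisson process is exponential.
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_sim = 1.5, 2.0, 200_000        # illustrative values

# Number of events in [0, t) is Poisson with mean lam*t;
# "no event by time t" is the same event as "T1 > t".
counts = rng.poisson(lam * t, size=n_sim)
print(np.mean(counts == 0), np.exp(-lam * t))   # both approximate P(T1 > t)
```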

  8. Continuous distributions: Gamma The Gamma distribution can be considered a generalisation of the exponential distribution. It has the form: f(t) = λ^r t^(r-1) e^(-λt) / Γ(r), t ≥ 0. It is the distribution of the time t elapsing before r events happen. The characteristic function of this distribution is: φ(u) = (1 - iu/λ)^(-r). If there are r independently and identically exponentially distributed random variables then the distribution of their sum is Gamma.
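A simulation sketch of the last statement, comparing the sum of r exponential variables with the corresponding Gamma distribution via a Kolmogorov-Smirnov test (r, λ and the sample size are illustrative assumptions):

```python
# Sketch: the sum of r i.i.d. exponential(lambda) variables is Gamma(r, lambda).
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(0)
r, lam = 4, 2.0                                            # illustrative values
sums = rng.exponential(scale=1.0 / lam, size=(100_000, r)).sum(axis=1)

# scipy's gamma uses shape a = r and scale = 1/lambda
print(kstest(sums, gamma(a=r, scale=1.0 / lam).cdf))       # large p-value: no evidence against Gamma
```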

  9. Continuous distributions: Normal Perhaps the most popular and widely used continuous distribution is the normal distribution. The main reason for this is that a random variable of interest is often the sum of many random variables. According to the central limit theorem, under some conditions (for example: the random variables are independent and their first, second and third moments exist and are finite) the distribution of the sum of random variables converges to the normal distribution. The density of the normal distribution has the form: f(x) = (1/(σ√(2π))) exp(-(x-μ)²/(2σ²)). There are many tables for the normal distribution. Its characteristic function is: φ(t) = exp(iμt - σ²t²/2).
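In practice the tables are replaced by library routines; a minimal sketch using scipy.stats.norm (the numerical values are illustrative):

```python
# Sketch: scipy.stats.norm plays the role of the classical normal tables.
from scipy.stats import norm

print(norm.cdf(1.96))                       # P(Z <= 1.96) for a standard normal, ~0.975
print(norm.ppf(0.975))                      # inverse: the 97.5% quantile, ~1.96
print(norm.pdf(0.0, loc=1.0, scale=2.0))    # density of N(mu=1, sigma=2) at x = 0
```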

  10. Central limit theorem Let us assume that we have n independent random variables {Xi}, i = 1, ..., n. If the first, second and third moments (this condition can be relaxed) are finite, then the sum of these random variables for sufficiently large n will be approximately normally distributed. Because of this theorem, in many cases the assumption of a normal distribution is sufficiently good, and tests based on this assumption give satisfactory results. Sometimes statements are made that:
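A simulation sketch of the theorem: standardized sums of uniform variables are compared with normal quantiles (n and the simulation size are illustrative assumptions):

```python
# Sketch: central limit theorem by simulation; sums of n i.i.d. uniforms vs the normal.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, n_sim = 50, 100_000                       # illustrative values

sums = rng.uniform(0.0, 1.0, size=(n_sim, n)).sum(axis=1)
z = (sums - n * 0.5) / np.sqrt(n / 12.0)     # standardize with the exact mean and variance

for q in (0.05, 0.5, 0.95):
    print(q, np.quantile(z, q), norm.ppf(q)) # empirical vs normal quantiles
```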

  11. Exponential family The exponential family of distributions has the form: f(x;θ) = exp{A(θ)B(x) + C(x) + D(θ)}. Many distributions are special cases of this family. The natural exponential family of distributions is the subclass with B(x) = x: f(x;θ) = exp{A(θ)x + C(x) + D(θ)}, where A(θ) is the natural parameter. Using the fact that the distribution must be normalised, the characteristic function of the natural exponential family with natural parameter A(θ) = θ can be derived to be: φ(t) = exp{D(θ) - D(θ+it)}. Try to derive it. Hint: use the normalisation factor - find D(θ) and then use the expression of the characteristic function together with D. This family is used for fitting generalised linear models.
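As an illustration, the Poisson pmf can be written in the natural exponential family form used above, f(x;θ) = exp{θx + C(x) + D(θ)} with θ = log λ, C(x) = -log(x!) and D(θ) = -e^θ; a minimal sketch (λ = 3 is an illustrative value, and the parameterization follows the slide's notation):

```python
# Sketch: Poisson pmf written in natural exponential family form and checked against scipy.
from math import lgamma, exp, log
from scipy.stats import poisson

lam = 3.0                       # illustrative value
theta = log(lam)                # natural parameter
for x in range(8):
    C = -lgamma(x + 1)          # -log(x!)
    D = -exp(theta)             # -lambda, the normalizing term
    print(x, exp(theta * x + C + D), poisson.pmf(x, lam))   # the two columns agree
```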

  12. Exponential family: Examples Many well-known distributions belong to this family (all distributions mentioned in this lecture are from the exponential family): Binomial, Poisson, Gamma, Normal.

  13. Continuous distributions: 2 Normal variables are called standardized if their mean is 0 and variance is 1. Sum of n standardized normal random variables is 2 with n degrees of freedom. Density function is: If there are p linear restraints on the random variables then degree of freedom becomes n-p. Characteristic function for this distribution is: 2is used widely in statistics for such tests as goodness of fit of model to experiment.

  14. Continuous distributions: t and F-distributions Two more distributions are closely related to the normal distribution. We will describe them when we discuss samples and sampling distributions. One of them is Student's t-distribution. It is used to test whether the mean value of a sample is significantly different from 0; a similar application is testing whether the means of two different samples differ. Fisher's F-distribution is the distribution of the ratio of the variances of two different samples. It is used to test whether their variances are different. One of its important applications is in ANOVA.
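A small sketch of both tests with scipy.stats; the data are simulated and all parameters are illustrative assumptions (the two-sided F p-value convention used here is one common choice, not the only one):

```python
# Sketch: one-sample t-test and an F-ratio test for equal variances.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=1.0, size=30)   # illustrative samples
y = rng.normal(loc=0.0, scale=2.0, size=40)

# Student's t: is the mean of x significantly different from 0?
print(stats.ttest_1samp(x, popmean=0.0))

# Fisher's F: ratio of sample variances, compared against the F distribution
f = np.var(x, ddof=1) / np.var(y, ddof=1)
p = 2 * min(stats.f.cdf(f, len(x) - 1, len(y) - 1),
            stats.f.sf(f, len(x) - 1, len(y) - 1))
print(f, p)
```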

  15. References Johnson, N.L. & Kotz, S. (1969, 1970, 1972) Distributions in Statistics, I: Discrete Distributions; II, III: Continuous Univariate Distributions; IV: Continuous Multivariate Distributions. Houghton Mifflin, New York. Mardia, K.V. & Jupp, P.E. (2000) Directional Statistics. John Wiley & Sons. Jaynes, E.T. (2003) Probability Theory: The Logic of Science. Cambridge University Press.
