A gentle introduction to the Gaussian distribution.

Review: random variables. A random variable X assigns a number to each outcome of an experiment; in the coin-flip experiment, for example, X = 0 (tails) and X = 1 (heads).

Review: probability mass function (discrete). A PMF P(x) must satisfy P(x) >= 0 for every x. Any other constraints? Hint: what is the sum? The probabilities must sum to 1; in the coin-flip example, P(0) + P(1) = 1.
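The two PMF constraints can be checked mechanically. A minimal sketch, using a fair coin (the equal probabilities are an illustrative assumption, not stated on the slide):

```python
# PMF of a coin flip: P(x) for x in {0, 1}.
# A fair coin (0.5 / 0.5) is assumed purely for illustration.
pmf = {0: 0.5, 1: 0.5}

# Constraint 1: every probability is non-negative.
assert all(p >= 0 for p in pmf.values())

# Constraint 2: the probabilities sum to 1.
assert abs(sum(pmf.values()) - 1.0) < 1e-12

print("valid PMF")
```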
The distribution of the sum of N i.i.d. random variables becomes increasingly Gaussian as N grows.
Example: N uniform [0,1] random variables.
S_N = x_1 + x_2 + … + x_N

[Figure: histograms of S_N for N = 5, N = 10, and N = 40; the histogram looks increasingly Gaussian as N grows.]
For example, E[x] = μ: substituting z = x − μ, the term ∫ z N(z | 0, σ²) dz vanishes thanks to the anti-symmetry of z.
Given i.i.d. data X = (x_1, …, x_N), the log likelihood function is given by

ln p(X | μ, σ²) = −(1 / (2σ²)) Σ_{n=1}^{N} (x_n − μ)² − (N/2) ln σ² − (N/2) ln(2π).

Set the derivatives of the log likelihood function with respect to μ and σ² to zero, and solve to obtain

μ_ML = (1/N) Σ_{n=1}^{N} x_n,   σ²_ML = (1/N) Σ_{n=1}^{N} (x_n − μ_ML)².
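The closed-form estimates above can be sketched directly. A minimal example, with synthetic data drawn from N(2, 0.5²) purely for illustration:

```python
import random

# Synthetic i.i.d. data from a Gaussian with mu=2.0, sigma=0.5 (illustrative).
rng = random.Random(1)
data = [rng.gauss(2.0, 0.5) for _ in range(10000)]

N = len(data)
# mu_ML = (1/N) sum x_n
mu_ml = sum(data) / N
# sigma2_ML = (1/N) sum (x_n - mu_ML)^2 -- note the 1/N (biased) estimator
sigma2_ml = sum((x - mu_ml) ** 2 for x in data) / N

print(f"mu_ML ~ {mu_ml:.3f}, sigma2_ML ~ {sigma2_ml:.3f}")
```

Note that σ²_ML uses 1/N rather than 1/(N − 1), so it is a biased estimate of the variance; for large N the difference is negligible.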
Mixture of two Gaussians
Old Faithful data set
Combine simple models into a complex model:

p(x) = Σ_{k=1}^{K} π_k N(x | μ_k, Σ_k),   with mixing coefficients π_k ≥ 0 and Σ_k π_k = 1.

The log likelihood, ln p(X | π, μ, Σ) = Σ_{n=1}^{N} ln Σ_{k=1}^{K} π_k N(x_n | μ_k, Σ_k), contains the log of a sum, so it has no closed-form maximum.
Determining the parameters μ, Σ, and π using maximum log likelihood.
Solution: use standard iterative numerical optimization methods, or the expectation maximization (EM) algorithm (Chapter 9).
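A sketch of the EM iteration for a two-component mixture, in one dimension for simplicity (so Σ_k reduces to a scalar variance); the synthetic data and starting values are illustrative assumptions, not from the slides:

```python
import math
import random

def normal_pdf(x, mu, var):
    """Density of N(x | mu, var) for scalar x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Synthetic 1-D data: two well-separated Gaussian clusters (illustrative).
rng = random.Random(0)
data = ([rng.gauss(-2.0, 1.0) for _ in range(500)]
        + [rng.gauss(3.0, 1.0) for _ in range(500)])

# Initial guesses for pi_k, mu_k, var_k (arbitrary starting point).
pi = [0.5, 0.5]
mu = [-1.0, 1.0]
var = [1.0, 1.0]

for _ in range(50):
    # E step: responsibilities gamma[n][k] = pi_k N(x_n) / sum_j pi_j N(x_n).
    gamma = []
    for x in data:
        w = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
        s = sum(w)
        gamma.append([wk / s for wk in w])
    # M step: re-estimate pi_k, mu_k, var_k from the responsibilities.
    for k in range(2):
        Nk = sum(g[k] for g in gamma)
        mu[k] = sum(g[k] * x for g, x in zip(gamma, data)) / Nk
        var[k] = sum(g[k] * (x - mu[k]) ** 2 for g, x in zip(gamma, data)) / Nk
        pi[k] = Nk / len(data)

print(f"pi ~ {pi[0]:.2f}/{pi[1]:.2f}, mu ~ {mu[0]:.2f}/{mu[1]:.2f}")
```

Each iteration is guaranteed not to decrease the log likelihood, which is why EM is the standard choice here even though the likelihood itself has no closed-form maximum.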