A gentle introduction to the Gaussian distribution.

Review: random variables. Example: coin flip experiment, with outcomes X = 0 and X = 1; X is a random variable.

Review: probability mass function (discrete). P(x) >= 0 for all x. Any other constraints? Hint: what is the sum? The probabilities must sum to one. Example: coin flip experiment.
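The two pmf constraints above can be checked concretely. A minimal sketch (the function name `coin_pmf` and the bias parameter `p` are illustrative, not from the source):

```python
# Bernoulli pmf for a (possibly biased) coin: P(X=1) = p, P(X=0) = 1 - p.
def coin_pmf(x, p=0.5):
    return p if x == 1 else (1 - p) if x == 0 else 0.0

# The two defining properties of a pmf: non-negativity and summing to one.
assert all(coin_pmf(x) >= 0 for x in (0, 1))
assert coin_pmf(0) + coin_pmf(1) == 1.0
```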
The distribution of the sum of N i.i.d. random variables becomes increasingly Gaussian as N grows.
Example: N uniform [0,1] random variables.
$$s_N = x_1 + x_2 + \cdots + x_N$$

[Histograms of $s_N$ for N = 5, N = 10, and N = 40: the distribution looks increasingly Gaussian.]
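This convergence is easy to verify empirically. A small simulation sketch (sample sizes and the helper name `sum_of_uniforms` are my own choices): for the sum of N Uniform[0,1] variables the exact mean is N/2 and the exact variance is N/12, and the sample statistics should match closely.

```python
import random
import statistics

def sum_of_uniforms(n, trials=100_000, seed=0):
    """Draw `trials` samples of x_1 + ... + x_n with each x_i ~ Uniform[0, 1]."""
    rng = random.Random(seed)
    return [sum(rng.random() for _ in range(n)) for _ in range(trials)]

# Exact moments of the sum: mean = N/2, variance = N/12.
for n in (5, 10, 40):
    samples = sum_of_uniforms(n)
    mean = statistics.fmean(samples)
    var = statistics.pvariance(samples)
    print(f"N={n:2d}: mean~{mean:.2f} (exact {n/2}), var~{var:.2f} (exact {n/12:.2f})")
```

Plotting a histogram of `samples` for growing N reproduces the increasingly Gaussian shape described above.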
With the substitution z = x − μ, the term linear in z integrates to zero thanks to the anti-symmetry of z (an odd integrand over the real line), leaving E[x] = μ.
Given i.i.d. data $\mathbf{X} = \{x_1, \ldots, x_N\}$, the log likelihood function is given by

$$\ln p(\mathbf{X} \mid \mu, \sigma^2) = -\frac{1}{2\sigma^2} \sum_{n=1}^{N} (x_n - \mu)^2 - \frac{N}{2} \ln \sigma^2 - \frac{N}{2} \ln(2\pi)$$

Set the derivative of the log likelihood function to zero,

$$\frac{\partial}{\partial \mu} \ln p(\mathbf{X} \mid \mu, \sigma^2) = \frac{1}{\sigma^2} \sum_{n=1}^{N} (x_n - \mu) = 0$$

and solve to obtain

$$\mu_{\mathrm{ML}} = \frac{1}{N} \sum_{n=1}^{N} x_n, \qquad \sigma^2_{\mathrm{ML}} = \frac{1}{N} \sum_{n=1}^{N} (x_n - \mu_{\mathrm{ML}})^2$$
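The closed-form maximum likelihood estimates are one line of code each. A minimal sketch (the function name `gaussian_mle` and the synthetic data are my own, for illustration):

```python
import random

def gaussian_mle(data):
    """Closed-form ML estimates for a univariate Gaussian:
    mu_ML = (1/N) * sum(x_n), var_ML = (1/N) * sum((x_n - mu_ML)^2)."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n  # note the biased 1/N estimator
    return mu, var

# Sanity check on synthetic data drawn from N(2, 3^2).
rng = random.Random(42)
data = [rng.gauss(2.0, 3.0) for _ in range(50_000)]
mu_ml, var_ml = gaussian_mle(data)
print(f"mu_ML ~ {mu_ml:.2f}, sigma^2_ML ~ {var_ml:.2f}")
```

The estimates should land near the true values 2.0 and 9.0 for a sample this large.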
Mixture of two Gaussians
Old Faithful data set
Combine simple models into a complex model:

$$p(x) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k), \qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1$$

The log likelihood now contains the log of a sum, so it has no closed-form maximum.
Determining parameters μ, Σ, and π using maximum log likelihood
Solution: use standard, iterative, numeric optimization methods or the expectation maximization algorithm (Chapter 9).
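To make the EM idea concrete, here is a minimal sketch for a 1-D mixture of two Gaussians (the initialization scheme, helper names, and the synthetic bimodal data loosely mimicking the Old Faithful eruption durations are my own assumptions, not the book's implementation):

```python
import math
import random

def normal_pdf(x, mu, var):
    """Density of N(x | mu, var) for a univariate Gaussian."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_two_gaussians(data, iters=100):
    """Minimal EM sketch for a 1-D mixture of two Gaussians."""
    # Crude initialization: place the means at the data extremes.
    mu = [min(data), max(data)]
    mean = sum(data) / len(data)
    v0 = sum((x - mean) ** 2 for x in data) / len(data)
    var = [v0, v0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E step: responsibilities gamma[n][k] = p(component k | x_n).
        gamma = []
        for x in data:
            w = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(w)
            gamma.append([wk / s for wk in w])
        # M step: re-estimate pi, mu, var from the responsibilities.
        for k in range(2):
            nk = sum(g[k] for g in gamma)
            mu[k] = sum(g[k] * x for g, x in zip(gamma, data)) / nk
            var[k] = sum(g[k] * (x - mu[k]) ** 2 for g, x in zip(gamma, data)) / nk
            pi[k] = nk / len(data)
    return pi, mu, var

# Synthetic bimodal data: two clusters around 2.0 and 4.3.
rng = random.Random(1)
data = [rng.gauss(2.0, 0.3) for _ in range(300)] + \
       [rng.gauss(4.3, 0.4) for _ in range(300)]
pi, mu, var = em_two_gaussians(data)
print(pi, sorted(mu), var)
```

Each iteration alternates a soft assignment of points to components (E step) with weighted re-estimation of each component's parameters (M step), which is exactly the structure the chapter 9 reference develops in full.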