
Enhanced EM Algorithm for Global Optimality in Gaussian Mixture Models

The Enhanced EM (EEM) algorithm improves on the traditional EM algorithm for Gaussian mixture models (GMM) by addressing its convergence issues. It uses a uniform distribution for initialization, which in the authors' extensive experiments achieves the global optimum and yields repeatable solutions. The algorithm also incorporates a perturbation technique to avoid singularities in computation, enhancing stability. Using histograms as input, the EEM algorithm shows strong performance in extensive experiments, overcoming the local-maximum and singularity problems typical of conventional EM.


Presentation Transcript


  1. Enhanced EM (EEM) Algorithm. G. R. Xuan¹, Y. Q. Shi², P. Chai¹, P. Sutthiwan² — ¹Tongji University, Shanghai, China; ²NJIT, New Jersey, USA. ICPR 2012

  2. Conventional EM algorithm • The EM algorithm is a powerful tool for unsupervised learning of the Gaussian mixture model (GMM) [Dempster et al. 77]. • Its convergence has been mathematically proved. • However, it may converge to a local maximum. • It may also occasionally suffer from singularity.
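The conventional EM iteration the slide refers to alternates an E-step (computing each sample's responsibilities under the current components) with an M-step (re-estimating weights, means, and variances). A minimal 1-D sketch, not the paper's implementation — the function and parameter names are illustrative:

```python
import numpy as np

def em_gmm_1d(x, means, variances, weights, n_iter=100):
    """Classic EM for a 1-D Gaussian mixture.

    x: data samples; means/variances/weights: initial guesses,
    one entry per mixture component.
    """
    x = np.asarray(x, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = np.asarray(weights, dtype=float)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | sample i)
        diff = x[:, None] - means[None, :]
        dens = weights * np.exp(-0.5 * diff**2 / variances) \
               / np.sqrt(2.0 * np.pi * variances)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        nj = r.sum(axis=0)
        weights = nj / x.size
        means = (r * x[:, None]).sum(axis=0) / nj
        variances = (r * (x[:, None] - means[None, :])**2).sum(axis=0) / nj
    return means, variances, weights
```

The result depends on the initial guesses, which is exactly the local-maximum issue the slide points out: different starts can converge to different mixtures.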

  3. First novelty • The uniform distribution (hence maximum entropy) is proposed as the initial condition. • In our extensive experimental work, the EEM algorithm achieves global optimality. • If a particular (fixed) uniform distribution is used as the initial condition, the solution is globally optimal and repeatable.
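The slide does not spell out how the uniform initial condition is realized. One plausible, deterministic reading — offered only as a sketch, not the paper's exact scheme — is to spread the initial component means evenly over the data range with equal weights and a common variance, so that every run starts from the same maximum-entropy configuration and is therefore repeatable:

```python
import numpy as np

def uniform_init(x, k):
    """Deterministic 'uniform' initialization for a k-component 1-D GMM.

    Illustrative reading of the slide: means evenly spaced over the data
    range, equal weights, common variance tied to the spacing.
    """
    x = np.asarray(x, dtype=float)
    lo, hi = float(x.min()), float(x.max())
    step = (hi - lo) / k
    means = lo + (np.arange(k) + 0.5) * step   # bin-center spacing
    variances = np.full(k, step**2)            # one spacing as std. scale
    weights = np.full(k, 1.0 / k)              # equal priors
    return means, variances, weights
```

Because nothing here is random, running EM from this start always reproduces the same solution, which is the repeatability the slide claims for the fixed uniform initialization.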

  4. Second novelty • Singularity avoidance by perturbation. • That is, wherever a computation may become singular, e.g., 1/x, log(x), or C⁻¹, we propose to add a small positive value ε. • Normally ε = 10⁻²⁰. • For example, we use: • 1/(x+ε) • log(x+ε) • (C+εI)⁻¹
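The perturbation idea translates directly into code: guard each potentially singular operation by adding ε (or εI for a covariance matrix) before evaluating it. A minimal sketch, with helper names of my own choosing:

```python
import numpy as np

EPS = 1e-20  # the small positive value from the slide

def safe_recip(x):
    # 1/(x + eps): never divides by exactly zero
    return 1.0 / (x + EPS)

def safe_log(x):
    # log(x + eps): finite even at x = 0
    return np.log(x + EPS)

def safe_inv_cov(C):
    # (C + eps*I)^(-1): regularize the covariance before inversion
    C = np.asarray(C, dtype=float)
    return np.linalg.inv(C + EPS * np.eye(C.shape[0]))
```

The regularized results are enormous when the input is truly singular (e.g. a zero covariance), but they stay finite, so the EM iteration can continue instead of crashing.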

  5. Others • The histogram is used as input. • G. Xuan et al., "EM algorithm of Gaussian mixture model and hidden Markov model," ICIP 2001. • Performance on GMM is good.
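Using a histogram as input amounts to running EM on the bin centers with each center weighted by its bin count, which is much cheaper than iterating over every raw sample. A sketch of this weighted variant, assuming the same 1-D setting as above (parameter names are illustrative, not from the paper):

```python
import numpy as np

def em_gmm_histogram(centers, counts, means, variances, weights, n_iter=100):
    """EM for a 1-D GMM driven by a histogram: each bin center acts as a
    data point weighted by its count."""
    centers = np.asarray(centers, dtype=float)
    counts = np.asarray(counts, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = np.asarray(weights, dtype=float)
    n = counts.sum()
    for _ in range(n_iter):
        # E-step on bin centers
        diff = centers[:, None] - means[None, :]
        dens = weights * np.exp(-0.5 * diff**2 / variances) \
               / np.sqrt(2.0 * np.pi * variances)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weight responsibilities by the bin counts
        rw = r * counts[:, None]
        nj = rw.sum(axis=0)
        weights = nj / n
        means = (rw * centers[:, None]).sum(axis=0) / nj
        variances = (rw * (centers[:, None] - means[None, :])**2).sum(axis=0) / nj
    return means, variances, weights
```

The per-iteration cost now scales with the number of bins rather than the number of samples, which is the practical appeal of histogram input for large data sets.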

  6. Note • With a fixed sharp Gaussian distribution as initialization, five different solutions are obtained. • With a non-fixed (stochastically selected) uniform distribution initialization, multiple solutions may occasionally result. • With a fixed uniform distribution as initialization, the solution is optimal and repeatable.
