The EM Algorithm is Adaptively-Optimal for Unbalanced Symmetric Gaussian Mixtures

Nir Weinberger, Guy Bresler

Research output: Contribution to journal › Article › peer-review

Abstract

This paper studies the problem of estimating the means ±θ ∈ ℝ^d of a symmetric two-component Gaussian mixture δN(θ, I_d) + (1 − δ)N(−θ, I_d), where the weights δ and 1 − δ are unequal. Assuming that δ is known, we show that the population version of the EM algorithm globally converges if the initial estimate has a non-negative inner product with the mean of the larger-weight component. This can be achieved by the trivial initialization θ_0 = 0. For the empirical iteration based on n samples, we show that when initialized at θ_0 = 0, the EM algorithm adaptively achieves the minimax error rate O(min{ (1/(1 − 2δ))√(d/n), (1/‖θ‖)√(d/n), (d/n)^{1/4} }) in no more than O(1/(‖θ‖(1 − 2δ))) iterations (with high probability). We also consider the EM iteration for estimating the weight δ, assuming a fixed mean θ (which is possibly mismatched to the true mean). For the empirical iteration based on n samples, we show that the minimax error rate Õ((1/‖θ‖)√(d/n)) is achieved in no more than O(1/‖θ‖²) iterations. These results robustify and complement recent results of Wu and Zhou (2019) obtained for the equal-weights case δ = 1/2.
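For this symmetric mixture with known weight δ, the EM iterations described in the abstract take a simple closed form: the mean update is θ_{t+1} = (1/n) Σ_i tanh(⟨x_i, θ_t⟩ + ½ log(δ/(1 − δ))) x_i, and the weight update averages the posterior component probabilities under a fixed mean. The sketch below is my own numpy illustration of these two iterations (function names, iteration counts, and the simulation setup are assumptions, not from the paper), using the trivial initialization θ_0 = 0 highlighted in the abstract:

```python
import numpy as np

def em_mean(X, delta, n_iter=50):
    """EM for the mean of delta*N(theta, I) + (1-delta)*N(-theta, I),
    with the weight delta known. Initialized at theta_0 = 0."""
    n, d = X.shape
    b = 0.5 * np.log(delta / (1.0 - delta))  # prior log-odds of the components
    theta = np.zeros(d)                      # trivial initialization theta_0 = 0
    for _ in range(n_iter):
        # E-step: for labels z_i in {+1, -1}, E[z_i | x_i] = tanh(<x_i, theta> + b)
        posterior_mean_z = np.tanh(X @ theta + b)
        # M-step: theta_{t+1} = (1/n) * sum_i E[z_i | x_i] * x_i
        theta = (posterior_mean_z[:, None] * X).mean(axis=0)
    return theta

def em_weight(X, theta, n_iter=50, delta0=0.5):
    """EM for the weight delta under a fixed (possibly mismatched) mean theta."""
    a = X @ theta
    delta = delta0
    for _ in range(n_iter):
        # E-step: posterior probability of the +theta component for each sample
        w = 1.0 / (1.0 + ((1.0 - delta) / delta) * np.exp(-2.0 * a))
        # M-step: new weight is the average posterior probability
        delta = w.mean()
    return delta
```

As a quick check, simulating n samples with an unbalanced weight (e.g. δ = 0.7) and running `em_mean` from θ_0 = 0 recovers the planted mean up to the sampling error; `em_weight` likewise recovers δ when given the true mean.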

Original language: English
Journal: Journal of Machine Learning Research
Volume: 23
State: Published - 2022

Keywords

  • Gaussian mixtures
  • expectation-maximization
  • finite-sample guarantees
  • global convergence
  • parameter estimation

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Statistics and Probability
  • Artificial Intelligence
