The EM Algorithm is Adaptively-Optimal for Unbalanced Symmetric Gaussian Mixtures

Nir Weinberger, Guy Bresler; 23(103):1−79, 2022.

Abstract

This paper studies the problem of estimating the means $\pm\theta_{*}\in\mathbb{R}^{d}$ of a symmetric two-component Gaussian mixture $\delta_{*}N(\theta_{*},I)+(1-\delta_{*})N(-\theta_{*},I)$, where the weights $\delta_{*}$ and $1-\delta_{*}$ are unequal. Assuming that $\delta_{*}$ is known, we show that the population version of the EM algorithm globally converges if the initial estimate has non-negative inner product with the mean of the larger-weight component. This can be achieved by the trivial initialization $\theta_{0}=0$. For the empirical iteration based on $n$ samples, we show that when initialized at $\theta_{0}=0$, the EM algorithm adaptively achieves the minimax error rate $\tilde{O}\Big(\min\Big\{\frac{1}{(1-2\delta_{*})}\sqrt{\frac{d}{n}},\,\frac{1}{\|\theta_{*}\|}\sqrt{\frac{d}{n}},\,\big(\tfrac{d}{n}\big)^{1/4}\Big\}\Big)$ in no more than $O\Big(\frac{1}{\|\theta_{*}\|(1-2\delta_{*})}\Big)$ iterations (with high probability). We also consider the EM iteration for estimating the weight $\delta_{*}$, assuming a fixed mean $\theta$ (which is possibly mismatched to $\theta_{*}$). For the empirical iteration based on $n$ samples, we show that the minimax error rate $\tilde{O}\Big(\frac{1}{\|\theta_{*}\|}\sqrt{\frac{d}{n}}\Big)$ is achieved in no more than $O\Big(\frac{1}{\|\theta_{*}\|^{2}}\Big)$ iterations. These results robustify and complement recent results of Wu and Zhou (2019) obtained for the equal-weights case $\delta_{*}=1/2$.
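For concreteness, the empirical EM iteration for the mean in this model can be sketched as follows. This is a minimal illustration, not the paper's code: the variable names and simulation parameters (`d`, `n`, `delta`, `theta_star`, the number of iterations) are my own choices. For the symmetric mixture with identity covariance and known weight $\delta_{*}$, the E-step posterior weight of the $+\theta$ component reduces to a sigmoid of $2\langle x,\theta\rangle+\log\frac{\delta_{*}}{1-\delta_{*}}$, and the M-step averages the signed samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (my choices, not from the paper)
d, n = 5, 20000
delta = 0.7                            # known, unequal weight (delta != 1/2)
theta_star = np.ones(d) / np.sqrt(d)   # true mean with ||theta_star|| = 1

# Sample from delta*N(theta_star, I) + (1 - delta)*N(-theta_star, I)
signs = np.where(rng.random(n) < delta, 1.0, -1.0)
X = signs[:, None] * theta_star + rng.standard_normal((n, d))

def em_step(theta, X, delta):
    # E-step: posterior probability of the +theta component; with unit
    # covariance the Gaussian likelihood ratio is exp(2 <x, theta>)
    logits = 2.0 * X @ theta + np.log(delta / (1.0 - delta))
    w = 1.0 / (1.0 + np.exp(-logits))
    # M-step: average of x_i weighted by +/- posterior signs
    return ((2.0 * w - 1.0)[:, None] * X).mean(axis=0)

theta = np.zeros(d)                    # trivial initialization theta_0 = 0
for _ in range(50):
    theta = em_step(theta, X, delta)

err = np.linalg.norm(theta - theta_star)
print(f"estimation error after EM: {err:.3f}")
```

Note that at $\theta_{0}=0$ the posterior weights all equal $\delta$, so the first iterate is $(2\delta-1)\bar{x}$, which has positive inner product with the larger-weight component's mean in expectation; this is why the trivial initialization suffices when the weights are unequal.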

© JMLR 2022.