Decontamination of Mutual Contamination Models

Julian Katz-Samuels, Gilles Blanchard, Clayton Scott.

Year: 2019, Volume: 20, Issue: 41, Pages: 1−57


Abstract

Many machine learning problems can be characterized by mutual contamination models. In these problems, one observes several random samples from different convex combinations of a set of unknown base distributions, and the goal is to infer these base distributions. This paper considers the general setting where the base distributions are defined on arbitrary probability spaces. We examine three popular machine learning problems that arise in this general setting: multiclass classification with label noise, demixing of mixed membership models, and classification with partial labels. In each case, we give sufficient conditions for identifiability and present algorithms for the infinite and finite sample settings, with associated performance guarantees.
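For concreteness, the "convex combinations of unknown base distributions" in the abstract can be written out as follows. This is a minimal sketch using our own notation (base distributions P_1, ..., P_L, contaminated distributions P~_1, ..., P~_M, and mixing weights pi_ij), not notation quoted from the paper itself.

\[
  \tilde{P}_i \;=\; \sum_{j=1}^{L} \pi_{ij}\, P_j,
  \qquad \pi_{ij} \ge 0, \qquad \sum_{j=1}^{L} \pi_{ij} = 1,
  \qquad i = 1, \dots, M,
\]

where the $P_j$ are the unknown base distributions on an arbitrary probability space, each observed sample is drawn from one of the contaminated distributions $\tilde{P}_i$, and the inference goal is to recover the $P_j$ (and, depending on the problem, the mixing weights $\pi_{ij}$) from those samples.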
