On Robustness Properties of Convex Risk Minimization Methods for Pattern Recognition
Andreas Christmann, Ingo Steinwart; 5(Aug):1007--1034, 2004.
Abstract
The paper brings together methods from two disciplines:
machine learning theory and robust statistics.
We argue that robustness is an important aspect of a learning
method, and we show that many existing machine learning methods
based on the convex risk minimization principle have, besides
other desirable properties, the advantage of being robust.
Robustness properties of machine learning methods based on
convex risk minimization are investigated for the problem of
pattern recognition. Assumptions are given under which the
influence function of the classifiers exists, and bounds on
the influence function are derived. Kernel logistic regression,
support vector machines, least squares, and the AdaBoost loss
function are treated as special cases. Some results on the
robustness of such methods are also obtained for two other
robustness criteria, the sensitivity curve and the maxbias.
A sensitivity analysis of the support vector machine is given.
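The sensitivity curve mentioned above is Tukey's finite-sample analogue of the influence function: SC_n(z) = (n+1) * (T(x_1,...,x_n, z) - T(x_1,...,x_n)), the rescaled change in the estimator when one contaminating point z is added to the sample. The following is a minimal illustrative sketch, not the paper's construction: it fits a one-dimensional regularized logistic regression (the linear-kernel special case of kernel logistic regression) by gradient descent and probes the fit with a single mislabelled point placed ever farther out. All numerical choices (lambda, step size, sample size, the grid of contamination points) are assumptions made for illustration.

```python
import math
import random

def sigmoid(t):
    # Numerically stable logistic function (avoids overflow in exp).
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    e = math.exp(t)
    return e / (1.0 + e)

def fit_logreg(points, lam=0.1, lr=0.1, steps=2000):
    """Minimize (1/n) sum_i log(1 + exp(-y_i * w * x_i)) + lam * w^2
    over w by plain gradient descent (illustrative settings)."""
    w = 0.0
    n = len(points)
    for _ in range(steps):
        g = 2.0 * lam * w
        for x, y in points:
            # d/dw of log(1 + exp(-y*w*x)) is -y*x*sigmoid(-y*w*x)
            g += -y * x * sigmoid(-y * w * x) / n
        w -= lr * g
    return w

random.seed(0)
# Clean sample: label y = sign(x) with x uniform on [-1, 1]
data = []
for _ in range(50):
    x = random.uniform(-1.0, 1.0)
    data.append((x, 1 if x >= 0 else -1))
w_clean = fit_logreg(data)

# Empirical sensitivity curve: add one wrongly labelled point at z
results = {}
for z in (1.0, 10.0, 100.0):
    w_cont = fit_logreg(data + [(z, -1)])
    results[z] = (len(data) + 1) * (w_cont - w_clean)
    print(f"z = {z:6.1f}  SC = {results[z]:8.4f}")
```

Because the logistic loss has a bounded derivative, the pull of the single contaminating point saturates as z grows, which is the kind of bounded-influence behavior the paper establishes in far greater generality for convex risk minimizers.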