M. Chi, H. He & W. Zhang; JMLR W&CP 20:33–46, 2011.
Nonlinear Online Classiﬁcation Algorithm with Probability Margin
Nonlinear online learning algorithms usually need to store a set of misclassified observed examples in order to compute kernel values. For large-scale problems, this is not only time consuming but can also lead to out-of-memory failures. In this paper, a nonlinear online classification algorithm with a probability margin is proposed to address this problem. In particular, the discriminant function is defined by a Gaussian mixture model built from statistical information summarizing all observed examples, rather than from the stored data points themselves. This generative model is then used to train a nonlinear online classifier with confidence, where the margin is defined in terms of probability. As a result, internal memory usage is significantly reduced while classification performance is preserved. We also prove mistake bounds in terms of the generative model. Experiments on one synthetic and two real large-scale data sets validate the effectiveness of the proposed algorithm.
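The core idea of replacing stored examples with summary statistics can be illustrated with a minimal sketch. The code below is not the paper's algorithm: it uses one diagonal Gaussian per class (a degenerate mixture), updated online with Welford's running-statistics method, so memory stays constant regardless of stream length. The class name, the log-density scoring, and the definition of the margin as the gap between the two best class scores are all illustrative assumptions.

```python
import numpy as np

class OnlineGaussianClassifier:
    """Illustrative sketch: one diagonal Gaussian per class maintained as
    running statistics (Welford's algorithm). Memory is O(classes * dim),
    independent of how many examples have been observed."""

    def __init__(self, dim):
        self.dim = dim
        self.stats = {}  # label -> [count, mean, M2] running sufficient statistics

    def update(self, x, label):
        # Incorporate one example into the class statistics; nothing is stored.
        if label not in self.stats:
            self.stats[label] = [0, np.zeros(self.dim), np.zeros(self.dim)]
        s = self.stats[label]
        s[0] += 1
        delta = x - s[1]
        s[1] = s[1] + delta / s[0]
        s[2] = s[2] + delta * (x - s[1])

    def log_density(self, label, x):
        # Diagonal-Gaussian log-density; variance is regularized so a class
        # with few examples cannot produce a degenerate score.
        n, mean, M2 = self.stats[label]
        var = M2 / max(n - 1, 1) + 1e-6
        return -0.5 * float(np.sum((x - mean) ** 2 / var + np.log(2 * np.pi * var)))

    def predict(self, x):
        # A probability-based margin: gap between the best and second-best
        # class log-densities (illustrative definition, not the paper's).
        scores = sorted(((self.log_density(c, x), c) for c in self.stats),
                        reverse=True)
        label = scores[0][1]
        margin = scores[0][0] - scores[1][0] if len(scores) > 1 else float("inf")
        return label, margin
```

In a margin-driven online loop one would call `update` only when the predicted margin falls below a threshold (or the label is wrong), mirroring how confidence-based online learners spend updates on uncertain examples.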