Support Vector Machines (SVMs) for regression problems are trained by solving a quadratic optimization problem which needs on the order of l² memory and time resources to solve, where l is the number of training examples. In this paper, we propose a decomposition algorithm, SVMTorch, which is similar to SVM-Light proposed by Joachims [5] for classification problems, but adapted to regression problems. With this algorithm, one can now efficiently solve large-scale regression problems (more than 20000 examples).
Comparisons with Nodelib, another publicly available SVM algorithm for large-scale regression problems from Flake and Lawrence [3], yielded significant time improvements.
Finally, based on a recent paper from Lin [9], we show that a convergence proof exists for our algorithm.
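To make the l² scaling concrete, a quick back-of-the-envelope calculation (a sketch assuming double-precision kernel entries stored densely, not code from the paper) shows why 20000 examples already strain a solver that materializes the full quadratic problem:

```python
# Illustration: memory needed for a dense l x l kernel matrix,
# assuming 8-byte (double-precision) entries.
def kernel_matrix_gb(l, bytes_per_entry=8):
    """Memory in GB to store the full l x l kernel matrix."""
    return l * l * bytes_per_entry / 1e9

# 20000 examples -> 20000^2 * 8 bytes = 3.2 GB, far beyond what was
# practical when this was written; decomposition methods avoid ever
# forming this matrix by optimizing over small working sets.
print(kernel_matrix_gb(20000))  # 3.2
```

This is the motivation for decomposition methods such as SVMTorch and SVM-Light: each iteration touches only a small subset of the l variables, so only a few rows of the kernel matrix are needed at a time.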