Learning over Sets using Kernel Principal Angles
Lior Wolf, Amnon Shashua; 4(Oct):913-931, 2003.
Abstract
We consider the problem of learning with instances defined over a space of sets of vectors. We derive a new positive definite kernel f(A,B) defined over pairs of matrices A,B, based on the concept of principal angles between two linear subspaces. We show that the principal angles can be recovered using only inner products between pairs of column vectors of the input matrices, thereby allowing the original column vectors of A,B to be mapped onto arbitrarily high-dimensional feature spaces.
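The abstract does not spell out the exact form of f(A,B). A minimal sketch, assuming the similarity between two matrices is built from the cosines of the principal angles between their column spaces (computed here directly via QR and SVD rather than through the inner-product-only route the paper describes); the choice of combining them as a product of squared cosines is an assumption for illustration.

```python
import numpy as np

def principal_angle_cosines(A, B):
    """Cosines of the principal angles between the column spaces of A and B.

    A, B: (d, k) matrices whose columns span the two subspaces.
    """
    Qa, _ = np.linalg.qr(A)   # orthonormal basis for span(A)
    Qb, _ = np.linalg.qr(B)   # orthonormal basis for span(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    return np.linalg.svd(Qa.T @ Qb, compute_uv=False)

def matrix_kernel(A, B):
    """Illustrative set similarity: product of squared principal-angle cosines
    (an assumed form, not necessarily the paper's exact f(A,B))."""
    return float(np.prod(principal_angle_cosines(A, B) ** 2))

# Example: two random 3-dimensional subspaces of R^10.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
B = rng.standard_normal((10, 3))
print(matrix_kernel(A, A))   # 1.0 -- a subspace compared with itself
print(matrix_kernel(A, B))   # value in (0, 1]
```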
We demonstrate the use of the matrix-based kernel function f(A,B) with experiments on two visual tasks. The first task is the discrimination of "irregular" motion trajectories of an individual or a group of individuals in a video sequence. We use an SVM with f(A,B), where an input matrix represents the motion trajectories of a group of individuals over a certain (fixed) time frame. We show that this classification (irregular versus regular) greatly outperforms the conventional representation in which all the trajectories are concatenated into a single vector. The second application is the visual recognition of faces from input video sequences capturing head motion and facial expressions, where f(A,B) is used to compare two image sequences.
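A minimal sketch of how such a matrix kernel can drive an SVM classifier, using a precomputed Gram matrix with scikit-learn and the illustrative matrix_kernel above; the data, matrix sizes, and labels are hypothetical stand-ins, not the paper's experimental setup.

```python
import numpy as np
from sklearn.svm import SVC

def gram_matrix(mats_a, mats_b):
    """Gram matrix of the set kernel over two lists of (d, k) matrices."""
    return np.array([[matrix_kernel(A, B) for B in mats_b] for A in mats_a])

# Hypothetical data: each instance is a d x k matrix (random stand-ins here),
# e.g. a block of motion trajectories over a fixed time frame.
rng = np.random.default_rng(1)
train = [rng.standard_normal((10, 3)) for _ in range(20)]
labels = np.array([0] * 10 + [1] * 10)        # e.g. regular vs. irregular
test = [rng.standard_normal((10, 3)) for _ in range(5)]

clf = SVC(kernel="precomputed").fit(gram_matrix(train, train), labels)
pred = clf.predict(gram_matrix(test, train))  # rows: test items, cols: train items
```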