CS Forum Talk: Learning with Cross-Kernel Matrices and Ideal PCA: Dr. Franz Kiraly, University College London, UK
When: 11.5.2015 at 13:15-14:00. Where: Lecture hall T2, CS Building. Speaker: Dr. Franz Kiraly, University College London, UK
Abstract:
We describe how cross-kernel matrices - that is, kernel matrices
between the data and a custom-chosen set of 'inducing points' - can be
used for general non-linear learning tasks.
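
As a rough illustration of the object in question (not code from the
talk): a cross-kernel matrix is simply the matrix of kernel
evaluations between the data and the inducing points. The Gaussian
kernel and the randomly drawn inducing points below are assumptions
made only for the sketch.

    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel

    rng = np.random.default_rng(0)

    X = rng.normal(size=(1000, 5))  # data: n points in d dimensions
    Z = rng.normal(size=(50, 5))    # inducing points: a custom-chosen set of m << n points

    # Cross-kernel matrix: kernel evaluations between data and inducing points.
    # It is n-by-m (non-square), unlike the usual n-by-n kernel matrix.
    C = rbf_kernel(X, Z, gamma=0.5)
    print(C.shape)  # (1000, 50)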
The main potential of cross-kernel matrices is twofold: (a) they
provide Nystrom-type speed-ups for kernel learning without relying on
subsampling, thus avoiding potential problems with sampling
degeneracy; as we show, cross-kernel learners preserve the usual
approximation guarantees and the attractive linear scaling of standard
Nystrom methods; (b) the use of non-square matrices for kernel
learning provides a non-linear generalization of the singular value
decomposition and of singular features.
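
To make (a) concrete, here is a minimal, self-contained sketch of the
standard Nystrom construction built from a cross-kernel matrix; the
regularized inverse square root and the random choice of inducing
points are assumptions, and the talk's cross-kernel learners may use
the matrix differently.

    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))   # data
    Z = rng.normal(size=(50, 5))     # inducing points (drawn at random here)

    C = rbf_kernel(X, Z, gamma=0.5)  # n-by-m cross-kernel matrix K(X, Z)
    W = rbf_kernel(Z, Z, gamma=0.5)  # m-by-m kernel matrix on the inducing points

    # Standard Nystrom approximation of the full n-by-n kernel matrix:
    #     K(X, X) ~= C @ pinv(W) @ C.T
    # Equivalently, the rows of Phi = C @ W^{-1/2} are explicit m-dimensional
    # features whose inner products approximate the kernel, so downstream
    # learning only touches n-by-m matrices and scales linearly in n.
    eigvals, eigvecs = np.linalg.eigh(W)
    inv_sqrt = eigvecs @ np.diag(1.0 / np.sqrt(np.clip(eigvals, 1e-12, None))) @ eigvecs.T
    Phi = C @ inv_sqrt
    print(Phi.shape)  # (1000, 50)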
We present a novel algorithm, Ideal PCA (IPCA), which can be seen as a
kernel variant of the singular value decomposition and which showcases
both advantages. We demonstrate on real and synthetic data that IPCA
allows one to (a) obtain kernel-PCA-like features faster and (b)
extract novel features that are empirically advantageous in
unsupervised manifold learning and supervised classification.
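
For illustration only, the following sketches the general idea of
obtaining kernel-PCA-like features from the singular value
decomposition of a non-square cross-kernel matrix; it is not the IPCA
algorithm presented in the talk, and the centering step and parameter
choices are assumptions.

    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))         # data
    Z = rng.normal(size=(50, 5))           # inducing points (drawn at random here)

    C = rbf_kernel(X, Z, gamma=0.5)        # n-by-m cross-kernel matrix
    C = C - C.mean(axis=0, keepdims=True)  # column-centering (an assumption)

    # Thin SVD of the cross-kernel matrix; the left singular vectors scaled
    # by the singular values give kernel-PCA-like coordinates without ever
    # forming or decomposing the full n-by-n kernel matrix.
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    k = 10                                 # number of features to keep
    features = U[:, :k] * s[:k]
    print(features.shape)                  # (1000, 10)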
Host: Louis Theran