AI Seminar

Laplacian Low-Rank Correction: A Generalizable Approach to Incorporating Manifold Regularization Across Applications

Zahi Karam

This work introduces Laplacian low-rank correction (LLRC), a manifold regularization method that applies generally to arbitrary data sets and learning problems, including embedding, clustering, classification, and regression. At the core of LLRC is the assumption that off-manifold noise is structured and confined to a low-dimensional linear subspace. LLRC leverages the graph Laplacian of a local similarity graph to construct a low-rank linear correction that preserves the original dimension of the feature space while eliminating the dominant off-manifold noise directions. In the resulting corrected space, the remaining variability represents the data manifold, whose principal directions can be used for embedding and upon which clustering, classification, and regression can be performed. A primary advantage of such a linear approach is that the correction extends easily out of sample. To account for non-linear structure, LLRC can be kernelized.
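The abstract does not spell out the construction, but the general recipe it describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes a Gaussian kNN similarity graph, a combinatorial Laplacian, and that the off-manifold noise directions are taken to be the top eigenvectors of the Laplacian quadratic form X^T L X; the function name, parameters, and these choices are all hypothetical.

```python
import numpy as np

def llrc_correction(X, k=10, r=2, sigma=1.0):
    """Hypothetical sketch of a Laplacian low-rank correction.

    X     : (n, d) data matrix, rows are samples.
    k     : number of nearest neighbours in the similarity graph.
    r     : assumed dimension of the off-manifold noise subspace.
    Returns a (d, d) linear correction that removes the r dominant
    noise directions while keeping the original feature dimension.
    """
    n, d = X.shape
    # Pairwise squared distances for a Gaussian similarity graph.
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-D2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Keep only the k strongest edges per node, then symmetrize.
    weakest = np.argsort(-W, axis=1)[:, k:]
    for i in range(n):
        W[i, weakest[i]] = 0.0
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(axis=1)) - W          # combinatorial graph Laplacian
    # Directions with the largest Laplacian quadratic form vary most
    # across graph edges; here we treat them as the dominant
    # off-manifold noise directions (an assumption, not the paper's
    # stated construction).
    M = X.T @ L @ X                          # symmetric (d, d)
    eigvals, eigvecs = np.linalg.eigh(M)
    V = eigvecs[:, -r:]                      # top-r noise directions
    P = np.eye(d) - V @ V.T                  # low-rank linear correction
    return P

# Usage: correct the data, then embed/cluster/classify in the
# corrected space; out-of-sample points reuse the same P.
# X_corrected = X @ P
```

Because the correction is a single linear map P, it extends to unseen points by matrix multiplication, which is the out-of-sample advantage the abstract highlights; a kernelized variant would apply the same idea in a feature space induced by a kernel.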

Sponsored by