
Expanding The Family Of Grassmannian Kernels An Embedding Perspective

Mehrtash T. Harandi, Mathieu Salzmann, Sadeep Jayasumana, Richard Hartley, Hongdong Li. arXiv 2014

[Paper]    
ARXIV Supervised

Modeling videos and image-sets as linear subspaces has proven beneficial for many visual recognition tasks. However, it also incurs challenges arising from the fact that linear subspaces do not obey Euclidean geometry, but lie on a special type of Riemannian manifold known as the Grassmannian. To leverage techniques developed for Euclidean spaces (e.g., support vector machines) with subspaces, several recent studies have proposed to embed the Grassmannian into a Hilbert space by making use of a positive definite kernel. Unfortunately, only two Grassmannian kernels are known, neither of which, as we will show, is universal, which limits their ability to approximate a target function arbitrarily well. Here, we introduce several positive definite Grassmannian kernels, including universal ones, and demonstrate their superiority over previously known kernels in various tasks, such as classification, clustering, sparse coding and hashing.
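To make the setup concrete, here is a minimal sketch of the two previously known positive definite Grassmannian kernels the abstract refers to, which (to the best of my understanding) are the projection kernel and the Binet-Cauchy kernel. A p-dimensional subspace of R^n is represented by an n x p matrix with orthonormal columns; the kernels the paper builds on are then simple functions of the product of two such bases. The function names and the specific test matrices below are illustrative choices, not from the paper.

```python
import numpy as np

def orthonormal_basis(A):
    # Represent the column span of A (an n x p matrix) as a point on the
    # Grassmannian by taking an orthonormal basis via QR decomposition.
    Q, _ = np.linalg.qr(A)
    return Q

def projection_kernel(X, Y):
    # Projection kernel: k(X, Y) = ||X^T Y||_F^2.
    # Invariant to the choice of orthonormal basis, positive definite.
    return np.linalg.norm(X.T @ Y, ord="fro") ** 2

def binet_cauchy_kernel(X, Y):
    # Binet-Cauchy kernel: k(X, Y) = det(X^T Y)^2.
    return np.linalg.det(X.T @ Y) ** 2

rng = np.random.default_rng(0)
X = orthonormal_basis(rng.standard_normal((10, 3)))  # 3-dim subspace of R^10
Y = orthonormal_basis(rng.standard_normal((10, 3)))

print(projection_kernel(X, Y))   # lies in [0, p]; equals p = 3 iff the subspaces coincide
print(projection_kernel(X, X))   # X^T X = I_3, so this is ||I_3||_F^2 = 3.0
print(binet_cauchy_kernel(X, X)) # det(I_3)^2 = 1.0
```

Both kernels feed directly into standard Euclidean machinery (e.g., an SVM with a precomputed Gram matrix); the paper's point is that neither is universal, motivating the new kernel family it introduces.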

Similar Work