
Discrete Latent Factor Model For Cross-modal Hashing

Qing-Yuan Jiang, Wu-Jun Li. arXiv 2017 – 1 citation

Datasets Efficiency Hashing Methods Similarity Search

Due to its storage and retrieval efficiency, cross-modal hashing (CMH) has been widely used for cross-modal similarity search in multimedia applications. According to the training strategy, existing CMH methods can be divided into two main categories: relaxation-based continuous methods and discrete methods. In general, relaxation-based continuous methods train faster than discrete methods, but their accuracy is unsatisfactory. Conversely, discrete methods typically achieve better accuracy, but their training is time-consuming. In this paper, we propose a novel CMH method, called discrete latent factor model based cross-modal hashing (DLFH), for cross-modal similarity search. DLFH is a discrete method that can directly learn the binary hash codes for CMH, and at the same time its training is efficient. Experiments on real datasets show that DLFH achieves significantly better accuracy than existing methods, with training time comparable to that of relaxation-based continuous methods, which are much faster than existing discrete methods.
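The storage and retrieval efficiency that motivates CMH comes from representing items of every modality as short binary codes, so that cross-modal search reduces to Hamming-distance ranking. The sketch below illustrates this retrieval step only (not the DLFH learning algorithm itself); the codes are random stand-ins, and the array names and sizes are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_db, n_query, n_bits = 1000, 5, 32  # hypothetical sizes

# Hypothetical learned binary codes: image codes for the database
# and text codes for the queries (cross-modal retrieval).
db_codes = rng.integers(0, 2, size=(n_db, n_bits), dtype=np.uint8)
query_codes = rng.integers(0, 2, size=(n_query, n_bits), dtype=np.uint8)

def hamming_rank(query, database):
    """Rank database items by Hamming distance to one query code."""
    dists = np.count_nonzero(database != query, axis=1)  # bits that differ
    order = np.argsort(dists, kind="stable")
    return order, dists

order, dists = hamming_rank(query_codes[0], db_codes)
print("top-5 neighbours:", order[:5], "distances:", dists[order[:5]])
```

Because the codes are binary, each distance is an integer in [0, n_bits] and the whole scan is a vectorized bit comparison, which is why hashing-based search scales well compared with continuous-embedding search.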

Similar Work