
Where Are We In Embedding Spaces? A Comprehensive Analysis On Network Embedding Approaches For Recommender Systems

Sixiao Zhang, Hongxu Chen, Xiao Ming, Lizhen Cui, Hongzhi Yin, Guandong Xu. arXiv 2021 – 6 citations

[Paper]

Tags: Datasets, Distance Metric Learning, Evaluation, Recommender Systems

Hyperbolic space and hyperbolic embeddings are becoming a popular research topic for recommender systems. However, it is not clear under what circumstances hyperbolic space should be preferred. To fill this gap, this paper provides theoretical analysis and empirical results on when and where to use hyperbolic space and hyperbolic embeddings in recommender systems. Specifically, we answer which types of models and datasets are better suited to hyperbolic space, and which latent size to choose. We evaluate our answers by comparing the performance of Euclidean space and hyperbolic space across different latent space models, in both the general item recommendation domain and the social recommendation domain, on six widely used datasets and with varying latent sizes. Additionally, we propose a new metric learning based recommendation method, SCML, and its hyperbolic version, HSCML. We validate our conclusions regarding hyperbolic space on SCML and demonstrate the state-of-the-art performance of hyperbolic space by comparing HSCML with other baseline methods.
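To make the Euclidean-versus-hyperbolic comparison concrete, the sketch below contrasts the standard Euclidean distance with the distance in the Poincaré ball model of hyperbolic space, which is one common way to realize hyperbolic embeddings. This is only an illustrative sketch of the general idea, not the paper's SCML/HSCML implementation; the function names and the toy user/item vectors are hypothetical.

```python
import numpy as np

def euclidean_distance(u, v):
    """Standard Euclidean distance between two embedding vectors."""
    return np.linalg.norm(u - v)

def poincare_distance(u, v, eps=1e-9):
    """Distance in the Poincare ball model of hyperbolic space.

    Assumes ||u|| < 1 and ||v|| < 1 (points inside the unit ball):
    d(u, v) = arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_diff / (denom + eps))

# Toy user/item embeddings with latent size 2 (hypothetical values).
user = np.array([0.1, 0.2])
item_a = np.array([0.15, 0.25])   # close to the user
item_b = np.array([0.85, -0.40])  # near the ball boundary

for name, item in [("item_a", item_a), ("item_b", item_b)]:
    print(name,
          "euclidean:", round(euclidean_distance(user, item), 4),
          "hyperbolic:", round(poincare_distance(user, item), 4))
```

Points near the boundary of the ball end up much farther away in hyperbolic distance than in Euclidean distance, which is why hyperbolic embeddings are often considered for data with hierarchical or power-law structure.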

Similar Work