S2SD: Simultaneous Similarity-based Self-distillation For Deep Metric Learning

Karsten Roth, Timo Milbich, Björn Ommer, Joseph Paul Cohen, Marzyeh Ghassemi. arXiv 2020 – 4 citations

[Code] [Paper]
Tags: Distance Metric Learning · Evaluation · Few Shot & Zero Shot

Deep Metric Learning (DML) provides a crucial tool for visual similarity and zero-shot applications by learning generalizing embedding spaces, although recent work in DML has shown strong performance saturation across training objectives. However, generalization capacity is known to scale with the embedding space dimensionality. Unfortunately, high-dimensional embeddings also create higher retrieval costs for downstream applications. To remedy this, we propose Simultaneous Similarity-based Self-distillation (S2SD). S2SD extends DML with knowledge distillation from auxiliary, high-dimensional embedding and feature spaces to leverage complementary context during training, while retaining test-time cost and with negligible changes to the training time. Experiments and ablations across different objectives and standard benchmarks show that S2SD offers notable improvements of up to 7% in Recall@1, while also setting a new state of the art. Code available at https://github.com/MLforHealth/S2SD.
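The key mechanism is distilling relational knowledge, i.e. pairwise similarities within a batch, from a higher-dimensional auxiliary embedding space into the lower-dimensional base embedding during training. Below is a minimal PyTorch sketch of this similarity-based self-distillation step under stated assumptions: the single auxiliary head, the layer names, and the temperature are illustrative choices, not the paper's exact configuration (the abstract indicates multiple auxiliary embedding and feature spaces are used).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class S2SDSketch(nn.Module):
    """Minimal sketch of similarity-based self-distillation.

    A low-dimensional base head is trained alongside a higher-dimensional
    auxiliary head; the auxiliary branch's batch similarity distribution
    serves as a (detached) teacher for the base branch. Head sizes and
    names are hypothetical, not the paper's exact setup.
    """

    def __init__(self, feat_dim=512, base_dim=128, aux_dim=1024, temperature=1.0):
        super().__init__()
        self.base_head = nn.Linear(feat_dim, base_dim)  # used at test time
        self.aux_head = nn.Linear(feat_dim, aux_dim)    # train-time only
        self.temperature = temperature

    def distill_loss(self, features):
        # L2-normalized embeddings from both heads
        z_base = F.normalize(self.base_head(features), dim=1)
        z_aux = F.normalize(self.aux_head(features), dim=1)

        # Batch cosine-similarity matrices (B x B)
        sim_base = z_base @ z_base.t() / self.temperature
        sim_aux = z_aux @ z_aux.t() / self.temperature

        # KL divergence between row-wise similarity distributions;
        # the auxiliary branch is detached so it acts as the teacher.
        log_p = F.log_softmax(sim_base, dim=1)
        q = F.softmax(sim_aux.detach(), dim=1)
        return F.kl_div(log_p, q, reduction="batchmean")
```

In the full method, the base and auxiliary heads are also trained with the underlying DML objective, and only the base head is kept for inference, which is why test-time retrieval cost is unchanged.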

Similar Work