
Simplified TinyBERT: Knowledge Distillation for Document Retrieval

Xuanang Chen, Ben He, Kai Hui, Le Sun, Yingfei Sun. arXiv 2020 – 7 citations

Tags: Efficiency, Evaluation, Text Retrieval

Despite the effectiveness of utilizing the BERT model for document ranking, the high computational cost of such approaches limits their use. To this end, this paper first empirically investigates the effectiveness of two knowledge distillation models on the document ranking task. In addition, on top of the recently proposed TinyBERT model, two simplifications are proposed. Evaluations on two different and widely-used benchmarks demonstrate that Simplified TinyBERT with the proposed simplifications not only boosts TinyBERT, but also significantly outperforms BERT-Base while providing a 15× speedup.
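For context, below is a minimal PyTorch sketch of the standard soft-label knowledge distillation objective that approaches like this build on: a student ranker is trained to match the teacher's softened relevance distribution in addition to the gold labels. The function name, hyperparameter values, and two-class relevance setup are illustrative assumptions, not the authors' implementation; TinyBERT-style methods additionally distill intermediate-layer representations, which this sketch omits.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Soft-label knowledge distillation loss (Hinton et al., 2015).

    Combines a KL term pushing the student's softened relevance
    distribution toward the teacher's with the usual cross-entropy
    on the gold relevance labels. `temperature` and `alpha` are
    illustrative hyperparameters, not values from the paper.
    """
    # Soften both output distributions with the temperature.
    soft_teacher = F.log_softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between softened distributions; the T^2 factor
    # keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(soft_student, soft_teacher, log_target=True,
                  reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the hard relevance labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In a document-ranking setting, `student_logits` and `teacher_logits` would be the per-candidate relevance scores produced by the distilled model and the full BERT ranker for the same query-document pairs.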

Similar Work