Nearest Neighbor Machine Translation
Urvashi Khandelwal, Angela Fan, Dan Jurafsky, Luke Zettlemoyer, Mike Lewis. arXiv 2020
[Paper]
We introduce k-nearest-neighbor machine translation (kNN-MT), which
predicts tokens with a nearest neighbor classifier over a large datastore of
cached examples, using representations from a neural translation model for
similarity search. This approach requires no additional training and scales to
give the decoder direct access to billions of examples at test time, resulting
in a highly expressive model that consistently improves performance across many
settings. Simply adding nearest neighbor search improves a state-of-the-art
German-English translation model by 1.5 BLEU. kNN-MT allows a single model to
be adapted to diverse domains by using a domain-specific datastore, improving
results by an average of 9.2 BLEU over zero-shot transfer, and achieving new
state-of-the-art results – without training on these domains. A massively
multilingual model can also be specialized for particular language pairs, with
improvements of 3 BLEU for translating from English into German and Chinese.
Qualitatively, kNN-MT is easily interpretable; it combines source and target
context to retrieve highly relevant examples.
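To make the decoding rule described in the abstract concrete, the sketch below builds a datastore of (decoder representation, next target token) pairs from parallel data, retrieves the k nearest cached contexts at each decoding step, and interpolates the resulting distribution with the base model's distribution. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the `encode_step` hook, the softmax temperature, and the interpolation weight `lam` are illustrative placeholders, and the paper uses FAISS rather than brute-force search to scale to billions of examples.

```python
import numpy as np

def build_datastore(examples, encode_step):
    """Cache one (decoder representation, next target token) pair per target position.

    `encode_step(src, prefix)` is a hypothetical hook into a trained MT model that
    returns the decoder hidden state used to predict the next token.
    """
    keys, values = [], []
    for src, tgt in examples:
        for t, token in enumerate(tgt):
            keys.append(encode_step(src, tgt[:t]))  # translation context representation
            values.append(token)                    # token that context actually produced
    return np.stack(keys), np.array(values)

def knn_distribution(query, keys, values, vocab_size, k=8, temperature=10.0):
    """Distribution over the vocabulary induced by the k nearest cached contexts."""
    dists = np.sum((keys - query) ** 2, axis=1)       # squared L2 distance to every key
    nearest = np.argpartition(dists, k)[:k]           # indices of the k closest keys
    weights = np.exp(-dists[nearest] / temperature)   # softmax over negative distances
    weights /= weights.sum()
    p_knn = np.zeros(vocab_size)
    for idx, w in zip(nearest, weights):
        p_knn[values[idx]] += w                       # aggregate weight per target token
    return p_knn

def next_token_probs(p_model, p_knn, lam=0.5):
    """Final prediction: fixed interpolation of the base model and kNN distributions."""
    return (1 - lam) * p_model + lam * p_knn
```

Because each key is conditioned on both the source sentence and the target prefix, retrieval is context-sensitive, which is what lets a single frozen model adapt to a new domain simply by swapping in a domain-specific datastore.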
Similar Work