Nearest Neighbor Machine Translation is Meta-Optimizer on Output Projection Layer

22 May 2023
R. Gao
Zhirui Zhang
Yichao Du
Lemao Liu
Rui Wang
arXiv:2305.13034 · PDF · HTML
Abstract

Nearest Neighbor Machine Translation (kNN-MT) has achieved great success in domain adaptation tasks by integrating pre-trained Neural Machine Translation (NMT) models with domain-specific token-level retrieval. However, the reasons underlying its success have not been thoroughly investigated. In this paper, we comprehensively analyze kNN-MT through theoretical and empirical studies. Initially, we provide new insights into the working mechanism of kNN-MT as an efficient technique to implicitly execute gradient descent on the output projection layer of NMT, indicating that it is a specific case of model fine-tuning. Subsequently, we conduct multi-domain experiments and word-level analysis to examine the differences in performance between kNN-MT and entire-model fine-tuning. Our findings suggest that: (1) Incorporating kNN-MT with adapters yields comparable translation performance to fine-tuning on in-domain test sets, while achieving better performance on out-of-domain test sets; (2) Fine-tuning significantly outperforms kNN-MT on the recall of in-domain low-frequency words, but this gap could be bridged by optimizing the context representations with additional adapter layers.
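As background for the analysis, the sketch below shows the standard kNN-MT decoding step (Khandelwal et al., 2021) that the paper studies: a frozen NMT model is augmented with retrieval over a datastore of (decoder hidden state, target token) pairs, and the retrieval distribution is interpolated with the NMT distribution. This is a minimal NumPy sketch, not code from the paper; the function name and the hyperparameter defaults (k, temperature, lam) are illustrative assumptions.

```python
import numpy as np

def knn_mt_probs(h, datastore_keys, datastore_values, p_nmt,
                 k=8, temperature=10.0, lam=0.5):
    """Minimal sketch of standard kNN-MT interpolation (illustrative names).

    h               : (d,)   decoder hidden state for the current step
    datastore_keys  : (N, d) cached hidden states from the in-domain corpus
    datastore_values: (N,)   target-token ids paired with each key
    p_nmt           : (V,)   output distribution of the pre-trained NMT model
    """
    # Retrieve the k nearest datastore entries by squared L2 distance.
    dists = np.sum((datastore_keys - h) ** 2, axis=1)
    nn = np.argsort(dists)[:k]

    # Turn negative distances into weights over the retrieved entries.
    weights = np.exp(-dists[nn] / temperature)
    weights /= weights.sum()

    # Scatter the retrieval weights onto the vocabulary.
    p_knn = np.zeros_like(p_nmt)
    for w, tok in zip(weights, datastore_values[nn]):
        p_knn[tok] += w

    # Final kNN-MT prediction: interpolate retrieval and NMT distributions.
    return lam * p_knn + (1.0 - lam) * p_nmt
```

In the paper's framing, the p_knn term behaves like the effect of one implicit gradient-descent update applied to the NMT output projection layer, which is why kNN-MT can be viewed as a restricted form of fine-tuning.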
