Pluggable Neural Machine Translation Models via Memory-augmented Adapters
arXiv:2307.06029 · 12 July 2023
Yuzhuang Xu, Shuo Wang, Peng Li, Xuebo Liu, Xiaolong Wang, Weidong Liu, Yang Liu

Papers citing "Pluggable Neural Machine Translation Models via Memory-augmented Adapters" (6 papers)

An Efficient Memory-Augmented Transformer for Knowledge-Intensive NLP Tasks
Yuxiang Wu, Yu Zhao, Baotian Hu, Pasquale Minervini, Pontus Stenetorp, Sebastian Riedel
Topics: RALM, KELM · 30 Oct 2022

Non-Parametric Unsupervised Domain Adaptation for Neural Machine Translation
Xin Zheng, Zhirui Zhang, Shujian Huang, Boxing Chen, Jun Xie, Weihua Luo, Jiajun Chen
14 Sep 2021

Efficient Nearest Neighbor Language Models
Junxian He, Graham Neubig, Taylor Berg-Kirkpatrick
Topics: RALM · 09 Sep 2021

The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives
Elena Voita, Rico Sennrich, Ivan Titov
03 Sep 2019

Language Models as Knowledge Bases?
Fabio Petroni, Tim Rocktäschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
Topics: KELM, AI4MH · 03 Sep 2019

Convolutional Neural Networks for Sentence Classification
Yoon Kim
Topics: AILaw, VLM · 25 Aug 2014