Training Deeper Neural Machine Translation Models with Transparent Attention (arXiv:1808.07561)

22 August 2018
Ankur Bapna
Mia Xu Chen
Orhan Firat
Yuan Cao
Yonghui Wu

Papers citing "Training Deeper Neural Machine Translation Models with Transparent Attention"

13 of 63 papers shown
Simple, Scalable Adaptation for Neural Machine Translation (EMNLP 2019)
Ankur Bapna, N. Arivazhagan, Orhan Firat
18 Sep 2019

Improving Deep Transformer with Depth-Scaled Initialization and Merged Attention (EMNLP 2019)
Biao Zhang, Ivan Titov, Rico Sennrich
29 Aug 2019

Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges
N. Arivazhagan, Ankur Bapna, Orhan Firat, Dmitry Lepikhin, Melvin Johnson, ..., George F. Foster, Colin Cherry, Wolfgang Macherey, Zhiwen Chen, Yonghui Wu
11 Jul 2019

Depth Growing for Neural Machine Translation (ACL 2019)
Lijun Wu, Yiren Wang, Ziheng Lu, Fei Tian, Fei Gao, Tao Qin, Jianhuang Lai, Tie-Yan Liu
03 Jul 2019

Widening the Representation Bottleneck in Neural Machine Translation with Lexical Shortcuts (WMT 2019)
Denis Emelin, Ivan Titov, Rico Sennrich
28 Jun 2019

Deep Modular Co-Attention Networks for Visual Question Answering (CVPR 2019)
Zhou Yu, Jun Yu, Yuhao Cui, Dacheng Tao, Q. Tian
25 Jun 2019

Robust Neural Machine Translation with Doubly Adversarial Inputs (ACL 2019)
Yong Cheng, Lu Jiang, Wolfgang Macherey
06 Jun 2019

Learning Deep Transformer Models for Machine Translation (ACL 2019)
Qiang Wang, Bei Li, Tong Xiao, Jingbo Zhu, Changliang Li, Yang Li, Lidia S. Chao
05 Jun 2019

Exploiting Sentential Context for Neural Machine Translation (ACL 2019)
Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi
04 Jun 2019

Joint Source-Target Self Attention with Locality Constraints
José A. R. Fonollosa, Noe Casas, Marta R. Costa-jussà
16 May 2019

Very Deep Self-Attention Networks for End-to-End Speech Recognition (Interspeech 2019)
Ngoc-Quan Pham, T. Nguyen, Jan Niehues, Markus Müller, Sebastian Stüker, A. Waibel
30 Apr 2019

Neutron: An Implementation of the Transformer Translation Model and its Variants
Hongfei Xu, Qiuhui Liu
18 Mar 2019

Attention in Natural Language Processing
Andrea Galassi, Marco Lippi, Paolo Torroni
04 Feb 2019