Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT
arXiv:2109.01396, 3 September 2021
Elena Voita, Rico Sennrich, Ivan Titov

Papers citing "Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT"

13 citing papers
Polysemy of Synthetic Neurons Towards a New Type of Explanatory Categorical Vector Spaces
Michael Pichat, William Pogrund, Paloma Pichat, Judicael Poumay, Armanouche Gasparian, Samuel Demarchi, Martin Corbet, Alois Georgeon, Michael Veillet-Guillem
30 Apr 2025

Intra-neuronal attention within language models: Relationships between activation and semantics
Michael Pichat, William Pogrund, Paloma Pichat, Armanouche Gasparian, Samuel Demarchi, Corbet Alois Georgeon, Michael Veillet-Guillem
17 Mar 2025

A kinetic-based regularization method for data science applications
Abhisek Ganguly, Alessandro Gabbana, Vybhav Rao, Sauro Succi, Santosh Ansumali
06 Mar 2025

How Do Artificial Intelligences Think? The Three Mathematico-Cognitive Factors of Categorical Segmentation Operated by Synthetic Neurons
Michael Pichat, William Pogrund, Armanush Gasparian, Paloma Pichat, Samuel Demarchi, Michael Veillet-Guillem
26 Dec 2024

Representations as Language: An Information-Theoretic Framework for Interpretability
Henry Conklin, Kenny Smith
04 Jun 2024

What Have We Achieved on Non-autoregressive Translation?
Yafu Li, Huajian Zhang, Jianhao Yan, Yongjing Yin, Yue Zhang
21 May 2024

DecoderLens: Layerwise Interpretation of Encoder-Decoder Transformers
Anna Langedijk, Hosein Mohebbi, Gabriele Sarti, Willem H. Zuidema, Jaap Jumelet
05 Oct 2023

Are Character-level Translations Worth the Wait? Comparing ByT5 and mT5 for Machine Translation
Transactions of the Association for Computational Linguistics (TACL), 2023
Lukas Edman, Gabriele Sarti, Antonio Toral, Gertjan van Noord, Arianna Bisazza
28 Feb 2023

Inseq: An Interpretability Toolkit for Sequence Generation Models
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Gabriele Sarti, Nils Feldhus, Ludwig Sickert, Oskar van der Wal, Malvina Nissim, Arianna Bisazza
27 Feb 2023

Learning a Formality-Aware Japanese Sentence Representation
Lin Zhang, Ray Lee, Jerry Chen, Kelly Marchisio
17 Jan 2023

Towards Opening the Black Box of Neural Machine Translation: Source and Target Interpretations of the Transformer
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Javier Ferrando, Gerard I. Gállego, Belen Alastruey, Carlos Escolano, Marta R. Costa-jussà
23 May 2022

Improving Neural Machine Translation by Denoising Training
Liang Ding, Keqin Peng, Dacheng Tao
19 Jan 2022

Can Multilinguality benefit Non-autoregressive Machine Translation?
Sweta Agrawal, Julia Kreutzer, Colin Cherry
16 Dec 2021