
Improving Unsupervised Word-by-Word Translation with Language Model and Denoising Autoencoder
Yunsu Kim, Jiahui Geng, Hermann Ney
arXiv:1901.01590, 6 January 2019

Papers citing "Improving Unsupervised Word-by-Word Translation with Language Model and Denoising Autoencoder"

10 / 10 papers shown
Mitigating Data Imbalance and Representation Degeneration in Multilingual Machine Translation
Wen Lai, Alexandra Chronopoulou, Alexander Fraser
22 May 2023

PEACH: Pre-Training Sequence-to-Sequence Multilingual Models for Translation with Semi-Supervised Pseudo-Parallel Document Generation
Alireza Salemi, Amirhossein Abaskohi, Sara Tavakoli, Yadollah Yaghoobzadeh, A. Shakery
03 Apr 2023 (AIMat)

Is Encoder-Decoder Redundant for Neural Machine Translation?
Yingbo Gao, Christian Herold, Zijian Yang, Hermann Ney
21 Oct 2022

Sub-Word Alignment Is Still Useful: A Vest-Pocket Method for Enhancing Low-Resource Machine Translation
Minhan Xu, Yu Hong
09 May 2022

When and Why is Unsupervised Neural Machine Translation Useless?
Yunsu Kim, Miguel Graça, Hermann Ney
22 Apr 2020 (SSL)

A Study of Cross-Lingual Ability and Language-specific Information in Multilingual BERT
Chi-Liang Liu, Tsung-Yuan Hsu, Yung-Sung Chuang, Hung-yi Lee
20 Apr 2020

Denoising based Sequence-to-Sequence Pre-training for Text Generation
Liang Wang, Wei Zhao, Ruoyu Jia, Sujian Li, Jingming Liu
22 Aug 2019 (VLM, AI4CE)

Effective Cross-lingual Transfer of Neural Machine Translation Models without Shared Vocabularies
Yunsu Kim, Yingbo Gao, Hermann Ney
14 May 2019 (VLM)

Word Translation Without Parallel Data
Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou
11 Oct 2017

Six Challenges for Neural Machine Translation
Philipp Koehn, Rebecca Knowles
12 Jun 2017 (AAML, AIMat)