Divide and Rule: Effective Pre-Training for Context-Aware Multi-Encoder Translation Models
arXiv:2103.17151
31 March 2021
Lorenzo Lupo, Marco Dinarelli, Laurent Besacier
Papers citing "Divide and Rule: Effective Pre-Training for Context-Aware Multi-Encoder Translation Models" (7 of 7 papers shown):

1. Investigating Length Issues in Document-level Machine Translation — Ziqian Peng, Rachel Bawden, François Yvon (23 Dec 2024)
2. Importance-Aware Data Augmentation for Document-Level Neural Machine Translation — Ming-Ru Wu, Yufei Wang, George F. Foster, Lizhen Qu, Gholamreza Haffari (27 Jan 2024)
3. End-to-End Single-Channel Speaker-Turn Aware Conversational Speech Translation — Juan Pablo Zuluaga, Zhaocheng Huang, Xing Niu, Rohit Paturi, S. Srinivasan, Prashant Mathur, Brian Thompson, Marcello Federico (01 Nov 2023)
4. Challenges in Context-Aware Neural Machine Translation — Linghao Jin, Jacqueline He, Jonathan May, Xuezhe Ma (23 May 2023)
5. Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation — Lorenzo Lupo, Marco Dinarelli, Laurent Besacier (13 Feb 2023)
6. Focused Concatenation for Context-Aware Neural Machine Translation — Lorenzo Lupo, Marco Dinarelli, Laurent Besacier (24 Oct 2022)
7. Diving Deep into Context-Aware Neural Machine Translation — Jingjing Huo, Christian Herold, Yingbo Gao, Leonard Dahlmann, Shahram Khadivi, Hermann Ney (19 Oct 2020)