Simultaneous paraphrasing and translation by fine-tuning Transformer models

12 May 2020
Rakesh Chada
arXiv: 2005.05570

Papers citing "Simultaneous paraphrasing and translation by fine-tuning Transformer models"

2 papers
Monotonic Simultaneous Translation with Chunk-wise Reordering and Refinement
HyoJung Han, Seokchan Ahn, Yoonjung Choi, Insoo Chung, Sangha Kim, Dong Wang
18 Oct 2021
Chatbot Interaction with Artificial Intelligence: Human Data Augmentation with T5 and Language Transformer Ensemble for Text Classification
Jordan J. Bird, Anikó Ekárt, Diego Resende Faria
12 Oct 2020