Don't Take It Literally: An Edit-Invariant Sequence Loss for Text Generation
arXiv 2106.15078 · 29 June 2021
Guangyi Liu, Zichao Yang, Tianhua Tao, Xiaodan Liang, Junwei Bao, Zhen Li, Bowen Zhou, Shuguang Cui, Zhiting Hu
Papers citing "Don't Take It Literally: An Edit-Invariant Sequence Loss for Text Generation" (7 papers)
Multi-Granularity Optimization for Non-Autoregressive Translation
Yafu Li, Leyang Cui, Yongjing Yin, Yue Zhang
20 Oct 2022

Composable Text Controls in Latent Space with ODEs
Guangyi Liu, Zeyu Feng, Yuan Gao, Zichao Yang, Xiaodan Liang, Junwei Bao, Xiaodong He, Shuguang Cui, Zhen Li, Zhiting Hu
01 Aug 2022

Gradient-Based Constrained Sampling from Language Models
Sachin Kumar, Biswajit Paria, Yulia Tsvetkov
25 May 2022

COLD Decoding: Energy-based Constrained Text Generation with Langevin Dynamics
Lianhui Qin, Sean Welleck, Daniel Khashabi, Yejin Choi
23 Feb 2022

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
26 Sep 2016

Deep Reinforcement Learning for Dialogue Generation
Jiwei Li, Will Monroe, Alan Ritter, Michel Galley, Jianfeng Gao, Dan Jurafsky
05 Jun 2016

Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning
17 Aug 2015