arXiv: 2212.13005
TextBox 2.0: A Text Generation Library with Pre-trained Language Models
26 December 2022
Tianyi Tang, Junyi Li, Z. Chen, Yiwen Hu, Zhuohao Yu, Wen-Dao Dai, Zican Dong, Xiaoxue Cheng, Yuhao Wang, Wayne Xin Zhao, J. Nie, Ji-Rong Wen
Papers citing "TextBox 2.0: A Text Generation Library with Pre-trained Language Models"
Training Dynamics for Text Summarization Models
Tanya Goyal, Jiacheng Xu, J. Li, Greg Durrett
15 Oct 2021
CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
Yunfan Shao, Zhichao Geng, Yitao Liu, Junqi Dai, Hang Yan, Fei Yang, Li Zhe, Hujun Bao, Xipeng Qiu
13 Sep 2021
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer
Huiyuan Lai, Antonio Toral, Malvina Nissim
14 May 2021
The GEM Benchmark: Natural Language Generation, its Evaluation and Metrics
Sebastian Gehrmann, Tosin P. Adewumi, Karmanya Aggarwal, Pawan Sasanka Ammanamanchi, Aremu Anuoluwapo, ..., Nishant Subramani, Wei-ping Xu, Diyi Yang, Akhila Yerukola, Jiawei Zhou
02 Feb 2021
OpenNMT: Neural Machine Translation Toolkit
Guillaume Klein, Yoon Kim, Yuntian Deng, Vincent Nguyen, Jean Senellart, Alexander M. Rush
28 May 2018