s2s-ft: Fine-Tuning Pretrained Transformer Encoders for Sequence-to-Sequence Learning
26 October 2021 (arXiv: 2110.13640)
Hangbo Bao, Li Dong, Wenhui Wang, Nan Yang, Furu Wei

Papers citing "s2s-ft: Fine-Tuning Pretrained Transformer Encoders for Sequence-to-Sequence Learning" (5 of 5 papers shown):

GanLM: Encoder-Decoder Pre-training with an Auxiliary Discriminator
Jian Yang, Shuming Ma, Li Dong, Shaohan Huang, Haoyang Huang, Yuwei Yin, Dongdong Zhang, Liqun Yang, Furu Wei, Zhoujun Li
20 Dec 2022

Image as a Foreign Language: BEiT Pretraining for All Vision and Vision-Language Tasks
Wenhui Wang, Hangbo Bao, Li Dong, Johan Bjorck, Zhiliang Peng, ..., Kriti Aggarwal, O. Mohammed, Saksham Singhal, Subhojit Som, Furu Wei
22 Aug 2022

Exploiting Global and Local Hierarchies for Hierarchical Text Classification
Ting Jiang, Deqing Wang, Leilei Sun, Zhongzhi Chen, Fuzhen Zhuang, Qinghong Yang
05 May 2022

Enhance Incomplete Utterance Restoration by Joint Learning Token Extraction and Text Generation
Shumpei Inoue, Tsun-Jui Liu, Nguyen Hong Son, Minh Le Nguyen
08 Apr 2022

Text Summarization with Pretrained Encoders
Yang Liu, Mirella Lapata
22 Aug 2019