Improving BERT Fine-Tuning via Self-Ensemble and Self-Distillation
arXiv: 2002.10345 · 24 February 2020
Yige Xu, Xipeng Qiu, L. Zhou, Xuanjing Huang
Papers citing "Improving BERT Fine-Tuning via Self-Ensemble and Self-Distillation" (6 papers shown)
Multi-CLS BERT: An Efficient Alternative to Traditional Ensembling
Haw-Shiuan Chang, Ruei-Yao Sun, Kathryn Ricci, Andrew McCallum
10 Oct 2022
Ensemble Transformer for Efficient and Accurate Ranking Tasks: an Application to Question Answering Systems
Yoshitomo Matsubara, Luca Soldaini, Eric Lind, Alessandro Moschitti
15 Jan 2022
Neighborhood Consensus Contrastive Learning for Backward-Compatible Representation
Shengsen Wu, Liang Chen, Yihang Lou, Yan Bai, Tao Bai, Minghua Deng, Ling-yu Duan
07 Aug 2021
Linking Common Vulnerabilities and Exposures to the MITRE ATT&CK Framework: A Self-Distillation Approach
Benjamin Ampel, Sagar Samtani, Steven Ullman, Hsinchun Chen
03 Aug 2021
Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
LM&MA, VLM
18 Mar 2020
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM
20 Apr 2018