Controlled Randomness Improves the Performance of Transformer Models
Tobias Deußer, Cong Zhao, Wolfgang Krämer, David Leonhard, Christian Bauckhage, R. Sifa
arXiv:2310.13526, 20 October 2023
Papers citing "Controlled Randomness Improves the Performance of Transformer Models" (6 of 6 shown):
- An Analysis of Abstractive Text Summarization Using Pre-trained Models. Tohida Rehman, S. Das, Debarshi Kumar Sanyal, S. Chattopadhyay. 25 Feb 2023.
- Packed Levitated Marker for Entity and Relation Extraction. Deming Ye, Yankai Lin, Peng Li, Maosong Sun. 13 Sep 2021.
- Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models. Cheolhyoung Lee, Kyunghyun Cho, Wanmo Kang. 25 Sep 2019.
- Text Summarization with Pretrained Encoders. Yang Liu, Mirella Lapata. 22 Aug 2019.
- Neural Legal Judgment Prediction in English. Ilias Chalkidis, Ion Androutsopoulos, Nikolaos Aletras. 05 Jun 2019.
- Improving neural networks by preventing co-adaptation of feature detectors. Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov. 03 Jul 2012.