ResearchTrend.AI

Controlled Randomness Improves the Performance of Transformer Models
arXiv:2310.13526 · Cited By

20 October 2023
Tobias Deußer
Cong Zhao
Wolfgang Krämer
David Leonhard
Christian Bauckhage
R. Sifa
ArXiv · PDF · HTML

Papers citing "Controlled Randomness Improves the Performance of Transformer Models"

6 papers shown
An Analysis of Abstractive Text Summarization Using Pre-trained Models
Tohida Rehman, S. Das, Debarshi Kumar Sanyal, S. Chattopadhyay
25 Feb 2023
Packed Levitated Marker for Entity and Relation Extraction
Deming Ye, Yankai Lin, Peng Li, Maosong Sun
13 Sep 2021
Mixout: Effective Regularization to Finetune Large-scale Pretrained Language Models
Cheolhyoung Lee, Kyunghyun Cho, Wanmo Kang
MoE
25 Sep 2019
Text Summarization with Pretrained Encoders
Yang Liu, Mirella Lapata
MILM
22 Aug 2019
Neural Legal Judgment Prediction in English
Ilias Chalkidis, Ion Androutsopoulos, Nikolaos Aletras
AILaw, ELM
05 Jun 2019
Improving neural networks by preventing co-adaptation of feature detectors
Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
VLM
03 Jul 2012