Improving Transformer Performance for French Clinical Notes Classification Using Mixture of Experts on a Limited Dataset
Thanh-Dung Le, P. Jouvet, R. Noumeir
arXiv:2303.12892 · 22 March 2023 · Topics: MoE, MedIm
Papers citing "Improving Transformer Performance for French Clinical Notes Classification Using Mixture of Experts on a Limited Dataset" (5 papers)
1. The Impact of LoRA Adapters for LLMs on Clinical NLP Classification Under Data Limitations
   Thanh-Dung Le, T. Nguyen, Vu Nguyen Ha · 27 Jul 2024

2. Generalization Error Analysis for Sparse Mixture-of-Experts: A Preliminary Study
   Jinze Zhao, Peihao Wang, Zhangyang Wang · MoE · 26 Mar 2024

3. Towards an empirical understanding of MoE design choices
   Dongyang Fan, Bettina Messmer, Martin Jaggi · 20 Feb 2024

4. Boosting Transformer's Robustness and Efficacy in PPG Signal Artifact Detection with Self-Supervised Learning
   Thanh-Dung Le · 02 Jan 2024

5. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang · ODL · 15 Sep 2016