ResearchTrend.AI

arXiv:2303.12892
Improving Transformer Performance for French Clinical Notes Classification Using Mixture of Experts on a Limited Dataset

22 March 2023
Thanh-Dung Le, P. Jouvet, R. Noumeir
Topics: MoE, MedIm

Papers citing "Improving Transformer Performance for French Clinical Notes Classification Using Mixture of Experts on a Limited Dataset" (5 / 5 papers shown)

  1. The Impact of LoRA Adapters for LLMs on Clinical NLP Classification Under Data Limitations
     Thanh-Dung Le, T. Nguyen, Vu Nguyen Ha (27 Jul 2024)
  2. Generalization Error Analysis for Sparse Mixture-of-Experts: A Preliminary Study [MoE]
     Jinze Zhao, Peihao Wang, Zhangyang Wang (26 Mar 2024)
  3. Towards an empirical understanding of MoE design choices
     Dongyang Fan, Bettina Messmer, Martin Jaggi (20 Feb 2024)
  4. Boosting Transformer's Robustness and Efficacy in PPG Signal Artifact Detection with Self-Supervised Learning
     Thanh-Dung Le (02 Jan 2024)
  5. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima [ODL]
     N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang (15 Sep 2016)