MoEC: Mixture of Expert Clusters
arXiv 2207.09094 · 19 July 2022
Authors: Yuan Xie, Shaohan Huang, Tianyu Chen, Furu Wei
Topic: MoE
Papers citing "MoEC: Mixture of Expert Clusters" (3 of 3 shown)
Title | Authors | Topic | Metrics | Published
----- | ------- | ----- | ------- | ---------
HDMoLE: Mixture of LoRA Experts with Hierarchical Routing and Dynamic Thresholds for Fine-Tuning LLM-based ASR Models | Bingshen Mu, Kun Wei, Qijie Shao, Yong Xu, Lei Xie | MoE | 29 · 1 · 0 | 30 Sep 2024
Tricks for Training Sparse Translation Models | Dheeru Dua, Shruti Bhosale, Vedanuj Goswami, James Cross, M. Lewis, Angela Fan | MoE | 134 · 18 · 0 | 15 Oct 2021
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding | Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman | ELM | 294 · 6,927 · 0 | 20 Apr 2018