Flexible and Effective Mixing of Large Language Models into a Mixture of Domain Experts
30 August 2024
Rhui Dih Lee, L. Wynter, R. Ganti
Tags: MoE
arXiv: 2408.17280
Papers citing "Flexible and Effective Mixing of Large Language Models into a Mixture of Domain Experts"
1 / 1 papers shown
Self-MoE: Towards Compositional Large Language Models with Self-Specialized Experts
Junmo Kang, Leonid Karlinsky, Hongyin Luo, Zhen Wang, Jacob A. Hansen, James Glass, David D. Cox, Rameswar Panda, Rogerio Feris, Alan Ritter
Tags: MoMe, MoE
17 Jun 2024