CAMEx: Curvature-aware Merging of Experts

International Conference on Learning Representations (ICLR), 2025
26 February 2025
Dung V. Nguyen
Minh H. Nguyen
Luc Q. Nguyen
R. Teo
T. Nguyen
Linh Duy Tran
MoMe
arXiv (abs) · PDF · HTML · GitHub (24★)

Papers citing "CAMEx: Curvature-aware Merging of Experts"

5 / 5 papers shown
Expert Merging in Sparse Mixture of Experts with Nash Bargaining
Dung V. Nguyen
Anh T. Nguyen
Minh H. Nguyen
Luc Q. Nguyen
Shiqi Jiang
Ethan Fetaya
Linh Duy Tran
Gal Chechik
T. Nguyen
MoMe
17 Oct 2025
Model Selection for Gaussian-gated Gaussian Mixture of Experts Using Dendrograms of Mixing Measures
Tuan Thai
TrungTin Nguyen
Dat Do
Nhat Ho
Christopher Drovandi
19 May 2025
MoLEx: Mixture of Layer Experts for Finetuning with Sparse Upcycling
International Conference on Learning Representations (ICLR), 2025
R. Teo
T. Nguyen
MoE
14 Mar 2025
Tight Clusters Make Specialized Experts
International Conference on Learning Representations (ICLR), 2025
Stefan K. Nielsen
R. Teo
Laziz U. Abdullaev
Tan M. Nguyen
MoE
21 Feb 2025
LIBMoE: A Library for comprehensive benchmarking Mixture of Experts in Large Language Models
Nam V. Nguyen
Thong T. Doan
Luong Tran
Van Nguyen
Quang Pham
MoE
01 Nov 2024