The Race to Efficiency: A New Perspective on AI Scaling Laws
Chien-Ping Lu
arXiv:2501.02156 · 4 January 2025
Papers citing "The Race to Efficiency: A New Perspective on AI Scaling Laws" (1 paper)
MoEQuant: Enhancing Quantization for Mixture-of-Experts Large Language Models via Expert-Balanced Sampling and Affinity Guidance
Xing Hu, Zhixuan Chen, Dawei Yang, Zukang Xu, Chen Xu, Zhihang Yuan, Sifan Zhou, Jiangyong Yu
Topics: MoE, MQ · 02 May 2025