
MoPEQ: Mixture of Mixed Precision Quantized Experts

2 September 2025
Krishna Teja Chitty-Venkata
Jie Ye
M. Emani
Links: arXiv (abs) · PDF · HTML · GitHub (612★)

Papers citing "MoPEQ: Mixture of Mixed Precision Quantized Experts"

1 of 1 citing papers shown:

SignRoundV2: Closing the Performance Gap in Extremely Low-Bit Post-Training Quantization for LLMs
Wenhua Cheng
Weiwei Zhang
Heng Guo
Haihao Shen
04 Dec 2025