arXiv: 2508.02343
MicroMix: Efficient Mixed-Precision Quantization with Microscaling Formats for Large Language Models

4 August 2025
Wenyuan Liu, Haoqian Meng, Yilun Luo, Peng Zhang, Xindian Ma
MQ
Links: arXiv (abs) · PDF · HTML · GitHub (9★)

Papers citing "MicroMix: Efficient Mixed-Precision Quantization with Microscaling Formats for Large Language Models"

2 / 2 papers shown
SignRoundV2: Closing the Performance Gap in Extremely Low-Bit Post-Training Quantization for LLMs
Wenhua Cheng, Weiwei Zhang, Heng Guo, Haihao Shen
MQ · 04 Dec 2025
Mixed-Precision Quantization for Language Models: Techniques and Prospects
M. Rakka, Marios Fournarakis, Olga Krestinskaya, Jinane Bazzi, K. Salama, Fadi J. Kurdahi, A. Eltawil, M. Fouda
MQ · 19 Oct 2025