
Recipes for Pre-training LLMs with MXFP8

30 May 2025
Asit K. Mishra, Dusan Stosic, Simon Layton, Paulius Micikevicius
MQ
arXiv: 2506.08027

Papers citing "Recipes for Pre-training LLMs with MXFP8"

2 citing papers
CafeQ: Calibration-free Quantization via Learned Transformations and Adaptive Rounding
Ziteng Sun, Adrian Benton, Samuel Kushnir, Asher Trockman, Vikas Singh, Suhas Diggavi, A. Suresh
MQ · 24 Nov 2025
Elucidating the Design Space of FP4 training
Robert Hu, Carlo Luschi, Paul Balanca
MQ · 22 Sep 2025