Towards Fully FP8 GEMM LLM Training at Scale

26 May 2025
Alejandro Hernández Cano, Dhia Garbaya, Imanol Schlag, Martin Jaggi
ArXiv (abs) · PDF · HTML · GitHub

Papers citing "Towards Fully FP8 GEMM LLM Training at Scale"

2 papers
TWEO: Transformers Without Extreme Outliers Enables FP8 Training And Quantization For Dummies
Guang Liang, Jie Shao, Ningyuan Tang, Xinyao Liu, Jianxin Wu
28 Nov 2025

Can Performant LLMs Be Ethical? Quantifying the Impact of Web Crawling Opt-Outs
Dongyang Fan, Vinko Sabolčec, Matin Ansaripour, Ayush Kumar Tarun, Martin Jaggi, Antoine Bosselut, Imanol Schlag
08 Apr 2025