EfQAT: An Efficient Framework for Quantization-Aware Training

17 November 2024
Saleh Ashkboos, Bram-Ernst Verhoef, Torsten Hoefler, E. Eleftheriou, M. Dazzi
Tags: MQ
Links: arXiv (abs) · PDF · HTML
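Since the listing carries no abstract, the sketch below illustrates the fake-quantization round-trip that quantization-aware training generally builds on: weights are mapped to low-precision integers and back to floats so the training loss sees quantization error. This is a generic illustration with an assumed function name and per-tensor symmetric scaling, not EfQAT's specific scheme.

```python
# Minimal fake-quantization round-trip used in quantization-aware training (QAT).
# Generic illustration only; EfQAT's actual quantizer may differ.

def fake_quantize(weights, bits=8):
    """Symmetric uniform quantization: float -> int -> float."""
    qmax = 2 ** (bits - 1) - 1                # e.g. 127 for 8-bit signed
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / qmax                    # per-tensor scale factor
    # round-to-nearest, clamp to the representable signed range
    q = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return [qi * scale for qi in q]           # dequantize back to float

w = [0.51, -1.27, 0.003, 0.9]
wq = fake_quantize(w, bits=8)                 # each element within half a step of w
```

In a QAT loop, the forward pass would use `wq` while gradients update the underlying float weights, typically via a straight-through estimator.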

Papers citing "EfQAT: An Efficient Framework for Quantization-Aware Training"

3 / 3 papers shown
Energy-Efficient Domain-Specific Artificial Intelligence Models and Agents: Pathways and Paradigms
Abhijit Chatterjee, N. Jha, Jonathan D. Cohen, Thomas Griffiths, Hongjing Lu, Diana Marculescu, Ashiqur Rasul, Keshab K. Parhi
Tags: LLMAG, AI4CE · 244 / 0 / 0 · 24 Oct 2025

Layer-wise Quantization for Quantized Optimistic Dual Averaging
Anh Duc Nguyen, Ilia Markov, Frank Zhengqing Wu, Ali Ramezani-Kebrya, Kimon Antonakopoulos, Dan Alistarh, Volkan Cevher
Tags: MQ · 191 / 1 / 0 · 20 May 2025

Automatic mixed precision for optimizing gained time with constrained loss mean-squared-error based on model partition to sequential sub-graphs
Shmulik Markovich-Golan, Daniel Ohayon, Itay Niv, Yair Hanani
Tags: MQ · 264 / 0 / 0 · 19 May 2025