SQuAT: Sharpness- and Quantization-Aware Training for BERT

13 October 2022
Zheng Wang, Juncheng Billy Li, Shuhui Qu, Florian Metze, Emma Strubell
Topics: MQ
arXiv: 2210.07171 (abs / PDF / HTML)

Papers citing "SQuAT: Sharpness- and Quantization-Aware Training for BERT"

GSQ-Tuning: Group-Shared Exponents Integer in Fully Quantized Training for LLMs On-Device Fine-tuning
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
Sifan Zhou, Shuo Wang, Zhihang Yuan, Mingjia Shi, Yuzhang Shang, Dawei Yang
Topics: MQ, ALM
18 Feb 2025