ResearchTrend.AI

Mixed Precision Quantization of Transformer Language Models for Speech Recognition

29 November 2021
Authors: Junhao Xu, Shoukang Hu, Jianwei Yu, Xunying Liu, Helen M. Meng
Topic: MQ

Papers citing "Mixed Precision Quantization of Transformer Language Models for Speech Recognition"

2 / 2 papers shown
Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
Topic: MQ · 10 Feb 2017
A Decomposable Attention Model for Natural Language Inference
Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit
06 Jun 2016