ResearchTrend.AI
CAT: Compression-Aware Training for bandwidth reduction

25 September 2019
Chaim Baskin, Brian Chmiel, Evgenii Zheltonozhskii, Ron Banner, A. Bronstein, A. Mendelson
MQ

Papers citing "CAT: Compression-Aware Training for bandwidth reduction"

4 / 4 papers shown

Rotation Invariant Quantization for Model Compression
Dor-Joseph Kampeas, Yury Nahshan, Hanoch Kremer, Gil Lederman, Shira Zaloshinski, Zheng Li, E. Haleva
MQ · 03 Mar 2023

Bimodal Distributed Binarized Neural Networks
T. Rozen, Moshe Kimhi, Brian Chmiel, A. Mendelson, Chaim Baskin
MQ · 05 Apr 2022

Hybrid and Non-Uniform quantization methods using retro synthesis data for efficient inference
Gvsl Tej Pratap, R. Kumar
MQ · 26 Dec 2020

ZeroQ: A Novel Zero Shot Quantization Framework
Yaohui Cai, Z. Yao, Zhen Dong, A. Gholami, Michael W. Mahoney, Kurt Keutzer
MQ · 01 Jan 2020