ResearchTrend.AI

Understanding Neural Network Binarization with Forward and Backward Proximal Quantizers (arXiv:2402.17710)

27 February 2024
Yiwei Lu, Yaoliang Yu, Xinlin Li, Vahid Partovi Nia

Papers citing "Understanding Neural Network Binarization with Forward and Backward Proximal Quantizers"

2 papers shown
Combiner: Full Attention Transformer with Sparse Computation Cost
Hongyu Ren, H. Dai, Zihang Dai, Mengjiao Yang, J. Leskovec, Dale Schuurmans, Bo Dai
12 Jul 2021
Forward and Backward Information Retention for Accurate Binary Neural Networks
Haotong Qin, Ruihao Gong, Xianglong Liu, Mingzhu Shen, Ziran Wei, F. Yu, Jingkuan Song
24 Sep 2019