arXiv:2004.04729
Dithered backprop: A sparse and quantized backpropagation algorithm for more efficient deep neural network training
Simon Wiedemann, Temesgen Mehari, Kevin Kepp, Wojciech Samek
9 April 2020
Papers citing "Dithered backprop: A sparse and quantized backpropagation algorithm for more efficient deep neural network training" (6 of 6 papers shown)
Sparse is Enough in Fine-tuning Pre-trained Large Language Models
Weixi Song, Z. Li, Lefei Zhang, Hai Zhao, Bo Du (VLM). 19 Dec 2023.
Meta-Learning with a Geometry-Adaptive Preconditioner
Suhyun Kang, Duhun Hwang, Moonjung Eo, Taesup Kim, Wonjong Rhee (AI4CE). 04 Apr 2023.
AskewSGD: An Annealed Interval-Constrained Optimisation Method to Train Quantized Neural Networks
Louis Leconte, S. Schechtman, Eric Moulines. 07 Nov 2022.
Accurate Neural Training with 4-bit Matrix Multiplications at Standard Formats
Brian Chmiel, Ron Banner, Elad Hoffer, Hilla Ben Yaacov, Daniel Soudry (MQ). 19 Dec 2021.
No Frame Left Behind: Full Video Action Recognition
X. Liu, S. Pintea, F. Karimi Nejadasl, O. Booij, J. C. V. Gemert. 29 Mar 2021.
DeepCABAC: A Universal Compression Algorithm for Deep Neural Networks
Simon Wiedemann, H. Kirchhoffer, Stefan Matlage, Paul Haase, Arturo Marbán, ..., Ahmed Osman, D. Marpe, H. Schwarz, Thomas Wiegand, Wojciech Samek. 27 Jul 2019.