Full deep neural network training on a pruned weight budget
Maximilian Golub, G. Lemieux, Mieszko Lis
arXiv:1806.06949 · 11 June 2018

Papers citing "Full deep neural network training on a pruned weight budget"

6 / 6 papers shown

1. Sparsified Model Zoo Twins: Investigating Populations of Sparsified Neural Network Models
   D. Honegger, Konstantin Schurholt, Damian Borth
   26 Apr 2023 · 20 / 4 / 0

2. Competitive plasticity to reduce the energetic costs of learning
   Mark C. W. van Rossum
   04 Apr 2023 · 8 / 2 / 0

3. Dynamic Neural Network Architectural and Topological Adaptation and Related Methods -- A Survey
   Lorenz Kummer · AI4CE
   28 Jul 2021 · 32 / 0 / 0

4. Pruning Algorithms to Accelerate Convolutional Neural Networks for Edge Applications: A Survey
   Jiayi Liu, S. Tripathi, Unmesh Kurup, Mohak Shah · 3DPC, MedIm
   08 May 2020 · 17 / 52 / 0

5. Sparse Weight Activation Training
   Md Aamir Raihan, Tor M. Aamodt
   07 Jan 2020 · 32 / 72 / 0

6. Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
   Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen · MQ
   10 Feb 2017 · 311 / 1,047 / 0