
Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks (arXiv:2102.00554)

31 January 2021
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste

Papers citing "Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks" (3 papers shown)
Switch-Based Multi-Part Neural Network
Surajit Majumder, Paritosh Ranjan, Prodip Roy, Bhuban Padhan
25 Apr 2025

Periodic Online Testing for Sparse Systolic Tensor Arrays
C. Peltekis, Chrysostomos Nicopoulos, G. Dimitrakopoulos
25 Apr 2025

Always-Sparse Training by Growing Connections with Guided Stochastic Exploration
Mike Heddes, Narayan Srinivasa, T. Givargis, Alexandru Nicolau
12 Jan 2024