ResearchTrend.AI
arXiv: 2311.07625
Activity Sparsity Complements Weight Sparsity for Efficient RNN Inference

13 November 2023
Rishav Mukherji, Mark Schöne, Khaleelulla Khan Nazeer, Christian Mayr, Anand Subramoney

Papers citing "Activity Sparsity Complements Weight Sparsity for Efficient RNN Inference"

5 of 5 papers shown.
Dual sparse training framework: inducing activation map sparsity via Transformed $\ell_1$ regularization
Xiaolong Yu, Cong Tian
30 May 2024
Resurrecting Recurrent Neural Networks for Long Sequences
Antonio Orvieto, Samuel L. Smith, Albert Gu, Anushan Fernando, Çağlar Gülçehre, Razvan Pascanu, Soham De
11 Mar 2023
Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
31 Jan 2021
Long short-term memory and learning-to-learn in networks of spiking neurons
G. Bellec, Darjan Salaj, Anand Subramoney, R. Legenstein, Wolfgang Maass
26 Mar 2018
Delta Networks for Optimized Recurrent Network Computation
Daniel Neil, Junhaeng Lee, T. Delbruck, Shih-Chii Liu
16 Dec 2016