Exploiting Activation based Gradient Output Sparsity to Accelerate Backpropagation in CNNs

16 September 2021
Anup Sarma
Sonali Singh
Huaipan Jiang
Ashutosh Pattnaik
Asit K. Mishra
N. Vijaykrishnan
M. Kandemir
Chita R. Das
arXiv: 2109.07710 (abs · PDF · HTML · GitHub)

Papers citing "Exploiting Activation based Gradient Output Sparsity to Accelerate Backpropagation in CNNs"

No citing papers found.