A Construction Kit for Efficient Low Power Neural Network Accelerator Designs

24 June 2021
Petar Jokic, E. Azarkhish, Andrea Bonetti, M. Pons, S. Emery, Luca Benini
arXiv:2106.12810

Papers citing "A Construction Kit for Efficient Low Power Neural Network Accelerator Designs"

3 of 3 citing papers shown
Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
31 Jan 2021
Benchmarking TinyML Systems: Challenges and Direction
Colby R. Banbury, Vijay Janapa Reddi, Max Lam, William Fu, A. Fazel, ..., Jae-sun Seo, Jeff Sieracki, Urmish Thakker, Marian Verhelst, Poonam Yadav
10 Mar 2020
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017