ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

RAPIDNN: In-Memory Deep Neural Network Acceleration Framework
arXiv:1806.05794

15 June 2018
Mohsen Imani, Mohammad Samragh, Yeseong Kim, Saransh Gupta, F. Koushanfar, Tajana Simunic

Papers citing "RAPIDNN: In-Memory Deep Neural Network Acceleration Framework"

3 papers shown
LUT-DLA: Lookup Table as Efficient Extreme Low-Bit Deep Learning Accelerator
Guoyu Li, Shengyu Ye, C. L. P. Chen, Yang Wang, Fan Yang, Ting Cao, Cheng Liu, Mohamed M. Sabry, Mao Yang
18 Jan 2025
FORMS: Fine-grained Polarized ReRAM-based In-situ Computation for Mixed-signal DNN Accelerator
Geng Yuan, Payman Behnam, Zhengang Li, Ali Shafiee, Sheng Lin, ..., Hang Liu, Xuehai Qian, M. N. Bojnordi, Yanzhi Wang, Caiwen Ding
16 Jun 2021
3D-aCortex: An Ultra-Compact Energy-Efficient Neurocomputing Platform Based on Commercial 3D-NAND Flash Memories
Mohammad Bavandpour, Shubham Sahay, M. Mahmoodi, D. Strukov
07 Aug 2019