ResearchTrend.AI

Efficient Accelerator for Dilated and Transposed Convolution with Decomposition
arXiv:2205.02103 · 2 May 2022
Kuo-Wei Chang
Tian-Sheuan Chang

Papers citing "Efficient Accelerator for Dilated and Transposed Convolution with Decomposition"

3 of 3 citing papers shown.

A 1.6-mW Sparse Deep Learning Accelerator for Speech Separation
Chih-Chyau Yang, Tian-Sheuan Chang
15 Dec 2023

Reduce Computational Complexity for Convolutional Layers by Skipping Zeros
Zhiyi Zhang, Pengfei Zhang, Zhuopin Xu, Qi Wang
28 Jun 2023

ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation
Adam Paszke, Abhishek Chaurasia, Sangpil Kim, Eugenio Culurciello
SSeg · 07 Jun 2016