ResearchTrend.AI
Diagonalwise Refactorization: An Efficient Training Method for Depthwise Convolutions

27 March 2018
Zheng Qin, Zhaoning Zhang, Dongsheng Li, Yiming Zhang, Yuxing Peng

Papers citing "Diagonalwise Refactorization: An Efficient Training Method for Depthwise Convolutions"

4 of 4 papers shown
Efficiency is Not Enough: A Critical Perspective of Environmentally Sustainable AI
Dustin Wright, Christian Igel, Gabrielle Samuel, Raghavendra Selvan
05 Sep 2023
RevBiFPN: The Fully Reversible Bidirectional Feature Pyramid Network
Vitaliy Chiley, Vithursan Thangarasa, Abhay Gupta, Anshul Samar, Joel Hestness, D. DeCoste
28 Jun 2022
ShortcutFusion: From Tensorflow to FPGA-based accelerator with reuse-aware memory allocation for shortcut data
Duy-Thanh Nguyen, Hyeonseung Je, Tuan Nghia Nguyen, Soojung Ryu, Kyujoong Lee, Hyuk-Jae Lee
15 Jun 2021
Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
10 Feb 2017