ResearchTrend.AI
Fast Walsh-Hadamard Transform and Smooth-Thresholding Based Binary Layers in Deep Neural Networks
arXiv:2104.07085 (v4, latest) · 14 April 2021
Hongyi Pan, Diaa Badawi, Ahmet Enis Cetin

Papers citing "Fast Walsh-Hadamard Transform and Smooth-Thresholding Based Binary Layers in Deep Neural Networks"

11 of 11 papers shown
HTMA-Net: Towards Multiplication-Avoiding Neural Networks via Hadamard Transform and In-Memory Computing
Emadeldeen Hamdan, Ahmet Enis Cetin
27 Sep 2025 · 227 · 1 · 0
Containing Analog Data Deluge at Edge through Frequency-Domain Compression in Collaborative Compute-in-Memory Networks
Nastaran Darabi, A. R. Trivedi
20 Sep 2023 · 195 · 0 · 0
Optimization Guarantees of Unfolded ISTA and ADMM Networks With Smooth Soft-Thresholding (IEEE Transactions on Signal Processing, 2023)
Shaik Basheeruddin Shah, Pradyumna Pradhan, Wei Pu, Ramunaidu Randhi, Miguel R. D. Rodrigues, Yonina C. Eldar
12 Sep 2023 · 375 · 17 · 0
ADC/DAC-Free Analog Acceleration of Deep Neural Networks with Frequency Transformation
Nastaran Darabi, Maeesha Binte Hashem, Hongyi Pan, Ahmet Cetin, Wilfred Gomes, A. R. Trivedi
04 Sep 2023 · 179 · 6 · 0
A Hybrid Quantum-Classical Approach based on the Hadamard Transform for the Convolutional Layer (International Conference on Machine Learning, 2023)
Hongyi Pan, Xin Zhu, S. Atici, Ahmet Enis Cetin
27 May 2023 · 207 · 26 · 0
Multichannel Orthogonal Transform-Based Perceptron Layers for Efficient ResNets (IEEE Transactions on Neural Networks and Learning Systems, 2023)
Hongyi Pan, Emadeldeen Hamdan, Xin Zhu, S. Atici, Ahmet Enis Cetin
13 Mar 2023 · 250 · 11 · 0
DCT Perceptron Layer: A Transform Domain Approach for Convolution Layer
Hongyi Pan, Xin Zhu, S. Atici, Ahmet Enis Cetin
15 Nov 2022 · 345 · 7 · 0
Multipod Convolutional Network
Hongyi Pan, S. Atici, Ahmet Enis Cetin
03 Oct 2022 · 164 · 2 · 0
Block Walsh-Hadamard Transform Based Binary Layers in Deep Neural Networks (ACM Transactions on Embedded Computing Systems, 2022)
Hongyi Pan, Diaa Badawi, Ahmet Enis Cetin
07 Jan 2022 · 345 · 26 · 0
ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions (European Conference on Computer Vision, 2020)
Zechun Liu, Zhiqiang Shen, Marios Savvides, Kwang-Ting Cheng
07 Mar 2020 · 681 · 416 · 0
Xception: Deep Learning with Depthwise Separable Convolutions (Computer Vision and Pattern Recognition, 2016)
François Chollet
07 Oct 2016 · 3.6K · 17,433 · 0