Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank

1 March 2017
Liang Zhao, Siyu Liao, Yanzhi Wang, Zhe Li, Jian Tang, Victor Pan, Bo Yuan

Papers citing "Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank"

9 papers shown
Block Circulant Adapter for Large Language Models
Xinyu Ding, Meiqi Wang, Siyu Liao, Zhongfeng Wang
01 May 2025
A Hardware-Efficient Photonic Tensor Core: Accelerating Deep Neural Networks with Structured Compression
Shupeng Ning, Hanqing Zhu, Chenghao Feng, Jiaqi Gu, David Z. Pan, Ray T. Chen
01 Feb 2025
A Secure and Efficient Federated Learning Framework for NLP
Jieren Deng, Chenghong Wang, Xianrui Meng, Yijue Wang, Ji Li, Sheng Lin, Shuo Han, Fei Miao, Sanguthevar Rajasekaran, Caiwen Ding
FedML
28 Jan 2022
REQ-YOLO: A Resource-Aware, Efficient Quantization Framework for Object Detection on FPGAs
Caiwen Ding, Shuo Wang, Ning Liu, Kaidi Xu, Yanzhi Wang, Yun Liang
MQ
29 Sep 2019
Understanding and Training Deep Diagonal Circulant Neural Networks
Alexandre Araujo, Benjamin Négrevergne, Y. Chevaleyre, Jamal Atif
29 Jan 2019
E-RNN: Design Optimization for Efficient Recurrent Neural Networks in FPGAs
Zhe Li, Caiwen Ding, Siyue Wang, Wujie Wen, Youwei Zhuo, ..., Qinru Qiu, Wenyao Xu, X. Lin, Xuehai Qian, Yanzhi Wang
MQ
12 Dec 2018
Learning Compressed Transforms with Low Displacement Rank
Anna T. Thomas, Albert Gu, Tri Dao, Atri Rudra, Christopher Ré
04 Oct 2018
FFT-Based Deep Learning Deployment in Embedded Systems
Sheng Lin, Ning Liu, M. Nazemi, Hongjia Li, Caiwen Ding, Yanzhi Wang, Massoud Pedram
13 Dec 2017
Benefits of depth in neural networks
Matus Telgarsky
14 Feb 2016