ResearchTrend.AI

Structure-Preserving Network Compression Via Low-Rank Induced Training Through Linear Layers Composition (arXiv:2405.03089)

6 May 2024
Xitong Zhang
Ismail R. Alkhouri
Rongrong Wang
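The paper's title names its core idea: during training, a single linear layer is replaced by a composition of linear layers, which biases the learned weights toward low rank, so the composed weight can later be collapsed and factorized without changing the network's structure. The following is a minimal numpy sketch of that collapse-and-truncate step under stated assumptions; it is not the authors' implementation, and the dimensions and target rank `r` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for a trained composition of two linear layers:
# the pair (W1, W2) acts on inputs exactly like the single matrix W2 @ W1.
d_in, d_out = 64, 32
W1 = rng.normal(size=(d_in, d_in)) / np.sqrt(d_in)
W2 = rng.normal(size=(d_out, d_in)) / np.sqrt(d_in)
W = W2 @ W1  # collapse the composition into one weight matrix

# Truncated SVD of the collapsed weight gives a rank-r factorization,
# i.e. two thin linear layers that replace the original one.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
r = 8                                # hypothetical target rank
W_low = (U[:, :r] * s[:r]) @ Vt[:r]  # best rank-r approximation of W

# Parameter count before vs. after the low-rank replacement.
params_full = d_out * d_in
params_low = r * (d_in + d_out)
print(f"params: {params_full} -> {params_low}")
```

If training indeed drives the composed weight toward low rank, the truncation error is small and the two thin factors `U[:, :r] * s[:r]` and `Vt[:r]` serve as a drop-in compressed replacement for the original layer.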

Papers citing "Structure-Preserving Network Compression Via Low-Rank Induced Training Through Linear Layers Composition"

4 papers shown

Diffusion Models: A Comprehensive Survey of Methods and Applications
Ling Yang, Zhilong Zhang, Yingxia Shao, Shenda Hong, Runsheng Xu, Yue Zhao, Wentao Zhang, Bin Cui, Ming-Hsuan Yang
DiffM, MedIm · 208 · 1,277 · 0 · 02 Sep 2022

Initialization and Regularization of Factorized Neural Layers
M. Khodak, Neil A. Tenenholtz, Lester W. Mackey, Nicolò Fusi
63 · 56 · 0 · 03 May 2021

SCOP: Scientific Control for Reliable Neural Network Pruning
Yehui Tang, Yunhe Wang, Yixing Xu, Dacheng Tao, Chunjing Xu, Chao Xu, Chang Xu
AAML · 32 · 153 · 0 · 21 Oct 2020

Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression
Yawei Li, Shuhang Gu, Christoph Mayer, Luc Van Gool, Radu Timofte
117 · 186 · 0 · 19 Mar 2020