ResearchTrend.AI

Speeding up Resnet Architecture with Layers Targeted Low Rank Decomposition

arXiv: 2309.12412

21 September 2023
Walid Ahmed
Habib Hajimolahoseini
Austin Wen
Yang Liu

Papers citing "Speeding up Resnet Architecture with Layers Targeted Low Rank Decomposition"

2 papers shown
Training Acceleration of Low-Rank Decomposed Networks using Sequential Freezing and Rank Quantization
Habib Hajimolahoseini
Walid Ahmed
Yang Liu
Tags: OffRL, MQ
07 Sep 2023
A Short Study on Compressing Decoder-Based Language Models
Tianda Li
Yassir El Mesbahi
I. Kobyzev
Ahmad Rashid
A. Mahmud
Nithin Anchuri
Habib Hajimolahoseini
Yang Liu
Mehdi Rezagholizadeh
16 Oct 2021