
HOTCAKE: Higher Order Tucker Articulated Kernels for Deeper CNN Compression
arXiv:2002.12663

IEEE International Conference on Solid-State and Integrated Circuit Technology (ICSICT), 2020
28 February 2020
R. Lin
Ching-Yun Ko
Zhuolun He
Cong Chen
Yuan Cheng
Hao Yu
G. Chesi
Ngai Wong

Papers citing "HOTCAKE: Higher Order Tucker Articulated Kernels for Deeper CNN Compression"

3 citing papers shown.
Scalable Efficient Training of Large Language Models with Low-dimensional Projected Attention
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2024
Xingtai Lv, Ning Ding, Kaiyan Zhang, Ermo Hua, Ganqu Cui, Bowen Zhou
04 Nov 2024
ELRT: Efficient Low-Rank Training for Compact Convolutional Neural Networks
Yang Sui, Miao Yin, Yu Gong, Jinqi Xiao, Huy Phan, Bo Yuan
18 Jan 2024
ReLoRA: High-Rank Training Through Low-Rank Updates
International Conference on Learning Representations (ICLR), 2023
Vladislav Lialin, Namrata Shivagunde, Sherin Muckatira, Anna Rumshisky
11 Jul 2023