Compressing Recurrent Neural Networks Using Hierarchical Tucker Tensor Decomposition

9 May 2020
Miao Yin, Siyu Liao, Xiao-Yang Liu, Xiaodong Wang, Bo Yuan
arXiv:2005.04366
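
The paper compresses RNN weight matrices by reshaping them into higher-order tensors and factorizing them in hierarchical Tucker (HT) format. As a rough illustration only, the sketch below applies a plain truncated Tucker decomposition (HOSVD) rather than the paper's HT algorithm; the reshape dimensions and ranks are arbitrary assumptions chosen for the example.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_hosvd(T, ranks):
    """Truncated higher-order SVD: core tensor + one factor per mode."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):   # core = T x_0 U0^T x_1 U1^T ...
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, m, 0), axes=1), 0, m)
    return core, factors

# Hypothetical RNN input-to-hidden weight matrix, reshaped into a
# 4th-order tensor so each mode can be compressed independently.
W = np.random.randn(1024, 256)        # 262,144 parameters
core, factors = tucker_hosvd(W.reshape(32, 32, 16, 16), ranks=(8, 8, 8, 8))
compressed = core.size + sum(U.size for U in factors)
print(f"{W.size} -> {compressed} parameters "
      f"({W.size / compressed:.0f}x compression)")
```

With these illustrative shapes the core plus factors hold about 4.9K parameters versus 262K for the dense matrix, roughly 54x fewer; the HT format in the paper goes further by factorizing the core itself into a binary tree of small transfer tensors.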

Papers citing "Compressing Recurrent Neural Networks Using Hierarchical Tucker Tensor Decomposition"

4 papers shown
Post-Training Network Compression for 3D Medical Image Segmentation: Reducing Computational Efforts via Tucker Decomposition
Tobias Weber, Jakob Dexl, David Rügamer, Michael Ingrisch
15 Apr 2024 · MedIm
HALOC: Hardware-Aware Automatic Low-Rank Compression for Compact Neural Networks
Jinqi Xiao, Chengming Zhang, Yu Gong, Miao Yin, Yang Sui, Lizhi Xiang, Dingwen Tao, Bo Yuan
20 Jan 2023
CHIP: CHannel Independence-based Pruning for Compact Neural Networks
Yang Sui, Miao Yin, Yi Xie, Huy Phan, S. Zonouz, Bo Yuan
26 Oct 2021 · VLM
3U-EdgeAI: Ultra-Low Memory Training, Ultra-Low Bitwidth Quantization, and Ultra-Low Latency Acceleration
Yao Chen, Cole Hawkins, Kaiqi Zhang, Zheng-Wei Zhang, Cong Hao
11 May 2021