Filter Distillation for Network Compression

20 July 2018
Xavier Suau, Luca Zappella, N. Apostoloff

Papers citing "Filter Distillation for Network Compression" (5 papers)
Compress and Compare: Interactively Evaluating Efficiency and Behavior Across ML Model Compression Experiments
Angie Boggust, Venkatesh Sivaraman, Yannick Assogba, Donghao Ren, Dominik Moritz, Fred Hohman
06 Aug 2024

Sauron U-Net: Simple automated redundancy elimination in medical image segmentation via filter pruning
Juan Miguel Valverde, Artem Shatillo, Jussi Tohka
27 Sep 2022

The Combinatorial Brain Surgeon: Pruning Weights That Cancel One Another in Neural Networks
Xin Yu, Thiago Serra, Srikumar Ramalingam, Shandian Zhe
09 Mar 2022

Toward Compact Deep Neural Networks via Energy-Aware Pruning
Seul-Ki Yeom, Kyung-Hwan Shim, Jee-Hyun Hwang
19 Mar 2021

T-Basis: a Compact Representation for Neural Networks
Anton Obukhov, M. Rakhuba, Stamatios Georgoulis, Menelaos Kanakis, Dengxin Dai, Luc Van Gool
13 Jul 2020