arXiv: 1807.10585
Filter Distillation for Network Compression
20 July 2018
Xavier Suau, Luca Zappella, N. Apostoloff
Papers citing "Filter Distillation for Network Compression" (5 papers):
Compress and Compare: Interactively Evaluating Efficiency and Behavior Across ML Model Compression Experiments
Angie Boggust, Venkatesh Sivaraman, Yannick Assogba, Donghao Ren, Dominik Moritz, Fred Hohman
06 Aug 2024
Sauron U-Net: Simple automated redundancy elimination in medical image segmentation via filter pruning
Juan Miguel Valverde, Artem Shatillo, Jussi Tohka
27 Sep 2022
The Combinatorial Brain Surgeon: Pruning Weights That Cancel One Another in Neural Networks
Xin Yu, Thiago Serra, Srikumar Ramalingam, Shandian Zhe
09 Mar 2022
Toward Compact Deep Neural Networks via Energy-Aware Pruning
Seul-Ki Yeom, Kyung-Hwan Shim, Jee-Hyun Hwang
19 Mar 2021
T-Basis: a Compact Representation for Neural Networks
Anton Obukhov, M. Rakhuba, Stamatios Georgoulis, Menelaos Kanakis, Dengxin Dai, Luc Van Gool
13 Jul 2020