ResearchTrend.AI

Cascaded channel pruning using hierarchical self-distillation

Roy Miles, K. Mikolajczyk
16 August 2020 · arXiv:2008.06814

Papers citing "Cascaded channel pruning using hierarchical self-distillation"

  • Learning to Project for Cross-Task Knowledge Distillation
    Dylan Auty, Roy Miles, Benedikt Kolbeinsson, K. Mikolajczyk
    21 Mar 2024
  • Deep Ordinal Regression Network for Monocular Depth Estimation
    Huan Fu, Mingming Gong, Chaohui Wang, Kayhan Batmanghelich, Dacheng Tao
    06 Jun 2018