Promoting CNNs with Cross-Architecture Knowledge Distillation for Efficient Monocular Depth Estimation

25 April 2024
Zhimeng Zheng, Tao Huang, Gongsheng Li, Zuyi Wang
arXiv: 2404.16386

Papers citing "Promoting CNNs with Cross-Architecture Knowledge Distillation for Efficient Monocular Depth Estimation"

3 / 3 papers shown

Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures
Hongjun Wu, Li Xiao, Xingkuo Zhang, Yining Miao
28 May 2024

Deep Ordinal Regression Network for Monocular Depth Estimation
Huan Fu, Mingming Gong, Chaohui Wang, Kayhan Batmanghelich, Dacheng Tao
Community: MDE
06 Jun 2018

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
Community: 3DH
17 Apr 2017