DUNE: Distilling a Universal Encoder from Heterogeneous 2D and 3D Teachers

18 March 2025
Mert Bulent Sariyildiz, Philippe Weinzaepfel, Thomas Lucas, Pau de Jorge, Diane Larlus, Yannis Kalantidis
Abstract

Recent multi-teacher distillation methods have unified the encoders of multiple foundation models into a single encoder, achieving competitive performance on core vision tasks like classification, segmentation, and depth estimation. This led us to ask: Could similar success be achieved when the pool of teachers also includes vision models specialized in diverse tasks across both 2D and 3D perception? In this paper, we define and investigate the problem of heterogeneous teacher distillation, or co-distillation, a challenging multi-teacher distillation scenario where teacher models vary significantly in both (a) their design objectives and (b) the data they were trained on. We explore data-sharing strategies and teacher-specific encoding, and introduce DUNE, a single encoder excelling in 2D vision, 3D understanding, and 3D human perception. Our model achieves performance comparable to that of its larger teachers, sometimes even outperforming them, on their respective tasks. Notably, DUNE surpasses MASt3R in Map-free Visual Relocalization with a much smaller encoder.

@article{sariyildiz2025_2503.14405,
  title={DUNE: Distilling a Universal Encoder from Heterogeneous 2D and 3D Teachers},
  author={Mert Bulent Sariyildiz and Philippe Weinzaepfel and Thomas Lucas and Pau de Jorge and Diane Larlus and Yannis Kalantidis},
  journal={arXiv preprint arXiv:2503.14405},
  year={2025}
}