Distilling Inductive Bias: Knowledge Distillation Beyond Model Compression (arXiv:2310.00369)

30 September 2023
Gousia Habib
Tausifa Jan Saleem
Brejesh Lall

Papers citing "Distilling Inductive Bias: Knowledge Distillation Beyond Model Compression"

2 papers shown

MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
Sachin Mehta, Mohammad Rastegari
05 Oct 2021
Transformers in Vision: A Survey
Salman Khan, Muzammal Naseer, Munawar Hayat, Syed Waqas Zamir, F. Khan, M. Shah
04 Jan 2021