To Distill or Not to Distill? On the Robustness of Robust Knowledge Distillation

6 June 2024
Abdul Waheed, Karima Kadaoui, Muhammad Abdul-Mageed
VLM
ArXiv · PDF · HTML

Papers citing "To Distill or Not to Distill? On the Robustness of Robust Knowledge Distillation"

FLEURS: Few-shot Learning Evaluation of Universal Representations of Speech
Alexis Conneau, Min Ma, Simran Khanuja, Yu Zhang, Vera Axelrod, Siddharth Dalmia, Jason Riesa, Clara E. Rivera, Ankur Bapna
VLM · 25 May 2022