ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression

26 May 2023
Yixin Wan
Yuan-yuan Zhou
Xiulian Peng
Kai-Wei Chang
Yan Lu

Papers citing "ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression"

2 papers
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji
Byeongho Heo
Sungrae Park
05 Feb 2021
Interspeech 2021 Deep Noise Suppression Challenge
Chandan K. A. Reddy
Harishchandra Dubey
K. Koishida
A. Nair
Vishak Gopal
Ross Cutler
Sebastian Braun
H. Gamper
R. Aichner
Sriram Srinivasan
06 Jan 2021