- Two-Step Knowledge Distillation for Tiny Speech Enhancement. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2023.
- ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression. Interspeech, 2023.
- Multi-View Attention Transfer for Efficient Speech Enhancement. Interspeech, 2022.
- Inference Skipping for More Efficient Real-Time Speech Enhancement with Parallel RNNs. IEEE/ACM Transactions on Audio, Speech, and Language Processing (TASLP), 2022.