ResearchTrend.AI

arXiv: 2308.16678
Dynamic nsNet2: Efficient Deep Noise Suppression with Early Exiting

International Workshop on Machine Learning for Signal Processing (MLSP), 2023
31 August 2023
Riccardo Miccini
Alaa Zniber
Clément Laroche
Tobias Piechowiak
Martin Schoeberl
Luca Pezzarossa
Ouassim Karrakchou
J. Sparsø
Mounir Ghogho
arXiv (abs) · PDF · HTML
Abstract

Although deep learning has made strides in the field of deep noise suppression, deploying deep architectures on resource-constrained devices remains challenging. We therefore present an early-exiting model based on nsNet2 that provides several levels of accuracy and resource savings by halting computation at different stages. Moreover, we adapt the original architecture by splitting the information flow to account for the injected dynamism. We show the trade-offs between performance and computational complexity based on established metrics.
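The core idea of early exiting described in the abstract — running a model stage by stage and stopping once a compute budget is exhausted, with each stage offering its own output — can be illustrated with a minimal sketch. This is a hypothetical toy, not the paper's nsNet2 architecture or code: the stage/exit-head functions, cost values, and budget policy below are all illustrative assumptions.

```python
# Minimal early-exit sketch (hypothetical; not the paper's nsNet2 code).
# A model is a sequence of (stage, exit_head, cost) triples. Stages run
# in order; after each stage its exit head produces a usable output.
# Computation halts before any stage whose cost would exceed the budget,
# trading accuracy (fewer stages) for resource savings.

from typing import Callable, List, Tuple

Stage = Callable[[list], list]     # transforms the running hidden signal
ExitHead = Callable[[list], list]  # maps hidden signal to an output

def run_with_early_exit(
    x: list,
    stages: List[Tuple[Stage, ExitHead, float]],
    budget: float,
) -> Tuple[list, int, float]:
    """Run stages until the compute budget is spent.

    Returns (output of the last reachable exit head,
             number of stages executed, compute actually spent)."""
    spent = 0.0
    hidden = x
    out = x          # fallback if even the first stage is unaffordable
    executed = 0
    for stage, head, cost in stages:
        if spent + cost > budget:
            break    # early exit: next stage would bust the budget
        hidden = stage(hidden)
        out = head(hidden)
        spent += cost
        executed += 1
    return out, executed, spent

# Toy stages: each scales the signal; exit heads clamp to [-1, 1].
def make_stage(gain: float) -> Stage:
    return lambda v: [gain * s for s in v]

clamp: ExitHead = lambda v: [max(-1.0, min(1.0, s)) for s in v]

stages = [
    (make_stage(0.9), clamp, 1.0),
    (make_stage(0.8), clamp, 1.0),
    (make_stage(0.7), clamp, 1.0),
]

# With budget=2.0 only the first two stages run, so the second exit
# head's output is returned.
y, n_stages, cost = run_with_early_exit([0.5, -2.0], stages, budget=2.0)
```

Raising the budget lets more stages run (typically improving quality), while lowering it saves computation — the trade-off the abstract refers to. The paper's model additionally splits the information flow between stages to support this dynamism; that adaptation is not modeled in this sketch.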
