ResearchTrend.AI

Dual-branch Attention-In-Attention Transformer for single-channel speech enhancement
arXiv:2110.06467

13 October 2021
Guochen Yu, Andong Li, C. Zheng, Yinuo Guo, Yutian Wang, Hui Wang

Papers citing "Dual-branch Attention-In-Attention Transformer for single-channel speech enhancement"

2 papers shown

Dual-Path Transformer Network: Direct Context-Aware Modeling for End-to-End Monaural Speech Separation
Jing-jing Chen, Qi-rong Mao, Dong Liu
28 Jul 2020
Real-Time Single Image and Video Super-Resolution Using an Efficient Sub-Pixel Convolutional Neural Network
Wenzhe Shi, Jose Caballero, Ferenc Huszár, J. Totz, Andrew P. Aitken, Rob Bishop, Daniel Rueckert, Zehan Wang
16 Sep 2016