ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

arXiv: 2312.06253
Transformer Attractors for Robust and Efficient End-to-End Neural Diarization

11 December 2023
Lahiru Samarakoon, Samuel J. Broughton, Marc Härkönen, Ivan Fung

Papers citing "Transformer Attractors for Robust and Efficient End-to-End Neural Diarization"

5 papers shown
Neural Diarization with Non-autoregressive Intermediate Attractors
Yusuke Fujita, Tatsuya Komatsu, Robin Scheibler, Yusuke Kida, Tetsuji Ogawa
13 Mar 2023
Auxiliary Loss of Transformer with Residual Connection for End-to-End Speaker Diarization
Yechan Yu, Dongkeon Park, H. Kim
14 Oct 2021
A Review of Speaker Diarization: Recent Advances with Deep Learning
Tae Jin Park, Naoyuki Kanda, Dimitrios Dimitriadis, Kyu Jeong Han, Shinji Watanabe, Shrikanth Narayanan
24 Jan 2021
End-to-End Neural Speaker Diarization with Self-attention
Yusuke Fujita, Naoyuki Kanda, Shota Horiguchi, Yawen Xue, Kenji Nagamatsu, Shinji Watanabe
13 Sep 2019
End-to-End Neural Speaker Diarization with Permutation-Free Objectives
Yusuke Fujita, Naoyuki Kanda, Shota Horiguchi, Kenji Nagamatsu, Shinji Watanabe
12 Sep 2019