PermuteFormer: Efficient Relative Position Encoding for Long Sequences
6 September 2021
Peng-Jen Chen
Papers citing "PermuteFormer: Efficient Relative Position Encoding for Long Sequences" (7 of 7 papers shown)

ContextSpeech: Expressive and Efficient Text-to-Speech for Paragraph Reading
Yujia Xiao, Shaofei Zhang, Xi Wang, Xuejiao Tan, Lei He, Sheng Zhao, Frank Soong, Tan Lee
03 Jul 2023

Efficient Attention via Control Variates
Lin Zheng, Jianbo Yuan, Chong-Jun Wang, Lingpeng Kong
09 Feb 2023

Lightweight Structure-Aware Attention for Visual Understanding
Heeseung Kwon, F. M. Castro, M. Marín-Jiménez, N. Guil, Alahari Karteek
29 Nov 2022

Word Order Matters when you Increase Masking
Karim Lasri, Alessandro Lenci, Thierry Poibeau
08 Nov 2022

Neural Architectures for Biological Inter-Sentence Relation Extraction
Enrique Noriega-Atala, Peter Lovett, Clayton T. Morrison, Mihai Surdeanu
17 Dec 2021

Ripple Attention for Visual Perception with Sub-quadratic Complexity
Lin Zheng, Huijie Pan, Lingpeng Kong
06 Oct 2021

Big Bird: Transformers for Longer Sequences
Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed
28 Jul 2020