Efficient Attention using a Fixed-Size Memory Representation
arXiv: 1707.00110 · 1 July 2017
D. Britz, M. Guan, Minh-Thang Luong
Community: 3DV
Papers citing "Efficient Attention using a Fixed-Size Memory Representation" (7 of 7 papers shown)

| Title | Authors | Tags | Likes | Citations | Comments | Date |
|---|---|---|---|---|---|---|
| LambdaNetworks: Modeling Long-Range Interactions Without Attention | Irwan Bello | | 281 | 179 | 0 | 17 Feb 2021 |
| Fact-based Text Editing | Hayate Iso, Chao Qiao, Hang Li | KELM | 39 | 23 | 0 | 02 Jul 2020 |
| Attentive Weakly Supervised land cover mapping for object-based satellite image time series data with spatial interpretation | Dino Ienco, Y. J. E. Gbodjo, R. Interdonato, R. Gaetano | | 26 | 2 | 0 | 30 Apr 2020 |
| Generating Long Sequences with Sparse Transformers | R. Child, Scott Gray, Alec Radford, Ilya Sutskever | | 11 | 1,848 | 0 | 23 Apr 2019 |
| Variational Memory Encoder-Decoder | Hung Le, T. Tran, Thin Nguyen, Svetha Venkatesh | VLM, DRL | 15 | 32 | 0 | 26 Jul 2018 |
| Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation | Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean | AIMat | 716 | 6,746 | 0 | 26 Sep 2016 |
| Effective Approaches to Attention-based Neural Machine Translation | Thang Luong, Hieu H. Pham, Christopher D. Manning | | 218 | 7,926 | 0 | 17 Aug 2015 |