
RRA: Recurrent Residual Attention for Sequence Learning
Cheng Wang
arXiv:1709.03714, 12 September 2017

Papers citing "RRA: Recurrent Residual Attention for Sequence Learning" (2 papers shown):

  • Hinted Networks. J. Lamy-Poirier, Anqi Xu. 15 Dec 2018.
  • Self-Attentive Residual Decoder for Neural Machine Translation. Lesly Miculicich, Nikolaos Pappas, Dhananjay Ram, Andrei Popescu-Belis. 14 Sep 2017.