ResearchTrend.AI

Not All Attention Is Needed: Gated Attention Network for Sequence Data
arXiv:1912.00349 · 1 December 2019
Lanqing Xue, Xiaopeng Li, N. Zhang

Papers citing "Not All Attention Is Needed: Gated Attention Network for Sequence Data" (4 papers)
  1. Graph-to-Text Generation with Dynamic Structure Pruning — Liang Li, Ruiying Geng, Bowen Li, Can Ma, Yinliang Yue, Binhua Li, Yongbin Li (15 Sep 2022)
  2. Generating Diversified Comments via Reader-Aware Topic Modeling and Saliency Detection — Wei Wang, Piji Li, Haitao Zheng (13 Feb 2021)
  3. On Learning the Right Attention Point for Feature Enhancement — Liqiang Lin, Pengdi Huang, Chi-Wing Fu, Kai Xu, Hao Zhang, Hui Huang (11 Dec 2020)
  4. Effective Approaches to Attention-based Neural Machine Translation — Thang Luong, Hieu H. Pham, Christopher D. Manning (17 Aug 2015)