Low Rank Factorization for Compact Multi-Head Self-Attention

26 November 2019
Sneha Mehta, Huzefa Rangwala, Naren Ramakrishnan
arXiv: 1912.00835
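
For readers skimming the listing, a rough sense of what the title refers to: below is a minimal PyTorch sketch of multi-head self-attention whose query/key/value projections are each factorized into a product of two low-rank matrices, so a d_model × d_model projection costs 2·d_model·r parameters instead of d_model². This sketch is based on the title alone; the class names, the `rank` hyperparameter, and where the factorization is applied are illustrative assumptions, not the method described in the paper.

```python
# Illustrative sketch only: low-rank factorized attention projections.
# All names (LowRankLinear, rank, etc.) are hypothetical, not from the paper.
import math
import torch
import torch.nn as nn


class LowRankLinear(nn.Module):
    """A d_in -> d_out map factorized as (d_in x r)(r x d_out), with r << d."""
    def __init__(self, d_in, d_out, rank):
        super().__init__()
        self.u = nn.Linear(d_in, rank, bias=False)   # d_in -> r
        self.v = nn.Linear(rank, d_out, bias=False)  # r -> d_out

    def forward(self, x):
        return self.v(self.u(x))


class LowRankMultiHeadAttention(nn.Module):
    def __init__(self, d_model, n_heads, rank):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Replace full-rank Q/K/V projections with low-rank factorizations.
        self.q = LowRankLinear(d_model, d_model, rank)
        self.k = LowRankLinear(d_model, d_model, rank)
        self.v = LowRankLinear(d_model, d_model, rank)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):
        b, t, d = x.shape

        def split(z):  # (b, t, d) -> (b, heads, t, d_head)
            return z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v = split(self.q(x)), split(self.k(x)), split(self.v(x))
        # Standard scaled dot-product attention per head.
        attn = torch.softmax(q @ k.transpose(-2, -1) / math.sqrt(self.d_head), dim=-1)
        ctx = (attn @ v).transpose(1, 2).reshape(b, t, d)
        return self.out(ctx)


x = torch.randn(2, 16, 128)  # (batch, seq_len, d_model)
mha = LowRankMultiHeadAttention(d_model=128, n_heads=8, rank=16)
print(mha(x).shape)          # torch.Size([2, 16, 128])
```

With d_model = 128 and rank = 16, each factorized projection uses 2 · 128 · 16 = 4,096 parameters instead of 128² = 16,384, which is the kind of compactness the title suggests; the paper itself may factorize a different component of attention.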

Papers citing "Low Rank Factorization for Compact Multi-Head Self-Attention" (3 of 3 shown)

  • A Decomposable Attention Model for Natural Language Inference. Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit. 1,367 citations. 06 Jun 2016.
  • Effective Approaches to Attention-based Neural Machine Translation. Thang Luong, Hieu H. Pham, Christopher D. Manning. 7,923 citations. 17 Aug 2015.
  • Convolutional Neural Networks for Sentence Classification. Yoon Kim. 13,360 citations. 25 Aug 2014.