Alternatives to the Scaled Dot Product for Attention in the Transformer Neural Network Architecture

15 November 2023
James Bernhard

Papers citing "Alternatives to the Scaled Dot Product for Attention in the Transformer Neural Network Architecture"

1 / 1 papers shown

Title: Effective Approaches to Attention-based Neural Machine Translation
Authors: Thang Luong, Hieu H. Pham, Christopher D. Manning
Date: 17 Aug 2015