Are Neighbors Enough? Multi-Head Neural n-gram can be Alternative to Self-attention

27 July 2022
Mengsay Loem
Sho Takase
Masahiro Kaneko
Naoaki Okazaki
arXiv: 2207.13354
Abstract

The impressive performance of the Transformer has been attributed to self-attention, in which dependencies across the entire input sequence are considered at every position. In this work, we reform the neural n-gram model, which focuses on only a few surrounding representations at each position, with the multi-head mechanism as in Vaswani et al. (2017). Through experiments on sequence-to-sequence tasks, we show that replacing self-attention in the Transformer with multi-head neural n-gram achieves comparable or better performance than the Transformer. From various analyses of our proposed method, we find that multi-head neural n-gram is complementary to self-attention, and that combining them can further improve the performance of the vanilla Transformer.
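The abstract describes replacing self-attention with a multi-head neural n-gram layer that, at each position, mixes only a small window of neighboring representations. The sketch below is one plausible reading of such a layer in PyTorch: each head combines the n most recent representations with learned window weights. The class name, window definition, and mixing scheme are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadNeuralNGram(nn.Module):
    """Hypothetical sketch of a multi-head neural n-gram layer.

    Each position sees only a fixed window of the n most recent positions
    (including itself), with a separate learned mixing weight per head and
    window offset, instead of attending over the whole sequence.
    """

    def __init__(self, d_model: int, num_heads: int, window: int = 4):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        self.window = window
        self.in_proj = nn.Linear(d_model, d_model)
        # One mixing weight per head per window offset.
        self.mix = nn.Parameter(torch.zeros(num_heads, window))
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        B, T, _ = x.shape
        h = self.in_proj(x).view(B, T, self.num_heads, self.head_dim)
        # Left-pad along the time axis so every position has a full window.
        h = F.pad(h, (0, 0, 0, 0, self.window - 1, 0))
        # Stack the window of preceding representations for each position:
        # windows[:, t, k] holds the representation at position t - (window - 1) + k.
        windows = torch.stack(
            [h[:, k:k + T] for k in range(self.window)], dim=2
        )  # (B, T, window, heads, head_dim)
        # Normalize the per-head window weights and mix the neighbors.
        weights = torch.softmax(self.mix, dim=-1)  # (heads, window)
        mixed = torch.einsum("btkhd,hk->bthd", windows, weights)
        return self.out_proj(mixed.reshape(B, T, -1))


# Example: drop-in replacement for a self-attention sublayer.
layer = MultiHeadNeuralNGram(d_model=512, num_heads=8, window=4)
out = layer(torch.randn(2, 10, 512))  # -> (2, 10, 512)
```

Because each position only reads a fixed-size local window, the cost per layer grows linearly in sequence length rather than quadratically as in full self-attention; the paper's finding that the two are complementary suggests combining such a local layer with standard attention rather than choosing one exclusively.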
