
Short window attention enables long-term memorization

29 September 2025
Loic Cabannes
Maximilian Beck
Gergely Szilvasy
Matthijs Douze
Maria Lomeli
Jade Copet
Pierre-Emmanuel Mazaré
Gabriel Synnaeve
Hervé Jégou
Main: 10 pages · Bibliography: 4 pages · Appendix: 2 pages · 6 figures · 4 tables
Abstract

Recent works show that hybrid architectures combining sliding-window softmax attention layers with linear recurrent neural network (RNN) layers outperform either architecture on its own. However, the impact of the window length and the interplay between softmax attention and linear RNN layers remain under-studied. In this work, we introduce SWAX, a hybrid architecture consisting of sliding-window attention and xLSTM linear RNN layers.
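To make the hybrid layout concrete, here is a minimal PyTorch sketch of one way to interleave sliding-window causal attention layers with a gated linear recurrence. The recurrence is a simplified stand-in for an xLSTM layer, and the module names, dimensions, and alternating layer pattern are illustrative assumptions, not the paper's actual SWAX implementation.

```python
# Minimal sketch of a hybrid stack: sliding-window attention layers
# alternating with a gated linear recurrence (an xLSTM-style stand-in).
# All sizes and the interleaving pattern are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SlidingWindowAttention(nn.Module):
    """Causal self-attention restricted to a fixed window of past tokens."""

    def __init__(self, dim: int, n_heads: int, window: int):
        super().__init__()
        self.n_heads, self.window = n_heads, window
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape to (batch, heads, time, head_dim).
        q, k, v = (t.view(B, T, self.n_heads, -1).transpose(1, 2) for t in (q, k, v))
        # Boolean mask: attend only to the previous `window` positions (causal).
        i = torch.arange(T, device=x.device)
        dist = i[:, None] - i[None, :]
        mask = (dist >= 0) & (dist < self.window)
        out = F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
        return self.proj(out.transpose(1, 2).reshape(B, T, D))


class GatedLinearRNN(nn.Module):
    """Simplified gated linear recurrence, used here as an xLSTM stand-in."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(dim, dim)
        self.inp = nn.Linear(dim, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, D = x.shape
        f = torch.sigmoid(self.gate(x))  # per-step forget gate
        z = self.inp(x)                  # candidate input
        h = torch.zeros(B, D, device=x.device, dtype=x.dtype)
        outs = []
        for t in range(T):               # sequential scan; real kernels parallelize this
            h = f[:, t] * h + (1 - f[:, t]) * z[:, t]
            outs.append(h)
        return self.proj(torch.stack(outs, dim=1))


class HybridStack(nn.Module):
    """Alternates short-window attention layers with linear-RNN layers."""

    def __init__(self, dim: int = 256, n_heads: int = 4, window: int = 64, n_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(
            [SlidingWindowAttention(dim, n_heads, window) if i % 2 == 0 else GatedLinearRNN(dim)
             for i in range(n_layers)]
        )
        self.norms = nn.ModuleList([nn.LayerNorm(dim) for _ in range(n_layers)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for norm, layer in zip(self.norms, self.layers):
            x = x + layer(norm(x))       # pre-norm residual blocks
        return x


if __name__ == "__main__":
    model = HybridStack()
    tokens = torch.randn(2, 128, 256)    # (batch, sequence, dim)
    print(model(tokens).shape)           # torch.Size([2, 128, 256])
```

In this sketch the attention layers supply precise access to recent context within the window, while the recurrent layers carry a compressed state across the full sequence; the window length and the attention/RNN layer ratio are the knobs the abstract identifies as under-studied.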
