Bio-xLSTM: Generative modeling, representation and in-context learning of biological and chemical sequences

6 November 2024
Niklas Schmidinger
Lisa Schneckenreiter
Philipp Seidl
Johannes Schimunek
Pieter-Jan Hoedt
Johannes Brandstetter
Andreas Mayr
Sohvi Luukkonen
Sepp Hochreiter
G. Klambauer

Papers citing "Bio-xLSTM: Generative modeling, representation and in-context learning of biological and chemical sequences"

2 papers shown
Tiled Flash Linear Attention: More Efficient Linear RNN and xLSTM Kernels
M. Beck, Korbinian Poppel, Phillip Lippe, Sepp Hochreiter
18 Mar 2025
A Large Recurrent Action Model: xLSTM enables Fast Inference for Robotics Tasks
Thomas Schmied, Thomas Adler, Vihang Patil, M. Beck, Korbinian Poppel, Johannes Brandstetter, G. Klambauer, Razvan Pascanu, Sepp Hochreiter
21 Feb 2025