arXiv: 2406.01424
Universal In-Context Approximation By Prompting Fully Recurrent Models
3 June 2024
Aleksandar Petrov, Tom A. Lamb, Alasdair Paren, Philip H. S. Torr, Adel Bibi
Tags: LRM
Papers citing "Universal In-Context Approximation By Prompting Fully Recurrent Models" (3 of 3 papers shown)
Griffin: Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models
Soham De, Samuel L. Smith, Anushan Fernando, Aleksandar Botev, George-Christian Muraru, ..., David Budden, Yee Whye Teh, Razvan Pascanu, Nando de Freitas, Çağlar Gülçehre
Tags: Mamba
29 Feb 2024
Resurrecting Recurrent Neural Networks for Long Sequences
Antonio Orvieto, Samuel L. Smith, Albert Gu, Anushan Fernando, Çağlar Gülçehre, Razvan Pascanu, Soham De
11 Mar 2023
Finding Alignments Between Interpretable Causal Variables and Distributed Neural Representations
Atticus Geiger, Zhengxuan Wu, Christopher Potts, Thomas F. Icard, Noah D. Goodman
Tags: CML
05 Mar 2023