arXiv: 2410.02113
Mamba Neural Operator: Who Wins? Transformers vs. State-Space Models for PDEs
3 October 2024
Chun-Wun Cheng
Jiahao Huang
Yi Zhang
Guang Yang
Carola-Bibiane Schönlieb
Angelica I. Aviles-Rivero
Tags: Mamba, AI4CE
Papers citing "Mamba Neural Operator: Who Wins? Transformers vs. State-Space Models for PDEs"
Automatic selection of the best neural architecture for time series forecasting via multi-objective optimization and Pareto optimality conditions
Qianying Cao
Shanqing Liu
Alan John Varghese
Jérôme Darbon
M. Triantafyllou
George Karniadakis
Tags: AI4TS
21 Jan 2025