ResearchTrend.AI

arXiv:2410.02113
Mamba Neural Operator: Who Wins? Transformers vs. State-Space Models for PDEs

3 October 2024
Chun-Wun Cheng
Jiahao Huang
Yi Zhang
Guang Yang
Carola-Bibiane Schönlieb
Angelica I. Aviles-Rivero
Topics: Mamba, AI4CE

Papers citing "Mamba Neural Operator: Who Wins? Transformers vs. State-Space Models for PDEs"

1 / 1 papers shown
Automatic selection of the best neural architecture for time series forecasting via multi-objective optimization and Pareto optimality conditions
Qianying Cao
Shanqing Liu
Alan John Varghese
Jérôme Darbon
M. Triantafyllou
George Karniadakis
Topics: AI4TS
21 Jan 2025