Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale

4 July 2023 · arXiv:2307.01482
Tong Nie, Guoyang Qin, Lijun Sun, Wei Ma, Yuewen Mei, Jiangming Sun
Abstract

Spatiotemporal urban data (STUD) displays complex correlational patterns, and numerous advanced techniques have been designed to capture these patterns for effective forecasting. However, because STUD is often massive in scale, practitioners must balance effectiveness against efficiency by choosing computationally efficient models. An alternative paradigm, the MLP-Mixer, offers both simplicity and the potential for strong performance. Taking inspiration from its success in other domains, we propose an adapted version, named NexuSQN, for STUD forecasting at scale. We identify series- and window-wise multivaluedness as the key challenges in directly applying MLP-Mixers, and propose ST-contextualization to distinguish spatial from temporal patterns. Experiments on several urban benchmarks show, somewhat surprisingly, that MLP-Mixers with ST-contextualization can rival state-of-the-art performance. Furthermore, the model was deployed in a collaborative urban congestion project with Baidu, where its ability to forecast traffic states in megacities such as Beijing and Shanghai was evaluated. Our findings contribute to the exploration of simple yet effective models for real-world STUD forecasting.
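The abstract does not spell out the NexuSQN architecture, so the sketch below only illustrates the general idea it describes: an MLP-Mixer style forecaster in which learnable spatial (per-node) and temporal (time-of-day) context embeddings are appended to each series before channel mixing, so that identical input windows from different sensors or time slots are no longer forced to the same output (the "multivaluedness" issue). All module and parameter names here (STContextMixer, node_emb, tod_emb, emb_dim, steps_per_day) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of ST-contextualized channel mixing, assuming a PyTorch setup.
import torch
import torch.nn as nn


class STContextMixer(nn.Module):
    def __init__(self, num_nodes: int, in_len: int, out_len: int,
                 hidden: int = 64, emb_dim: int = 16, steps_per_day: int = 288):
        super().__init__()
        # Learnable spatial context: one embedding per sensor/node.
        self.node_emb = nn.Parameter(torch.randn(num_nodes, emb_dim) * 0.01)
        # Learnable temporal context: one embedding per time-of-day slot.
        self.tod_emb = nn.Parameter(torch.randn(steps_per_day, emb_dim) * 0.01)
        # Token (temporal) mixing across the input window.
        self.time_mixer = nn.Sequential(
            nn.LayerNorm(in_len), nn.Linear(in_len, hidden),
            nn.GELU(), nn.Linear(hidden, in_len))
        # Channel mixing over [window features | spatial context | temporal context].
        d = in_len + 2 * emb_dim
        self.channel_mixer = nn.Sequential(
            nn.LayerNorm(d), nn.Linear(d, hidden),
            nn.GELU(), nn.Linear(hidden, hidden))
        self.head = nn.Linear(hidden, out_len)

    def forward(self, x: torch.Tensor, tod_index: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_len) traffic readings
        # tod_index: (batch,) integer time-of-day slot at the window start
        b, n, _ = x.shape
        x = x + self.time_mixer(x)                          # temporal mixing
        s_ctx = self.node_emb.unsqueeze(0).expand(b, -1, -1)
        t_ctx = self.tod_emb[tod_index].unsqueeze(1).expand(-1, n, -1)
        h = torch.cat([x, s_ctx, t_ctx], dim=-1)            # ST-contextualization
        h = self.channel_mixer(h)                           # channel mixing
        return self.head(h)                                 # (batch, num_nodes, out_len)


if __name__ == "__main__":
    model = STContextMixer(num_nodes=207, in_len=12, out_len=12)
    x = torch.randn(8, 207, 12)
    tod = torch.randint(0, 288, (8,))
    print(model(x, tod).shape)  # torch.Size([8, 207, 12])
```

Concatenating the context embeddings (rather than, say, adding them) keeps the mixer purely MLP-based while still letting the channel-mixing layers condition on which node and time slot produced the window; the paper's actual conditioning mechanism may differ.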
