RPMixer: Shaking Up Time Series Forecasting with Random Projections for Large Spatial-Temporal Data

Chin-Chia Michael Yeh
Yujie Fan
Xin Dai
Uday Singh Saini
Vivian Lai
Prince Osei Aboagye
Junpeng Wang
Huiyuan Chen
Yan Zheng
Zhongfang Zhuang
Liang Wang
Wei Zhang
Abstract

Spatial-temporal forecasting systems play a crucial role in addressing numerous real-world challenges. In this paper, we investigate the potential of addressing spatial-temporal forecasting problems with general time series forecasting models, i.e., models that do not leverage the spatial relationships among the nodes. We propose an all-Multi-Layer Perceptron (all-MLP) time series forecasting architecture called RPMixer. The all-MLP architecture was chosen due to its recent success in time series forecasting benchmarks. Furthermore, our method capitalizes on the ensemble-like behavior of deep neural networks, where each individual block within the network behaves like a base learner in an ensemble model, particularly when identity mapping residual connections are incorporated. By integrating random projection layers into our model, we increase the diversity among the blocks' outputs, thereby improving the overall performance of the network. Extensive experiments conducted on the largest spatial-temporal forecasting benchmark datasets demonstrate that the proposed method outperforms alternative methods, including both spatial-temporal graph models and general forecasting models.
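To make the idea of combining random projections with identity-mapping residual MLP blocks concrete, here is a minimal PyTorch sketch. It is an illustration of the general technique described in the abstract, not the paper's actual RPMixer implementation; the block structure, dimensions, and class names are assumptions.

```python
import torch
import torch.nn as nn

class RandomProjectionBlock(nn.Module):
    """Illustrative block: a fixed random projection followed by an MLP,
    wrapped in an identity-mapping residual connection (hypothetical design)."""
    def __init__(self, dim: int, proj_dim: int, hidden_dim: int):
        super().__init__()
        # Fixed (non-trainable) random projection; each block draws its own
        # matrix, which diversifies the blocks' outputs.
        self.register_buffer("proj", torch.randn(dim, proj_dim) / proj_dim ** 0.5)
        self.mlp = nn.Sequential(
            nn.Linear(proj_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity residual: each block adds a correction to its input,
        # so the stack behaves like an ensemble of base learners.
        return x + self.mlp(x @ self.proj)


# Toy usage: a stack of blocks applied to a batch of flattened series windows.
model = nn.Sequential(*[RandomProjectionBlock(96, 32, 128) for _ in range(4)])
out = model(torch.randn(8, 96))  # (batch, window length)
```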
