MultiResFormer: Transformer with Adaptive Multi-Resolution Modeling for General Time Series Forecasting

30 November 2023
Linfeng Du
Ji Xin
Alex Labach
S. Zuberi
M. Volkovs
Rahul G. Krishnan
    AI4TS
Abstract

Transformer-based models have greatly pushed the boundaries of time series forecasting recently. Existing methods typically encode time series data into patches using one or a fixed set of patch lengths. This, however, can limit their ability to capture the variety of intricate temporal dependencies present in real-world multi-periodic time series. In this paper, we propose MultiResFormer, which dynamically models temporal variations by adaptively choosing optimal patch lengths. Concretely, at the beginning of each layer, time series data is encoded into several parallel branches, each using a detected periodicity, before going through the transformer encoder block. We conduct extensive evaluations on long- and short-term forecasting datasets comparing MultiResFormer with state-of-the-art baselines. MultiResFormer outperforms patch-based Transformer baselines on long-term forecasting tasks and also consistently outperforms CNN baselines by a large margin, while using far fewer parameters than these baselines.
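The abstract describes detecting periodicities in the input and patching the series at each detected period in parallel branches. As a rough illustration only, here is a minimal NumPy sketch of that idea, assuming FFT-based period detection and non-overlapping patching; the function names and the authors' exact detection and encoding details are assumptions, not the paper's implementation:

```python
import numpy as np

def detect_periods(x, k=3):
    """Estimate the k dominant periods of a 1-D series from its FFT
    amplitude spectrum (a common heuristic; the paper's exact
    detection rule may differ)."""
    amps = np.abs(np.fft.rfft(x - x.mean()))
    amps[0] = 0.0                        # ignore the DC component
    top_bins = np.argsort(amps)[::-1][:k]  # top-k frequency bins
    return [len(x) // b for b in top_bins if b > 0]

def patchify(x, patch_len):
    """Split a series into non-overlapping patches of length patch_len,
    truncating any remainder at the end."""
    n = len(x) // patch_len
    return x[: n * patch_len].reshape(n, patch_len)

# Each detected period defines the patch length of one parallel branch;
# in the model, every branch would then pass through the same
# transformer encoder block.
t = np.arange(512)
x = np.sin(2 * np.pi * t / 32) + 0.5 * np.sin(2 * np.pi * t / 8)
branches = {p: patchify(x, p) for p in detect_periods(x, k=2)}
```

For the synthetic two-period signal above, the detector recovers patch lengths of 32 and 8, so one branch attends over coarse patches and the other over fine ones.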

View on arXiv: 2311.18780