DropoutTS: Sample-Adaptive Dropout for Robust Time Series Forecasting

Siru Zhong
Yiqiu Liu
Zhiqing Cui
Zezhi Shao
Fei Wang
Qingsong Wen
Yuxuan Liang
Main: 8 pages · Appendix: 10 pages · Bibliography: 2 pages · 17 figures · 13 tables
Abstract

Deep time series models are vulnerable to the noisy data ubiquitous in real-world applications. Existing robustness strategies either prune data or rely on costly prior noise quantification, failing to balance effectiveness and efficiency. In this paper, we introduce DropoutTS, a model-agnostic plugin that shifts the paradigm from "what" to learn to "how much" to learn. DropoutTS employs a Sample-Adaptive Dropout mechanism: leveraging spectral sparsity, it efficiently quantifies instance-level noise via reconstruction residuals, then dynamically calibrates model learning capacity by mapping noise to adaptive dropout rates - selectively suppressing spurious fluctuations while preserving fine-grained fidelity. Extensive experiments across diverse noise regimes and open benchmarks show that DropoutTS consistently improves the performance of strong backbones, delivering advanced robustness with negligible parameter overhead and no architectural modifications. Our code is available at this https URL.
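The core idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the top-k spectral reconstruction, and the linear noise-to-rate mapping are all assumptions chosen to mirror the described mechanism (spectral sparsity, reconstruction residuals, adaptive dropout rates):

```python
import numpy as np

def noise_score(x, k=4):
    # Proxy for instance-level noise: energy of the residual after
    # reconstructing x from its top-k Fourier components
    # (spectral-sparsity assumption; k is a hypothetical parameter).
    spec = np.fft.rfft(x)
    keep = np.argsort(np.abs(spec))[-k:]      # indices of top-k components
    sparse = np.zeros_like(spec)
    sparse[keep] = spec[keep]
    recon = np.fft.irfft(sparse, n=len(x))
    resid = x - recon
    return np.sum(resid**2) / (np.sum(x**2) + 1e-12)

def adaptive_dropout_rate(score, p_min=0.0, p_max=0.5):
    # Assumed mapping: clip the noise score to [0, 1] and interpolate
    # linearly between a minimum and maximum dropout rate.
    return p_min + (p_max - p_min) * np.clip(score, 0.0, 1.0)

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 128, endpoint=False)
clean = np.sin(t)                              # spectrally sparse signal
noisy = clean + 0.5 * rng.standard_normal(128) # same signal plus noise

p_clean = adaptive_dropout_rate(noise_score(clean))
p_noisy = adaptive_dropout_rate(noise_score(noisy))
# Noisier instances receive a higher dropout rate, so the model's
# learning capacity is suppressed more on unreliable samples.
```

In this sketch, the clean sinusoid reconstructs almost perfectly from a few Fourier components, yielding a near-zero noise score, while the noisy instance leaves a large residual and is therefore assigned a higher dropout rate.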
