Probabilistic Forecasting for Building Energy Systems using Time-Series Foundation Models

Main: 20 pages, 8 figures, 5 tables; Bibliography: 4 pages; Appendix: 7 pages
Abstract

Decision-making in building energy systems critically depends on the predictive accuracy of the underlying time-series models. In scenarios lacking extensive data from a target building, foundation models (FMs) are a promising technology: they can leverage prior knowledge from vast and diverse pre-training datasets to construct accurate probabilistic predictors for use in decision-making tools. This paper investigates the applicability and fine-tuning strategies of time-series foundation models (TSFMs) for building energy forecasting. We analyze both full fine-tuning and parameter-efficient fine-tuning, particularly low-rank adaptation (LoRA), using real-world data from a commercial net-zero energy building covering signals such as room occupancy, carbon emissions, plug loads, and HVAC energy consumption. Our analysis reveals that the zero-shot predictive performance of TSFMs is generally suboptimal. To address this shortcoming, we demonstrate that either full or parameter-efficient fine-tuning significantly enhances forecasting accuracy, even with limited historical data. Notably, fine-tuning with LoRA substantially reduces computational costs without sacrificing accuracy. Furthermore, fine-tuned TSFMs consistently outperform state-of-the-art deep forecasting models (e.g., temporal fusion transformers) in accuracy, robustness, and generalization across varying building zones and seasonal conditions. These results underline the efficacy of TSFMs for practical, data-constrained building energy management systems, enabling improved decision-making in pursuit of energy efficiency and sustainability.
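To make the LoRA idea mentioned above concrete: LoRA freezes each pre-trained weight matrix W and learns only a low-rank update ΔW = BA, so the number of trainable parameters drops from d_out·d_in to r·(d_in + d_out). The sketch below (a generic illustration with NumPy, not the authors' implementation; all sizes are hypothetical) shows the adapted forward pass and the parameter-count saving:

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16, r=4):
    """Forward pass through a LoRA-adapted linear layer.

    W : frozen pre-trained weight, shape (d_out, d_in)
    A : trainable down-projection, shape (r, d_in)
    B : trainable up-projection, shape (d_out, r)
    During fine-tuning, only A and B receive gradient updates.
    """
    scale = alpha / r
    return x @ W.T + scale * (x @ A.T @ B.T)

# Illustrative sizes (hypothetical, not from the paper)
d_out, d_in, r = 256, 256, 4
rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))     # frozen pre-trained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable
B = np.zeros((d_out, r))               # zero init: adapted model starts at the pre-trained one

x = rng.normal(size=(8, d_in))
y = lora_forward(x, W, A, B)

full_params = d_out * d_in        # parameters updated by full fine-tuning
lora_params = r * (d_in + d_out)  # parameters updated by LoRA (~3% here)
```

Initializing B to zero is the standard choice: the adapter contributes nothing at the start of fine-tuning, so training begins exactly at the pre-trained model.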

@article{park2025_2506.00630,
  title={Probabilistic Forecasting for Building Energy Systems using Time-Series Foundation Models},
  author={Young Jin Park and Francois Germain and Jing Liu and Ye Wang and Toshiaki Koike-Akino and Gordon Wichern and Navid Azizan and Christopher R. Laughman and Ankush Chakrabarty},
  journal={arXiv preprint arXiv:2506.00630},
  year={2025}
}