A general white noise test based on kernel lag-window estimates of the spectral density operator

We propose a general white noise test for functional time series based on estimating a distance between the spectral density operator of a weakly stationary time series and the constant spectral density operator of an uncorrelated time series. The estimator that we propose is based on a kernel lag-window type estimator of the spectral density operator. When the observed time series is a strong white noise in a real separable Hilbert space, we show that the asymptotic distribution of the test statistic is standard normal, and we further show that the test statistic diverges for general serially correlated time series. These results recover as special cases those of Hong (1996) and Horváth et al. (2013). In order to implement the test, we propose and study a number of kernel and bandwidth choices, including a new data-adaptive bandwidth, as well as data-adaptive power transformations of the test statistic that improve the normal approximation in finite samples. A simulation study demonstrates that the proposed method has good size and improved power compared with other methods available in the literature, while also offering a light computational burden.
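To illustrate the flavor of such kernel lag-window test statistics, the following is a minimal sketch of the scalar special case attributed above to Hong (1996): a Bartlett-kernel-weighted sum of squared sample autocorrelations, centered and scaled so that it is approximately standard normal under a strong white noise null. The function and variable names are ours, the kernel and bandwidth are illustrative choices, and this sketch does not implement the functional (operator-valued) statistic or the data-adaptive bandwidth proposed in the paper.

```python
import numpy as np

def bartlett_kernel(u):
    """Bartlett (triangular) kernel: k(u) = 1 - |u| for |u| <= 1, else 0."""
    u = np.abs(u)
    return np.where(u <= 1.0, 1.0 - u, 0.0)

def kernel_lag_window_stat(x, bandwidth):
    """
    Kernel lag-window white noise test statistic for a scalar series
    (the Hong 1996 special case).  Under a strong white noise null the
    statistic is asymptotically standard normal; large values reject.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    gamma0 = np.dot(x, x) / n                      # lag-0 sample autocovariance
    lags = np.arange(1, n)
    # sample autocorrelations rho_hat(j), j = 1, ..., n-1
    rho = np.array([np.dot(x[j:], x[:-j]) for j in lags]) / (n * gamma0)
    k = bartlett_kernel(lags / bandwidth)          # kernel lag weights
    # centering and scaling constants
    C = np.sum((1 - lags / n) * k**2)
    D = np.sum((1 - lags[:-1] / n) * (1 - (lags[:-1] + 1) / n) * k[:-1]**4)
    return (n * np.sum(k**2 * rho**2) - C) / np.sqrt(2 * D)

# Example: for iid noise the statistic should behave like a N(0, 1) draw.
rng = np.random.default_rng(0)
stat = kernel_lag_window_stat(rng.standard_normal(500), bandwidth=10)
print(stat)  # compare with a one-sided standard normal critical value, e.g. 1.645
```

The functional version studied in the paper replaces the squared autocorrelations with suitably normalized Hilbert-Schmidt norms of sample autocovariance operators, but the centering, scaling, and asymptotic normal approximation follow the same pattern.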