The noise level in linear regression with dependent data
Neural Information Processing Systems (NeurIPS), 2023
Main: 12 pages
Bibliography: 3 pages
Appendix: 7 pages
Abstract
We derive upper bounds for random design linear regression with dependent (β-mixing) data absent any realizability assumptions. In contrast to the strictly realizable martingale noise regime, no sharp instance-optimal non-asymptotics are available in the literature. Up to constant factors, our analysis correctly recovers the variance term predicted by the Central Limit Theorem -- the noise level of the problem -- and thus exhibits graceful degradation as we introduce misspecification. Past a burn-in, our result is sharp in the moderate deviations regime, and in particular does not inflate the leading-order term by mixing-time factors.
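As an illustrative sketch (not the paper's exact statement), one standard way to make "the noise level" precise is via the asymptotic sandwich covariance of ordinary least squares. Under misspecification and dependence, the middle matrix becomes a long-run covariance of the residual-weighted covariates; all symbols below are assumptions introduced for illustration:

```latex
% Illustrative setup: covariates x_t \in \mathbb{R}^d, responses y_t \in \mathbb{R},
% second-moment matrix \Sigma = \mathbb{E}[x_t x_t^\top],
% best linear predictor \theta_\star = \Sigma^{-1}\,\mathbb{E}[x_t y_t],
% residuals \varepsilon_t = y_t - \langle \theta_\star, x_t \rangle
% (not assumed to be martingale noise, since the model may be misspecified).
\[
  \sqrt{n}\,\bigl(\hat\theta_n - \theta_\star\bigr)
  \;\xrightarrow{d}\;
  \mathcal{N}\!\bigl(0,\; \Sigma^{-1} M \,\Sigma^{-1}\bigr),
  \qquad
  M \;=\; \sum_{k=-\infty}^{\infty}
  \mathbb{E}\!\bigl[\varepsilon_0\,\varepsilon_k\; x_0 x_k^\top\bigr].
\]
```

Here $\Sigma^{-1} M \Sigma^{-1}$ plays the role of the CLT-predicted variance that a sharp non-asymptotic bound should recover without inflation by mixing-time factors in the leading-order term.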
