Simultaneous estimation of the mean and the variance in heteroscedastic Gaussian regression

Abstract

Let $Y$ be a Gaussian vector of $\mathbb{R}^n$ with mean $s$ and diagonal covariance matrix $\Gamma$. Our aim is to estimate both $s$ and the entries $\sigma_i=\Gamma_{i,i}$, for $i=1,\dots,n$, on the basis of the observation of two independent copies of $Y$. Our approach is free of any prior assumption on $s$ but requires that we know some upper bound $\gamma$ on the ratio $\max_i\sigma_i/\min_i\sigma_i$. For example, the choice $\gamma=1$ corresponds to the homoscedastic case where the components of $Y$ are assumed to have a common (unknown) variance. Conversely, the choice $\gamma>1$ corresponds to the heteroscedastic case where the variances of the components of $Y$ are allowed to vary within some range. Our estimation strategy is based on model selection. We consider a family $\{S_m\times\Sigma_m,\ m\in\mathcal{M}\}$ of parameter sets where $S_m$ and $\Sigma_m$ are linear spaces. To each $m\in\mathcal{M}$, we associate a pair of estimators $(\hat{s}_m,\hat{\sigma}_m)$ of $(s,\sigma)$ with values in $S_m\times\Sigma_m$. We then design a model selection procedure for choosing some $\hat{m}$ among $\mathcal{M}$ in such a way that the Kullback risk of $(\hat{s}_{\hat{m}},\hat{\sigma}_{\hat{m}})$ is as close as possible to the minimum of the Kullback risks over the family of estimators $\{(\hat{s}_m,\hat{\sigma}_m),\ m\in\mathcal{M}\}$. We then derive uniform rates of convergence for the estimator $(\hat{s}_{\hat{m}},\hat{\sigma}_{\hat{m}})$ over Hölderian balls. Finally, we carry out a simulation study to illustrate the performance of our estimators in practice.
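As a point of reference for the Kullback risk mentioned above: the abstract does not spell out the exact risk, but it is built from the Kullback-Leibler divergence between Gaussian distributions, which for two distributions on $\mathbb{R}^n$ with means $s,t$ and diagonal covariances $\mathrm{diag}(\sigma_1,\dots,\sigma_n)$, $\mathrm{diag}(\tau_1,\dots,\tau_n)$ takes the standard closed form

$$\mathcal{K}\big(\mathcal{N}(s,\mathrm{diag}(\sigma)),\,\mathcal{N}(t,\mathrm{diag}(\tau))\big) \;=\; \frac{1}{2}\sum_{i=1}^{n}\left[\log\frac{\tau_i}{\sigma_i} + \frac{\sigma_i}{\tau_i} - 1 + \frac{(s_i - t_i)^2}{\tau_i}\right].$$

The two-copies design also admits simple unbiased coordinatewise estimators, which may help fix ideas. The sketch below is a hypothetical illustration of the setup only; the choice of $s$, $\sigma$, and the naive estimators are ours, not the paper's model-based estimators with values in $S_m\times\Sigma_m$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance of the model: Y ~ N(s, Gamma) with Gamma diagonal.
n = 200
s = np.sin(2 * np.pi * np.arange(n) / n)    # unknown mean vector s
sigma = 0.5 + 0.5 * np.linspace(0, 1, n)    # variances in [0.5, 1], so gamma = 2

# Two independent copies of Y, as required by the estimation strategy.
Y1 = s + rng.normal(size=n) * np.sqrt(sigma)
Y2 = s + rng.normal(size=n) * np.sqrt(sigma)

# Naive coordinatewise unbiased estimators built from the two copies:
# E[(Y1 + Y2)/2] = s and E[(Y1 - Y2)^2 / 2] = sigma.
s_hat = (Y1 + Y2) / 2
sigma_hat = (Y1 - Y2) ** 2 / 2
```

The paper's estimators $(\hat{s}_m,\hat{\sigma}_m)$ instead take values in the model spaces $S_m\times\Sigma_m$ and are compared through their Kullback risks, so the sketch above should only be read as the raw ingredients of the two-copies design.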
