Simultaneous estimation of the mean and the variance in heteroscedastic Gaussian regression

Let $Y$ be a Gaussian vector of $\mathbb{R}^n$ with mean $\mu$ and diagonal covariance matrix $\Gamma$. Our aim is to estimate both $\mu$ and the entries $\sigma_i = \Gamma_{i,i}$, for $i = 1, \dots, n$, on the basis of the observation of two independent copies of $Y$. Our approach is free of any prior assumption on $\mu$ but requires that we know some upper bound $\gamma$ on the ratio $\max_i \sigma_i / \min_i \sigma_i$. For example, the choice $\gamma = 1$ corresponds to the homoscedastic case, where the components of $Y$ are assumed to have a common (unknown) variance. Conversely, the choice $\gamma > 1$ corresponds to the heteroscedastic case, where the variances of the components of $Y$ are allowed to vary within some range. Our estimation strategy is based on model selection. We consider a family of parameter sets $\{S_m \times \Sigma_m\}_{m \in \mathcal{M}}$, where $S_m$ and $\Sigma_m$ are linear spaces. To each $m \in \mathcal{M}$, we associate a pair of estimators $(\hat{\mu}_m, \hat{\sigma}_m)$ of $(\mu, \sigma)$ with values in $S_m \times \Sigma_m$. We then design a model selection procedure that selects some $\hat{m}$ among $\mathcal{M}$ in such a way that the Kullback risk of $(\hat{\mu}_{\hat{m}}, \hat{\sigma}_{\hat{m}})$ is as close as possible to the minimum of the Kullback risks over the family of estimators $\{(\hat{\mu}_m, \hat{\sigma}_m)\}_{m \in \mathcal{M}}$. We then derive uniform rates of convergence for the selected estimator over H\"{o}lderian balls. Finally, we carry out a simulation study in order to illustrate the performance of our estimators in practice.
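To make the two-copy setup concrete, the following is a minimal Python sketch of the kind of procedure the abstract describes: for each model $m$, a pair of estimators is built from the two copies of $Y$ by projecting natural mean and variance statistics onto $S_m$ and $\Sigma_m$, and a penalized likelihood criterion is minimized over the collection. The functions `fit_model` and `kullback_like_criterion`, the log-scale projection for the variances, and the penalty `kappa * dim_m / n` are all illustrative assumptions; they are not the estimators or the penalty constructed in the paper.

```python
import numpy as np

def fit_model(y1, y2, basis_mean, basis_logvar):
    """Illustrative projection estimators for one model (S_m, Sigma_m).

    y1, y2: two independent copies of the Gaussian vector Y (length n).
    basis_mean: (n, d_m) design matrix spanning the linear space S_m.
    basis_logvar: (n, e_m) design matrix spanning Sigma_m (used here on the
        log scale for the variances, purely for illustration).
    """
    # Natural two-sample statistics: (Y1 + Y2)/2 targets the mean,
    # (Y1 - Y2)^2 / 2 is an unbiased estimate of each variance entry.
    mean_stat = (y1 + y2) / 2.0
    var_stat = (y1 - y2) ** 2 / 2.0
    # Least-squares projections onto the model spaces.
    coef_mu, *_ = np.linalg.lstsq(basis_mean, mean_stat, rcond=None)
    mu_hat = basis_mean @ coef_mu
    coef_lv, *_ = np.linalg.lstsq(basis_logvar, np.log(var_stat + 1e-12), rcond=None)
    sigma_hat = np.exp(basis_logvar @ coef_lv)
    return mu_hat, sigma_hat

def kullback_like_criterion(y1, y2, mu_hat, sigma_hat, dim_m, kappa=2.0):
    """Penalized Gaussian negative log-likelihood as a stand-in criterion.

    The penalty kappa * dim_m / n is a placeholder; the paper derives its own
    penalty tailored to the Kullback risk.
    """
    n = y1.shape[0]
    nll = 0.0
    for y in (y1, y2):
        nll += 0.5 * np.sum(np.log(sigma_hat) + (y - mu_hat) ** 2 / sigma_hat)
    return nll / (2 * n) + kappa * dim_m / n

# Usage: evaluate every model m in the collection and keep the minimizer m_hat.
# models = {m: (basis_mean_m, basis_logvar_m) for m in collection}
# crits = {m: kullback_like_criterion(y1, y2,
#                                     *fit_model(y1, y2, *bases),
#                                     dim_m=sum(b.shape[1] for b in bases))
#          for m, bases in models.items()}
# m_hat = min(crits, key=crits.get)
```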