
Estimation of variance components, heritability and the ridge penalty in high-dimensional generalized linear models

Abstract

For high-dimensional linear regression models, we review and compare several estimators of the variances $\tau^2$ and $\sigma^2$ of the random slopes and errors, respectively. These variances relate directly to the ridge regression penalty $\lambda$ and the heritability index $h^2$, often used in genetics. Direct and indirect estimators of these, either based on cross-validation (CV) or maximum marginal likelihood (MML), are also discussed. The comparisons include several cases of the covariate matrix $\mathbf{X}_{n \times p}$, with $p \gg n$, such as multi-collinear covariates and data-derived ones. In addition, we study robustness against departures from the model such as sparse instead of dense effects and non-Gaussian errors. An example on weight gain data with genomic covariates confirms the good performance of MML compared to CV. Several extensions are presented. First, to the high-dimensional linear mixed effects model, with REML as an alternative to MML. Second, to the conjugate Bayesian setting, which proves to be a good alternative. Third, and most prominently, to generalized linear models for which we derive a computationally efficient MML estimator by rewriting the marginal likelihood as an $n$-dimensional integral. For Poisson and Binomial ridge regression, we demonstrate the superior accuracy of the resulting MML estimator of $\lambda$ as compared to CV. Software is provided to enable reproduction of all results presented here.
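To make the link between the variance components and the tuning quantities concrete in the Gaussian linear case, the sketch below (not the authors' software) maximizes the marginal likelihood of $y \sim N(0, \tau^2 \mathbf{X}\mathbf{X}^\top + \sigma^2 I_n)$ over $(\tau^2, \sigma^2)$, then converts the estimates to a ridge penalty via $\lambda = \sigma^2/\tau^2$ and to a heritability-type index via $h^2 = p\tau^2/(p\tau^2 + \sigma^2)$ (the latter assuming standardized covariates). The function name `mml_variances` and the simulation settings are illustrative assumptions, not part of the paper.

```python
# Minimal sketch of marginal maximum likelihood (MML) estimation of
# (tau^2, sigma^2) in the model y = X beta + eps, beta_j ~ N(0, tau^2),
# eps_i ~ N(0, sigma^2), using the n-dimensional marginal
# y ~ N(0, tau^2 X X' + sigma^2 I_n).
import numpy as np
from scipy.optimize import minimize

def mml_variances(X, y):
    """Return (tau2_hat, sigma2_hat) maximizing the Gaussian marginal likelihood."""
    # Eigendecomposition of the n x n matrix XX' makes each likelihood
    # evaluation cheap: XX' = U diag(d) U'.
    d, U = np.linalg.eigh(X @ X.T)
    z2 = (U.T @ y) ** 2                      # squared rotated responses

    def neg_loglik(theta):                   # theta = (log tau^2, log sigma^2)
        tau2, sigma2 = np.exp(theta)
        v = tau2 * d + sigma2                # eigenvalues of the marginal covariance
        return 0.5 * np.sum(np.log(v) + z2 / v)

    res = minimize(neg_loglik, x0=np.log([0.1, 1.0]), method="Nelder-Mead")
    return tuple(np.exp(res.x))

# Illustrative simulation with p >> n and dense effects (assumed settings).
rng = np.random.default_rng(1)
n, p, tau2, sigma2 = 100, 2000, 0.01, 1.0
X = rng.standard_normal((n, p))
y = X @ rng.normal(0.0, np.sqrt(tau2), p) + rng.normal(0.0, np.sqrt(sigma2), n)

tau2_hat, sigma2_hat = mml_variances(X, y)
lam_hat = sigma2_hat / tau2_hat                        # implied ridge penalty
h2_hat = p * tau2_hat / (p * tau2_hat + sigma2_hat)    # heritability-type index
print(tau2_hat, sigma2_hat, lam_hat, h2_hat)
```

The same marginal-likelihood idea underlies the paper's extension to generalized linear models, where the Gaussian marginal above is replaced by an $n$-dimensional integral that must be approximated numerically.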
