
$\lambda$-Regularized A-Optimal Design and its Approximation by $\lambda$-Regularized Proportional Volume Sampling

Abstract

In this work, we study the $\lambda$-regularized $A$-optimal design problem and introduce the $\lambda$-regularized proportional volume sampling algorithm, generalized from [Nikolov, Singh, and Tantipongpipat, 2019], with an approximation guarantee that extends the previous work. In this problem, we are given vectors $v_1,\ldots,v_n\in\mathbb{R}^d$ in $d$ dimensions, a budget $k\leq n$, and a regularizer parameter $\lambda\geq 0$, and the goal is to find a subset $S\subseteq[n]$ of size $k$ that minimizes the trace of $\left(\sum_{i\in S}v_iv_i^\top + \lambda I_d\right)^{-1}$, where $I_d$ is the $d\times d$ identity matrix. The problem is motivated by optimal design in ridge regression, where one seeks to minimize the expected squared error of the ridge regression predictor with respect to the true coefficient vector in the underlying linear model. We introduce $\lambda$-regularized proportional volume sampling and give a polynomial-time implementation to solve this problem. We show that it achieves a $\left(1+\frac{\epsilon}{\sqrt{1+\lambda'}}\right)$-approximation for $k=\Omega\left(\frac{d}{\epsilon}+\frac{\log(1/\epsilon)}{\epsilon^2}\right)$, where $\lambda'$ is proportional to $\lambda$, extending the previous bound of [Nikolov, Singh, and Tantipongpipat, 2019] to the case $\lambda>0$ and obtaining asymptotic optimality as $\lambda\rightarrow\infty$.
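To make the objective concrete, the following is a minimal sketch of the $\lambda$-regularized $A$-optimality criterion, together with a brute-force search over all size-$k$ subsets for small instances. This is purely illustrative: the function names are ours, and the exhaustive search is exponential in $n$ and is not the polynomial-time sampling algorithm studied in the paper.

```python
import numpy as np
from itertools import combinations

def a_optimal_objective(V, S, lam):
    """Trace of (sum_{i in S} v_i v_i^T + lam * I_d)^{-1}.

    V   -- (n, d) array whose rows are the vectors v_1, ..., v_n
    S   -- iterable of row indices (the chosen subset)
    lam -- regularizer parameter lambda >= 0
    """
    d = V.shape[1]
    M = V[list(S)].T @ V[list(S)] + lam * np.eye(d)
    return np.trace(np.linalg.inv(M))

def brute_force_design(V, k, lam):
    """Exhaustively find a size-k subset minimizing the objective.

    Exponential in n; usable only as a reference on tiny inputs.
    """
    n = V.shape[0]
    return min(combinations(range(n), k),
               key=lambda S: a_optimal_objective(V, S, lam))

# Tiny example: n = 6 vectors in d = 3 dimensions, budget k = 4.
rng = np.random.default_rng(0)
V = rng.standard_normal((6, 3))
best = brute_force_design(V, k=4, lam=0.5)
print(best, a_optimal_objective(V, best, 0.5))
```

Note that with $\lambda > 0$ the matrix $\sum_{i\in S}v_iv_i^\top + \lambda I_d$ is always invertible, so the objective is finite for every subset, unlike the unregularized case $\lambda = 0$.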
