V-fold cross-validation and V-fold penalization in least-squares density estimation

22 October 2012
Sylvain Arlot
M. Lerasle
arXiv: 1210.5830
Abstract

This paper studies V-fold cross-validation for model selection in least-squares density estimation. The goal is to provide theoretical grounds for choosing V in order to minimize the least-squares risk of the selected estimator. We first prove a non-asymptotic oracle inequality for V-fold cross-validation and its bias-corrected version (V-fold penalization), with an upper bound that decreases as a function of V. In particular, this result implies that V-fold penalization is asymptotically optimal. We then compute the variance of V-fold cross-validation and related criteria, as well as the variance of key quantities for model selection performance. We show that these variances depend on V like 1 + 1/(V-1), at least in some particular cases, suggesting that performance improves considerably from V = 2 to V = 5 or 10 and is then almost constant. Overall, this explains the common advice to take V = 10, at least in our setting and when computational power is limited, as confirmed by simulation experiments.
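
The abstract compresses the whole procedure into a few sentences, so a concrete illustration may help. Below is a minimal sketch in Python of V-fold cross-validation for selecting among histogram density estimators via the least-squares contrast, the setting the paper studies. The histogram family on [0, 1], the toy Beta-distributed sample, and all function names are illustrative assumptions, not from the paper; the criterion estimates the least-squares risk only up to the additive constant ||s||^2, which does not affect which model is selected.

```python
import numpy as np

def histogram_density(train, m):
    """Least-squares histogram estimator on [0, 1] with m equal-width bins.
    Returns the estimated density value (bin height) for each bin.
    (Illustrative choice of model family, not the paper's only setting.)"""
    h = 1.0 / m
    counts, _ = np.histogram(train, bins=m, range=(0.0, 1.0))
    return counts / (len(train) * h)

def vfold_criterion(X, m, V, rng):
    """V-fold CV estimate of the least-squares risk of the m-bin histogram,
    up to the constant ||s||^2: average over folds of
    ||s_hat^{(-k)}||^2 - 2 * P_fold(s_hat^{(-k)})."""
    n = len(X)
    h = 1.0 / m
    folds = np.array_split(rng.permutation(n), V)
    crits = []
    for k in range(V):
        val = X[folds[k]]                                  # held-out fold
        train = X[np.setdiff1d(np.arange(n), folds[k])]    # remaining data
        heights = histogram_density(train, m)
        norm2 = h * np.sum(heights ** 2)                   # ||s_hat||^2
        idx = np.clip((val * m).astype(int), 0, m - 1)     # bin of each point
        crits.append(norm2 - 2.0 * heights[idx].mean())
    return np.mean(crits)

rng = np.random.default_rng(0)
X = rng.beta(2, 5, size=500)      # toy sample supported on [0, 1]
models = [2, 4, 8, 16, 32, 64]    # candidate numbers of bins
V = 10                            # the commonly advised choice
scores = {m: vfold_criterion(X, m, V, rng) for m in models}
m_hat = min(scores, key=scores.get)
print("selected number of bins:", m_hat)
```

With the 1 + 1/(V-1) variance scaling from the abstract in mind, rerunning the snippet with V = 2, 5, and 10 is an easy way to watch the selection stabilize, while larger V mainly adds computation.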
