arXiv:0906.0652
Transductive versions of the LASSO and the Dantzig Selector

3 June 2009
Pierre Alquier
Mohamed Hebiri
Abstract

We consider the linear regression problem, where the number $p$ of covariates is possibly larger than the number $n$ of observations $(x_i, y_i)_{1 \leq i \leq n}$, under sparsity assumptions. On the one hand, several methods have been successfully proposed to perform this task, for example the LASSO or the Dantzig Selector. On the other hand, consider new design points $(x_i)_{n+1 \leq i \leq m}$. If one wants to estimate the corresponding $y_i$'s, one should think of a specific estimator devoted to this task, referred to by Vapnik as a "transductive" estimator. This estimator may differ from an estimator designed for the more general task of "estimation on the whole domain". In this work, we propose a generalized version of both the LASSO and the Dantzig Selector, based on geometrical remarks about the LASSO in previous works. The "usual" LASSO and Dantzig Selector, as well as new estimators interpreted as transductive versions of the LASSO, appear as special cases. These estimators are interesting at least from a theoretical point of view: we can give theoretical guarantees for them under hypotheses that are relaxed versions of those required in the papers on the "usual" LASSO. These estimators can also be computed efficiently, with results comparable to those of the LASSO.
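To make the setting concrete, here is a minimal sketch of the "usual" (inductive) LASSO that the abstract takes as its starting point, fit by coordinate descent on a sparse problem with $p > n$ and then used to predict at new design points. This is not the paper's transductive estimator; the penalty level `lam`, the problem dimensions, and the data-generating model are illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Coordinate descent for min_b (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y - X @ b                       # running residual
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * b[j]         # remove coordinate j from the fit
            rho = X[:, j] @ r / n       # partial correlation with residual
            b[j] = soft_threshold(rho, lam) / col_sq[j] if col_sq[j] > 0 else 0.0
            r -= X[:, j] * b[j]         # put updated coordinate back
    return b

rng = np.random.default_rng(0)
n, p = 50, 200                          # p > n: more covariates than observations
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0                          # sparse true coefficient vector
y = X @ beta + 0.1 * rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=0.2)

# New, unlabeled design points (x_i)_{n+1 <= i <= m}: the inductive LASSO
# simply plugs them into the estimated linear predictor.
X_new = rng.standard_normal((10, p))
y_new_hat = X_new @ b_hat
```

The transductive estimators of the paper would instead take `X_new` into account when estimating the coefficients; the sketch above only illustrates the baseline they generalize.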
