arXiv:1311.2236

Fast Distribution To Real Regression

10 November 2013
Junier B. Oliva
Willie Neiswanger
Barnabás Póczós
J. Schneider
Eric Xing
Abstract

We study the problem of distribution-to-real regression, where one aims to regress a mapping $f$ that takes a distribution covariate $P \in \mathcal{I}$ (for a nonparametric family of distributions $\mathcal{I}$) and outputs a real-valued response $Y = f(P) + \epsilon$. This setting was recently studied, and a "Kernel-Kernel" estimator was introduced and shown to have a polynomial rate of convergence. However, evaluating a new prediction with the Kernel-Kernel estimator scales as $\Omega(N)$ in the number of training instances $N$. This creates a difficult situation: a large amount of data may be necessary for low estimation risk, yet the computational cost of prediction becomes infeasible when the data set is too large. To this end, we propose the Double-Basis estimator, which aims to alleviate this big-data problem in two ways: first, the Double-Basis estimator is shown to have a computational complexity for evaluating new predictions after training that is independent of the number of instances $N$; second, the Double-Basis estimator is shown to have a fast rate of convergence for a general class of mappings $f \in \mathcal{F}$.
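
The abstract does not spell out the Double-Basis construction, but the general idea it describes (summarize each input distribution by a fixed number of basis coefficients, then regress with a fixed-size feature map so prediction cost does not grow with $N$) can be illustrated with a minimal sketch. The cosine basis, the random Fourier features, and all sizes (M, D, the bandwidth, the ridge penalty, the synthetic Beta data) below are assumptions chosen for the example, not the paper's exact estimator.

```python
# Illustrative sketch (assumptions, not the paper's exact Double-Basis estimator):
# distribution-to-real regression where each covariate is a sample drawn from an
# unknown density on [0, 1]. Each input is summarized by estimated coefficients in
# a fixed orthonormal cosine basis, and regression uses a fixed number of random
# Fourier features, so prediction cost does not grow with the training size N.
import numpy as np

rng = np.random.default_rng(0)

M = 10      # cosine-basis coefficients per input distribution (assumed)
D = 200     # random Fourier features (assumed)
lam = 1e-2  # ridge regularization (assumed)

def basis_coeffs(sample, M=M):
    """Estimate cosine-basis coefficients of the sample's density on [0, 1]:
    phi_0(x) = 1, phi_k(x) = sqrt(2) cos(pi k x); hat{a}_k = mean_i phi_k(x_i)."""
    k = np.arange(1, M)
    feats = np.sqrt(2.0) * np.cos(np.pi * k[None, :] * sample[:, None])
    return np.concatenate(([1.0], feats.mean(axis=0)))

# Random Fourier features approximating an RBF kernel on coefficient vectors.
W = rng.normal(scale=1.0, size=(M, D))        # random frequencies (unit bandwidth, assumed)
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(a):
    return np.sqrt(2.0 / D) * np.cos(a @ W + b)

# Synthetic training data: each P_i is a Beta(alpha_i, 2) density observed via a sample,
# and the response depends only on the underlying distribution.
N = 500        # training distributions
n_per = 200    # points observed from each distribution
alphas = rng.uniform(1.0, 5.0, size=N)
samples = [rng.beta(a, 2.0, size=n_per) for a in alphas]
y = alphas + 0.1 * rng.normal(size=N)

Z = np.stack([rff(basis_coeffs(s)) for s in samples])        # N x D feature matrix
beta = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)   # ridge regression weights

def predict(sample):
    """Prediction touches only M coefficients and D features: O(M * D), independent of N."""
    return rff(basis_coeffs(sample)) @ beta

test = rng.beta(3.0, 2.0, size=n_per)
print(predict(test))   # should land roughly near the true shape parameter alpha = 3
```

The contrast with a kernel-style estimator is the point of the sketch: a prediction here only multiplies a fixed D-dimensional feature vector by fixed weights, whereas an estimator that sums kernel evaluations against all training distributions pays at least $\Omega(N)$ per prediction.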
