Importance Sampling via Local Sensitivity

4 November 2019
Anant Raj
Cameron Musco
Lester W. Mackey
arXiv:1911.01575
Abstract

Given a loss function $F:\mathcal{X} \rightarrow \mathbb{R}^+$ that can be written as the sum of losses over a large set of inputs $a_1, \ldots, a_n$, it is often desirable to approximate $F$ by subsampling the input points. Strong theoretical guarantees require taking into account the importance of each point, measured by how much its individual loss contributes to $F(x)$. Maximizing this importance over all $x \in \mathcal{X}$ yields the \emph{sensitivity score} of $a_i$. Sampling with probabilities proportional to these scores gives strong guarantees, allowing one to approximately minimize $F$ using just the subsampled points. Unfortunately, sensitivity sampling is difficult to apply since (1) it is unclear how to efficiently compute the sensitivity scores and (2) the sample size required is often impractically large. To overcome both obstacles we introduce \emph{local sensitivity}, which measures data point importance in a ball around some center $x_0$. We show that the local sensitivity can be efficiently estimated using the \emph{leverage scores} of a quadratic approximation to $F$ and that the sample size required to approximate $F$ around $x_0$ can be bounded. We propose employing local sensitivity sampling in an iterative optimization method and analyze its convergence when $F$ is smooth and convex.
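For intuition, with $F(x) = \sum_{i=1}^n f_i(x)$ the sensitivity score of $a_i$ is $s_i = \sup_{x \in \mathcal{X}} f_i(x)/F(x)$. For a least-squares loss $f_i(x) = (a_i^\top x - b_i)^2$ the quadratic approximation is exact, and the relevant scores are closely tied to the classical leverage scores of the design matrix. The sketch below is a minimal illustration of leverage-score importance sampling in that special case, not the paper's algorithm; the helper names (`leverage_scores`, `sensitivity_subsample`) are hypothetical.

```python
import numpy as np

def leverage_scores(A):
    """Leverage score of row i: a_i^T (A^T A)^+ a_i.

    Computed via a thin SVD: the scores are the squared row
    norms of the left singular vectors U.
    """
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return np.sum(U**2, axis=1)

def sensitivity_subsample(A, b, m, seed=None):
    """Sample m rows with probability proportional to leverage scores,
    returning importance weights that keep the subsampled
    least-squares loss unbiased."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    scores = leverage_scores(A)
    p = scores / scores.sum()
    idx = rng.choice(n, size=m, replace=True, p=p)
    w = 1.0 / (m * p[idx])          # importance weights
    return A[idx], b[idx], w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((10_000, 20))
    b = rng.standard_normal(10_000)
    A_s, b_s, w = sensitivity_subsample(A, b, m=500, seed=1)
    x = rng.standard_normal(20)
    full = np.sum((A @ x - b) ** 2)
    approx = np.sum(w * (A_s @ x - b_s) ** 2)
    print(f"full loss {full:.1f}  vs  subsampled estimate {approx:.1f}")
```

Because each sampled loss is reweighted by $1/(m\,p_i)$, the subsampled objective is an unbiased estimator of $F(x)$ at any fixed $x$; the paper's local variant instead measures importance only in a ball around a center $x_0$, which keeps both the scores and the required sample size small.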
