
arXiv:1905.05376
Dimensionality Reduction for Tukey Regression

14 May 2019
K. Clarkson
Ruosong Wang
David P. Woodruff
Abstract

We give the first dimensionality reduction methods for the overconstrained Tukey regression problem. The Tukey loss function $\|y\|_M = \sum_i M(y_i)$ has $M(y_i) \approx |y_i|^p$ for residual errors $y_i$ smaller than a prescribed threshold $\tau$, but $M(y_i)$ becomes constant for errors $|y_i| > \tau$. Our results depend on a new structural result, proven constructively, showing that for any $d$-dimensional subspace $L \subset \mathbb{R}^n$, there is a fixed bounded-size subset of coordinates containing, for every $y \in L$, all the large coordinates, with respect to the Tukey loss function, of $y$. Our methods reduce a given Tukey regression problem to a smaller weighted version, whose solution is a provably good approximate solution to the original problem. Our reductions are fast, simple and easy to implement, and we give empirical results demonstrating their practicality, using existing heuristic solvers for the small versions. We also give exponential-time algorithms giving provably good solutions, and hardness results suggesting that a significant speedup in the worst case is unlikely.
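The abstract characterizes the Tukey loss only up to its qualitative shape: $M(y_i) \approx |y_i|^p$ below the threshold $\tau$ and constant above it. As a concrete illustration, here is a minimal NumPy sketch of such a clipped-power loss; the function names and the exact form of $M$ (a hard clip at $\tau^p$, rather than a smooth variant like the Tukey biweight) are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def tukey_loss(y, tau=1.0, p=2):
    """Clipped power loss: |y_i|**p for |y_i| <= tau, constant tau**p above.

    This matches the abstract's description of M up to the choice of
    clipping; the paper's M may be a smoother robust loss.
    """
    y = np.asarray(y, dtype=float)
    return np.where(np.abs(y) <= tau, np.abs(y) ** p, tau ** p)

def tukey_norm(y, tau=1.0, p=2):
    """||y||_M = sum_i M(y_i), summed over all residual coordinates."""
    return float(tukey_loss(y, tau, p).sum())
```

Note how the clipping makes the loss robust: an outlier residual, no matter how large, contributes at most $\tau^p$, so `tukey_norm([0.5, 100.0])` is the same as `tukey_norm([0.5, 2.0])` when $\tau = 1$.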
