Adaptive and Oblivious Randomized Subspace Methods for High-Dimensional Optimization: Sharp Analysis and Lower Bounds

13 December 2020
Jonathan Lacotte, Mert Pilanci

Papers citing "Adaptive and Oblivious Randomized Subspace Methods for High-Dimensional Optimization: Sharp Analysis and Lower Bounds"

5 / 5 papers shown
Deep Sketched Output Kernel Regression for Structured Prediction
T. Ahmad, Junjie Yang, Pierre Laforgue, Florence d'Alché-Buc
13 Jun 2024

Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches
T. Ahmad, Pierre Laforgue, Florence d'Alché-Buc
08 Jun 2022

Distributed Sketching Methods for Privacy Preserving Regression
Burak Bartan, Mert Pilanci
16 Feb 2020

Reverse iterative volume sampling for linear regression
Michal Derezinski, Manfred K. Warmuth
06 Jun 2018

Sharp analysis of low-rank kernel matrix approximations
Francis R. Bach
09 Aug 2012