Faster Convex Lipschitz Regression via 2-block ADMM

2 November 2021
Ali Siahkamari
D. A. E. Acar
Christopher Liao
Kelly Geyer
Venkatesh Saligrama
Brian Kulis
arXiv:2111.01348 [abs · PDF · HTML]
Abstract

The task of approximating an arbitrary convex function arises in several learning problems such as convex regression, learning with a difference of convex (DC) functions, and approximating Bregman divergences. In this paper, we show how a broad class of convex function learning problems can be solved via a 2-block ADMM approach, where updates for each block can be computed in closed form. For the task of convex Lipschitz regression, we establish that our proposed algorithm converges with iteration complexity of $O(n\sqrt{d}/\epsilon)$ for a dataset $X \in \mathbb{R}^{n \times d}$ and $\epsilon > 0$. Combined with the per-iteration computation complexity, our method converges at the rate $O(n^3 d^{1.5}/\epsilon + n^2 d^{2.5}/\epsilon + n d^3/\epsilon)$. This new rate improves on the state-of-the-art rate of $O(n^5 d^2/\epsilon)$ achieved by interior point methods whenever $d = o(n^4)$. Further, we provide similar solvers for DC regression and Bregman divergence learning. Unlike previous approaches, our method is amenable to the use of GPUs. We demonstrate on regression and metric learning experiments that our approach is up to 30 times faster than the existing method and produces results comparable to the state of the art.
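The key structural ingredient named in the abstract is a 2-block ADMM iteration in which both block updates admit closed-form solutions. The paper's specific variable splitting for convex Lipschitz regression is not reproduced on this page, so the NumPy snippet below is only a minimal, hypothetical sketch of that general iteration pattern, applied to a toy problem (minimize $\tfrac12\|x - v\|^2 + \lambda\|z\|_1$ subject to $x = z$) where the $x$-update is a simple averaging step and the $z$-update is soft-thresholding. All names (`admm_toy`, `v`, `lam`, `rho`) and the toy objective are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def admm_toy(v, lam=0.5, rho=1.0, iters=100):
    """Generic 2-block ADMM with closed-form block updates on a toy problem:
        minimize 0.5*||x - v||^2 + lam*||z||_1   subject to   x = z.
    This mirrors only the iteration structure (not the splitting) used in the paper."""
    x = np.zeros_like(v)
    z = np.zeros_like(v)
    u = np.zeros_like(v)  # scaled dual variable for the constraint x = z
    for _ in range(iters):
        # x-update: argmin_x 0.5*||x - v||^2 + (rho/2)*||x - z + u||^2  (closed form)
        x = (v + rho * (z - u)) / (1.0 + rho)
        # z-update: argmin_z lam*||z||_1 + (rho/2)*||x - z + u||^2  (soft-thresholding)
        w = x + u
        z = np.sign(w) * np.maximum(np.abs(w) - lam / rho, 0.0)
        # dual update: gradient ascent on the scaled dual variable
        u = u + x - z
    return x, z

# usage: shrink a small vector toward a sparse estimate
x_hat, z_hat = admm_toy(np.array([3.0, -0.2, 0.7, -2.5]))
```

Because every step above is an elementwise closed-form expression, the same loop structure maps naturally onto GPU array operations, which is the property the abstract highlights for the proposed solvers.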
