ResearchTrend.AI

arXiv:1406.5311

Towards A Deeper Geometric, Analytic and Algorithmic Understanding of Margins

20 June 2014
Aaditya Ramdas
Javier F. Peña
Abstract

Given a matrix A, a linear feasibility problem (of which linear classification is a special case) aims to find a solution to a primal problem w : A^T w > 0 or a certificate for the dual problem, which is a probability distribution p : Ap = 0. Inspired by the continued importance of large-margin classifiers in machine learning, this paper aims to deepen our understanding of a condition measure of A called the margin, which determines the difficulty of both of the above problems. To aid geometric intuition, we establish new characterizations of the margin in terms of relevant balls, cones and hulls. Our main contribution is analytical: we present generalizations of Gordan's theorem, and beautiful variants of Hoffman's theorems, both using margins. We end by shedding some new light on two classical iterative schemes, the Perceptron and the von Neumann (or Gilbert) algorithm, whose convergence rates famously depend on the margin. Our results are relevant for a deeper understanding of margin-based learning and for proving convergence rates of iterative schemes, apart from providing a unifying perspective on this vast topic.
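The primal problem mentioned in the abstract can be illustrated with the classical Perceptron scheme. The sketch below is an illustration assumed for this page, not code from the paper: it applies the standard perceptron update to the columns of A to seek a primal-feasible w with A^T w > 0. By the classical Block–Novikoff analysis, the number of updates is bounded by 1/ρ², where ρ is the margin of A with normalized columns.

```python
import numpy as np

def perceptron_feasibility(A, max_iter=10000):
    # A: d x n matrix whose columns a_j are the (normalized) data points.
    # Seeks w with A^T w > 0; for strictly feasible instances the classical
    # analysis bounds the number of updates by 1/margin^2.
    d, n = A.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        scores = A.T @ w
        j = np.argmin(scores)          # most-violated column
        if scores[j] > 0:
            return w                   # A^T w > 0: primal certificate found
        w += A[:, j]                   # perceptron update on a violated column
    return None                        # no certificate within the budget

# Toy instance: columns clustered around e1, so a feasible w near e1 exists.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 20)) * 0.2
A[0] += 1.0
A /= np.linalg.norm(A, axis=0)         # normalize columns to unit length
w = perceptron_feasibility(A)
```

When the instance is infeasible, Gordan's theorem (whose generalizations the paper studies) guarantees instead a dual certificate: a probability distribution p with Ap = 0; the perceptron above simply exhausts its budget in that case.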
