arXiv:1408.2327v9 (latest)

On the Consistency of Ordinal Regression Methods

11 August 2014
Fabian Pedregosa
Francis R. Bach
Alexandre Gramfort
Abstract

Many of the ordinal regression models that have been proposed in the literature can be seen as methods that minimize a convex surrogate of the zero-one, absolute, or squared loss functions. A key property that allows one to study the statistical implications of such approximations is Fisher consistency. In this paper we characterize the Fisher consistency of a rich family of surrogate loss functions used in the context of ordinal regression, including support vector ordinal regression, ORBoosting, and least absolute deviation. We show that, for a family of surrogate loss functions that subsumes support vector ordinal regression and ORBoosting, consistency can be fully characterized by the derivative of a real-valued function at zero, as happens for convex margin-based surrogates in binary classification. We also derive excess risk bounds for a surrogate of the absolute error that generalize existing risk bounds for binary classification. Finally, our analysis suggests a novel surrogate of the squared error loss. To assess the empirical performance of this surrogate, we benchmark it in terms of cross-validation error on 9 different datasets, where it outperforms competing approaches on 7 out of 9 datasets.
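
To make the family of surrogates concrete, here is a minimal sketch of an all-threshold surrogate of the absolute error, the kind of loss that (with an appropriate margin penalty phi) subsumes support vector ordinal regression and ORBoosting. This is not the authors' code: the function name all_threshold_loss, the logistic choice of phi, and the example thresholds are illustrative assumptions.

import numpy as np

def all_threshold_loss(f, y, theta, phi=lambda u: np.log1p(np.exp(-u))):
    """All-threshold surrogate of the absolute error (illustrative sketch).

    f     : real-valued score for one sample
    y     : ordinal label in {1, ..., k}
    theta : array of k-1 non-decreasing thresholds
    phi   : convex, decreasing margin penalty; logistic here, while the
            hinge max(0, 1 - u) would recover an SVM-style variant.
    """
    k = len(theta) + 1
    loss = 0.0
    for i in range(1, k):           # thresholds theta_1, ..., theta_{k-1}
        if i < y:                   # label above threshold i: want f > theta_i
            loss += phi(f - theta[i - 1])
        else:                       # label at or below threshold i: want f < theta_i
            loss += phi(theta[i - 1] - f)
    return loss

# Example: 5 ordered classes, score near the middle of the scale.
theta = np.array([-1.5, -0.5, 0.5, 1.5])
print(all_threshold_loss(0.2, 3, theta))

As the abstract states for this family, Fisher consistency with the absolute error then hinges on the behaviour of phi at zero, in particular its derivative there.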
