
On the Consistency of Ordinal Regression Methods

11 August 2014
Fabian Pedregosa
Francis R. Bach
Alexandre Gramfort
arXiv:1408.2327v9 (abs | PDF | HTML)
Abstract

Ordinal regression is a common supervised learning problem that shares properties with both regression and classification. Many of the ordinal regression algorithms that have been proposed can be viewed as methods that minimize a convex surrogate of the zero-one, absolute, or squared error. We extend the notion of consistency, previously studied for classification, ranking, and some ordinal regression models, to the general setting of ordinal regression. We study a rich family of such surrogate loss functions and assess their consistency, obtaining both positive and negative results. For arbitrary loss functions that are admissible in the context of ordinal regression, we develop an approach that yields consistent surrogate loss functions. Finally, we illustrate our findings on real-world datasets.
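
To make the setup concrete: the target loss in ordinal regression is typically the absolute error |y − ŷ| over k ordered classes, and one well-known member of the threshold-based surrogate family the abstract alludes to is the all-threshold logistic loss, which is convex in a real-valued score. The sketch below is an illustration only, not code from the paper; the function names, the logistic choice of surrogate, and the decoding rule are assumptions.

```python
# Minimal sketch (not from the paper): the absolute error as target ordinal
# loss, and an all-threshold logistic surrogate that is convex in a score f.
import numpy as np

def absolute_error(y_true, y_pred):
    """Target loss for ordinal regression: |y_true - y_pred|."""
    return np.abs(y_true - y_pred)

def at_logistic_loss(f, y, thresholds):
    """All-threshold logistic surrogate for a real-valued score f.

    For label y in {1, ..., k} and thresholds theta_1 <= ... <= theta_{k-1},
    each threshold j contributes a penalty log(1 + exp(-margin)), where the
    margin asks f to lie below theta_j when j >= y and above it when j < y.
    Each threshold on the wrong side of f adds roughly one unit of loss,
    mirroring the absolute error of the decoded label.
    """
    j = np.arange(1, len(thresholds) + 1)
    sign = np.where(j >= y, 1.0, -1.0)          # side of each threshold f should fall on
    margins = sign * (thresholds - f)
    return np.sum(np.logaddexp(0.0, -margins))  # stable log(1 + exp(-margin))

def predict(f, thresholds):
    """Decode a score into an ordinal label: 1 + number of thresholds below f."""
    return 1 + int(np.sum(f > thresholds))

if __name__ == "__main__":
    thresholds = np.array([-1.0, 0.0, 1.0])     # 4 ordered classes
    y = 3
    for f in (-2.0, 0.5, 2.5):
        y_hat = predict(f, thresholds)
        print(f"f={f:+.1f}  pred={y_hat}  "
              f"abs_err={absolute_error(y, y_hat)}  "
              f"surrogate={at_logistic_loss(f, y, thresholds):.3f}")
```

Running the demo shows the surrogate tracking the absolute error: a score far below the correct class (f = −2.0) crosses two thresholds on the wrong side and incurs a large surrogate value, while a well-placed score (f = 0.5) incurs a small one.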
