Supervised Learning: No Loss No Cry

10 February 2020
Richard Nock
A. Menon
ArXiv (abs) · PDF · HTML

Papers citing "Supervised Learning: No Loss No Cry"

5 papers:

  • Selective Matching Losses -- Not All Scores Are Created Equal
    Gil I. Shamir, Manfred K. Warmuth (04 Jun 2025)
  • How to Boost Any Loss Function
    Richard Nock, Yishay Mansour (02 Jul 2024)
  • Being Properly Improper
    Tyler Sypherd, Richard Nock, Lalitha Sankar [FaML] (18 Jun 2021)
  • All your loss are belong to Bayes
    Christian J. Walder, Richard Nock (08 Jun 2020)
  • A Tunable Loss Function for Robust Classification: Calibration, Landscape, and Generalization
    Tyler Sypherd, Mario Díaz, J. Cava, Gautam Dasarathy, Peter Kairouz, Lalitha Sankar (05 Jun 2019)