Analyzing and Improving Representations with the Soft Nearest Neighbor Loss

5 February 2019
Nicholas Frosst, Nicolas Papernot, Geoffrey E. Hinton
arXiv:1902.01889
Abstract

We explore and expand the soft nearest neighbor loss to measure the entanglement of class manifolds in representation space: i.e., how close pairs of points from the same class are relative to pairs of points from different classes. We demonstrate several use cases of the loss. As an analytical tool, it provides insights into the evolution of class similarity structures during learning. Surprisingly, we find that maximizing the entanglement of representations of different classes in the hidden layers is beneficial for discrimination in the final layer, possibly because it encourages representations to identify class-independent similarity structures. Maximizing the soft nearest neighbor loss in the hidden layers leads not only to improved generalization but also to better-calibrated estimates of uncertainty on outlier data. Data that is not from the training distribution can be recognized by observing that, in the hidden layers, it has fewer than the normal number of neighbors from the predicted class.
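
For concreteness, here is a minimal NumPy sketch of the quantity the abstract describes: for each point in a batch, the negative log of the fraction of its temperature-weighted neighbor mass that comes from its own class, averaged over the batch. The function and argument names are illustrative, not the authors' released code, and the paper additionally treats the temperature as learnable, whereas this sketch fixes it.

```python
import numpy as np

def soft_nearest_neighbor_loss(x, y, temperature=1.0, eps=1e-8):
    """Batched soft nearest neighbor loss (illustrative sketch).

    x: (b, d) array of representations; y: (b,) integer class labels.
    Low values mean same-class points dominate each point's neighborhood
    (low entanglement); the paper maximizes this loss in hidden layers.
    """
    # Pairwise squared Euclidean distances, shape (b, b).
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    # Gaussian neighbor weights at the given temperature; exclude self-pairs.
    sims = np.exp(-sq_dists / temperature)
    np.fill_diagonal(sims, 0.0)
    # Indicator of same-class pairs, shape (b, b).
    same_class = (y[:, None] == y[None, :]).astype(x.dtype)
    # Per-point ratio: same-class neighbor mass over total neighbor mass.
    numer = (sims * same_class).sum(axis=1)
    denom = sims.sum(axis=1)
    return -np.mean(np.log(numer / (denom + eps) + eps))
```

To use entanglement as something to maximize, as the abstract suggests, one would subtract this term (with a positive weight) from the usual cross-entropy objective at each hidden layer; the eps guard handles points with no same-class neighbors in the batch.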
