ResearchTrend.AI
Generalization Analysis for Contrastive Representation Learning under Non-IID Settings

8 May 2025
Nong Minh Hieu
Antoine Ledent
Abstract

Contrastive Representation Learning (CRL) has achieved impressive success in various domains in recent years. Nevertheless, the theoretical understanding of the generalization behavior of CRL is limited. Moreover, to the best of our knowledge, the current literature only analyzes generalization bounds under the assumption that the data tuples used for contrastive learning are independently and identically distributed. In practice, however, we are often limited to a fixed pool of reusable labeled data points, making it inevitable to recycle data across tuples in order to create sufficiently large datasets. The tuple-wise independence condition imposed by previous works is therefore invalidated. In this paper, we provide a generalization analysis for the CRL framework under non-i.i.d. settings that adheres more realistically to practice. Drawing inspiration from the literature on U-statistics, we derive generalization bounds indicating that the required number of samples in each class scales as the logarithm of the covering number of the class of learnable feature representations associated with that class. We then apply our main results to derive excess risk bounds for common function classes such as linear maps and neural networks.
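The non-i.i.d. setting described above arises from a simple mechanism: contrastive tuples are assembled from a finite labeled pool, so the same data points are reused across many tuples. The following minimal sketch (illustrative only, not the paper's construction; the function name and tuple format are our own) makes the dependence concrete, since any sufficiently large set of (anchor, positive, negative) tuples drawn from a small pool must share underlying samples:

```python
import random

def make_contrastive_tuples(pool, num_tuples, seed=0):
    """Form (anchor, positive, negative) tuples from a fixed labeled pool.

    pool: list of (x, label) pairs. Because the pool is finite, points are
    inevitably recycled across tuples, so the resulting tuples are NOT
    independent -- the non-i.i.d. regime the paper analyzes.
    """
    rng = random.Random(seed)
    by_label = {}
    for x, y in pool:
        by_label.setdefault(y, []).append(x)
    labels = list(by_label)

    tuples = []
    for _ in range(num_tuples):
        pos_label = rng.choice(labels)
        neg_label = rng.choice([l for l in labels if l != pos_label])
        anchor = rng.choice(by_label[pos_label])
        positive = rng.choice(by_label[pos_label])  # same class as anchor
        negative = rng.choice(by_label[neg_label])  # different class
        tuples.append((anchor, positive, negative))
    return tuples

# A pool of 6 labeled points cannot supply 20 disjoint tuples (60 slots),
# so data reuse across tuples is unavoidable by the pigeonhole principle.
pool = [(i, i % 2) for i in range(6)]
tuples = make_contrastive_tuples(pool, num_tuples=20)
```

Under tuple-wise i.i.d. assumptions each tuple would be a fresh draw; here, the shared pool induces correlations between tuples, which is why U-statistics-style arguments replace standard i.i.d. concentration bounds.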

@article{hieu2025_2505.04937,
  title={Generalization Analysis for Contrastive Representation Learning under Non-IID Settings},
  author={Nong Minh Hieu and Antoine Ledent},
  journal={arXiv preprint arXiv:2505.04937},
  year={2025}
}