ResearchTrend.AI
arXiv:2206.07692
A Simple Data Mixing Prior for Improving Self-Supervised Learning

15 June 2022
Sucheng Ren
Huiyu Wang
Zhengqi Gao
Shengfeng He
Alan Yuille
Yuyin Zhou
Cihang Xie
Abstract

Data mixing (e.g., Mixup, CutMix, ResizeMix) is an essential component for advancing recognition models. In this paper, we focus on studying its effectiveness in the self-supervised setting. Noticing that mixed images sharing the same source images are intrinsically related to each other, we propose SDMP, short for Simple Data Mixing Prior, to capture this straightforward yet essential prior, and position such mixed images as additional positive pairs to facilitate self-supervised representation learning. Our experiments verify that the proposed SDMP enables data mixing to help a set of self-supervised learning frameworks (e.g., MoCo) achieve better accuracy and out-of-distribution robustness. More notably, our SDMP is the first method that successfully leverages data mixing to improve (rather than hurt) the performance of Vision Transformers in the self-supervised setting. Code is publicly available at https://github.com/OliverRensu/SDMP
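The core observation above can be illustrated with a minimal sketch: two Mixup-style blends that draw on the same source pairing (but with different mixing ratios) are intrinsically related, so they can be aligned as an extra positive pair. This is not the authors' implementation (see the linked repository for that); the batch shapes, the cosine-alignment loss, and all function names here are illustrative assumptions.

```python
import numpy as np

def mixup(x, perm, lam):
    # Blend each sample with a permuted partner: lam * x_i + (1 - lam) * x_perm[i]
    return lam * x + (1.0 - lam) * x[perm]

rng = np.random.default_rng(0)
batch = rng.standard_normal((4, 8))   # 4 toy "images" as flat feature vectors
perm = rng.permutation(4)             # fixed source pairing shared by both mixes

# Two mixes over the SAME source pairing but different ratios.
# SDMP's prior: such mixes share source content, so treat them as positives.
m1 = mixup(batch, perm, lam=0.7)
m2 = mixup(batch, perm, lam=0.4)

def cosine(a, b):
    # Row-wise cosine similarity between two batches of vectors
    return (a * b).sum(-1) / (np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1))

# Align each (m1[i], m2[i]) pair: a simple positive-pair alignment loss,
# standing in for the contrastive/distillation losses used in practice.
loss = (1.0 - cosine(m1, m2)).mean()
```

In a real framework such as MoCo, the mixed views would pass through the encoder and momentum encoder before the loss, and the mixing ratio would weight the relationship between pairs; this sketch only shows the pairing structure the prior exploits.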
