ResearchTrend.AI
Contrastive pretraining for semantic segmentation is robust to noisy positive pairs

24 November 2022
Sebastian Gerard
Josephine Sullivan

Papers citing "Contrastive pretraining for semantic segmentation is robust to noisy positive pairs"

2 / 2 papers shown

Self-supervised pre-training enhances change detection in Sentinel-2 imagery
Marrit Leenstra, Diego Marcos, F. Bovolo, D. Tuia
SSL · 20 Jan 2021

Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
SSL · 09 Mar 2020