GraphVICRegHSIC: Towards improved self-supervised representation learning for graphs with a hybrid loss function

25 May 2021
Sayan Nag
    SSL
Abstract

Self-supervised learning and pre-training strategies have developed over the last few years, especially for Convolutional Neural Networks (CNNs). Recently, such methods have also been applied to Graph Neural Networks (GNNs). In this paper, we use a graph-based self-supervised learning strategy with different loss functions (Barlow Twins [Zbontar et al., 2021], HSIC [Tsai et al., 2021], VICReg [Bardes et al., 2021]) that have previously shown promising results with CNNs. We also propose a hybrid loss function, VICRegHSIC, which combines the advantages of VICReg and HSIC. The performance of these methods is compared on 7 different datasets, such as MUTAG, PROTEINS, and IMDB-Binary. Experiments show that our hybrid loss function outperforms the others in 4 out of 7 cases. Moreover, the impact of different batch sizes, projector dimensions, and data augmentation strategies is also explored.
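The abstract does not give the exact formulation of VICRegHSIC; the following PyTorch sketch is one plausible reading, combining the standard VICReg invariance/variance/covariance terms with a biased linear-kernel HSIC estimator between the two augmented views. The function names and the weights (sim_w, var_w, cov_w, gamma) are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of a VICRegHSIC-style hybrid loss (not the paper's implementation).
# z1, z2 are batches of graph-level embeddings from a GNN encoder + projector,
# one per augmented view, each of shape (batch_size, projector_dim).
import torch
import torch.nn.functional as F


def off_diagonal(x):
    # Return the off-diagonal elements of a square matrix as a flat vector.
    n, m = x.shape
    assert n == m
    return x.flatten()[:-1].view(n - 1, n + 1)[:, 1:].flatten()


def vicreg_loss(z1, z2, sim_w=25.0, var_w=25.0, cov_w=1.0):
    # Invariance term: mean-squared error between the two views.
    inv = F.mse_loss(z1, z2)
    # Variance term: hinge keeping each embedding dimension's std above 1.
    std1 = torch.sqrt(z1.var(dim=0) + 1e-4)
    std2 = torch.sqrt(z2.var(dim=0) + 1e-4)
    var = F.relu(1.0 - std1).mean() + F.relu(1.0 - std2).mean()
    # Covariance term: penalise off-diagonal covariance within each view.
    n, d = z1.shape
    z1c, z2c = z1 - z1.mean(dim=0), z2 - z2.mean(dim=0)
    cov1 = (z1c.T @ z1c) / (n - 1)
    cov2 = (z2c.T @ z2c) / (n - 1)
    cov = off_diagonal(cov1).pow(2).sum() / d + off_diagonal(cov2).pow(2).sum() / d
    return sim_w * inv + var_w * var + cov_w * cov


def hsic(z1, z2):
    # Biased HSIC estimator with linear kernels: tr(K1 H K2 H) / (n - 1)^2.
    n = z1.size(0)
    k1, k2 = z1 @ z1.T, z2 @ z2.T
    h = torch.eye(n, device=z1.device) - 1.0 / n  # centering matrix
    return torch.trace(k1 @ h @ k2 @ h) / (n - 1) ** 2


def vicreg_hsic_loss(z1, z2, gamma=1.0):
    # Hybrid objective: VICReg terms plus a reward for cross-view dependence,
    # i.e. subtracting the HSIC between the (normalised) views.
    return vicreg_loss(z1, z2) - gamma * hsic(
        F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    )
```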
