Self-Supervised Graph Contrastive Pretraining for Device-level Integrated Circuits

Abstract

Self-supervised graph representation learning has driven significant advancements in domains such as social network analysis, molecular design, and electronic design automation (EDA). However, prior work in EDA has mainly focused on representing gate-level digital circuits and fails to capture analog and mixed-signal circuits. To address this gap, we introduce DICE: Device-level Integrated Circuits Encoder, the first self-supervised pretrained graph neural network (GNN) model for any circuit expressed at the device level. DICE is a message-passing neural network (MPNN) trained through graph contrastive learning, and its pretraining is simulation-free, incorporating two novel data augmentation techniques. Experimental results demonstrate that DICE achieves substantial performance gains across three downstream tasks, underscoring its effectiveness for both analog and digital circuits.
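As a rough illustration of the recipe the abstract describes (an MPNN encoder pretrained with graph contrastive learning over augmented views of a circuit graph), the following is a minimal PyTorch sketch. All names here (CircuitMPNN, random_edge_drop, nt_xent) are hypothetical stand-ins: the encoder is a toy dense-adjacency MPNN, the single edge-drop augmentation is a generic placeholder rather than either of the two augmentations proposed in the paper, and the loss is the standard NT-Xent objective from the contrastive-learning literature.

    # Illustrative sketch only; not the DICE architecture or its augmentations.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CircuitMPNN(nn.Module):
        """Toy message-passing encoder: node features + dense adjacency -> one graph embedding."""
        def __init__(self, in_dim, hid_dim, n_layers=3):
            super().__init__()
            self.inp = nn.Linear(in_dim, hid_dim)
            self.msgs = nn.ModuleList([nn.Linear(hid_dim, hid_dim) for _ in range(n_layers)])

        def forward(self, x, adj):
            h = F.relu(self.inp(x))
            for msg in self.msgs:
                h = F.relu(msg(adj @ h) + h)  # aggregate neighbor messages, residual update
            return h.mean(dim=0)              # mean-pool nodes into a graph-level embedding

    def random_edge_drop(adj, p=0.2):
        """Stand-in augmentation: drop each undirected edge with probability p."""
        mask = (torch.rand_like(adj) > p).float()
        mask = torch.maximum(mask, mask.T)    # keep the adjacency symmetric
        return adj * mask

    def nt_xent(z1, z2, tau=0.5):
        """NT-Xent loss: matched augmented views are positives, all other graphs negatives."""
        n = z1.size(0)
        z = F.normalize(torch.cat([z1, z2]), dim=1)
        sim = z @ z.T / tau
        sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float("-inf"))
        targets = torch.cat([torch.arange(n) + n, torch.arange(n)])  # view i <-> view i+n
        return F.cross_entropy(sim, targets)

    # One simulation-free pretraining step on a toy batch of random "circuit" graphs.
    enc = CircuitMPNN(in_dim=8, hid_dim=64)
    opt = torch.optim.Adam(enc.parameters(), lr=1e-3)
    graphs = [(torch.randn(10, 8), (torch.rand(10, 10) > 0.7).float()) for _ in range(4)]
    z1 = torch.stack([enc(x, random_edge_drop(a)) for x, a in graphs])
    z2 = torch.stack([enc(x, random_edge_drop(a)) for x, a in graphs])
    opt.zero_grad()
    loss = nt_xent(z1, z2)
    loss.backward()
    opt.step()
    print(f"contrastive loss: {loss.item():.4f}")

The key property this sketch shares with the pretraining described above is that no circuit simulation is involved: the training signal comes entirely from agreement between two augmented views of the same graph.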

@article{lee2025_2502.08949,
  title={Self-Supervised Graph Contrastive Pretraining for Device-level Integrated Circuits},
  author={Sungyoung Lee and Ziyi Wang and Seunggeun Kim and Taekyun Lee and David Z. Pan},
  journal={arXiv preprint arXiv:2502.08949},
  year={2025}
}