Circuit Transformer: A Transformer That Preserves Logical Equivalence

14 March 2024
Xihan Li, Xing Li, Lei Chen, Xing Zhang, Mingxuan Yuan, Jun Wang
Abstract

Implementing Boolean functions as circuits of logic gates is fundamental to digital computer design. However, the implemented circuit must be exactly equivalent to the specified function, which hinders generative neural approaches on this task because their predictions are occasionally wrong. In this study, we introduce a generative neural model, the "Circuit Transformer", which eliminates such wrong predictions and produces logic circuits strictly equivalent to given Boolean functions. The main idea is a carefully designed decoding mechanism that builds a circuit step by step by generating tokens, with beneficial "cutoff properties" that block a candidate token once it invalidates equivalence. In this way, the proposed model works like a typical large language model while strictly preserving logical equivalence. A Markov decision process formulation is also proposed for optimizing certain objectives of circuits. Experimentally, we trained an 88-million-parameter Circuit Transformer to generate equivalent yet more compact forms of input circuits, outperforming existing neural approaches on both synthetic and real-world benchmarks without any violation of the equivalence constraints.
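The cutoff mechanism described above can be pictured as constrained decoding: before each generation step, candidate tokens that would make equivalence unattainable are masked out, so the finished sequence can only encode an equivalent circuit. The sketch below is a minimal, self-contained illustration under simplifying assumptions of our own: the token scheme (a "GATE" token adds a NAND gate over existing nodes, an "OUT" token terminates by designating an output node), the NAND-only gate set, and the brute-force truth-table check are illustrative stand-ins; the paper's actual tokenization and cutoff test are more involved.

from itertools import product

def truth_table(gates, n_inputs, node):
    """Evaluate a NAND-gate circuit on all 2^n input assignments.
    Node i < n_inputs is input x_i; node n_inputs + j is gates[j]."""
    rows = []
    for bits in product((0, 1), repeat=n_inputs):
        vals = list(bits)
        for a, b in gates:
            vals.append(1 - (vals[a] & vals[b]))  # NAND of two earlier nodes
        rows.append(vals[node])
    return tuple(rows)

def mask_candidates(candidates, gates, n_inputs, target):
    """Cutoff in miniature: block any "OUT" token whose designated node
    is not equivalent to the target function, so the decoder can never
    emit a non-equivalent circuit."""
    allowed = []
    for kind, arg in candidates:
        if kind == "OUT" and truth_table(gates, n_inputs, arg) != target:
            continue  # blocked: emitting this output would break equivalence
        allowed.append((kind, arg))
    return allowed

# Demo: target XOR(x0, x1); the partial circuit is the classic 4-NAND XOR.
target = (0, 1, 1, 0)                      # rows for (x0, x1) in 00, 01, 10, 11
gates = [(0, 1), (0, 2), (1, 2), (3, 4)]   # nodes 2..5
candidates = [("OUT", 5), ("OUT", 2), ("GATE", (0, 1))]
print(mask_candidates(candidates, gates, 2, target))
# -> [('OUT', 5), ('GATE', (0, 1))]  -- ('OUT', 2) is cut off

Because the surviving tokens are exactly the ones compatible with equivalence, sampling or beam search over the masked distribution behaves like ordinary LLM decoding while the constraint holds by construction.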
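The Markov decision process framing can likewise be sketched: states are partial token sequences, actions are the tokens surviving the mask, and a terminal reward scores the finished circuit. The reward below is a hypothetical instance (negative gate count, i.e. compactness, matching the experiments' goal of more compact circuits); the paper's exact objective and formulation may differ. It reuses truth_table and the demo circuit from the sketch above.

def terminal_reward(gates, n_inputs, node, target):
    """Hypothetical terminal reward for the MDP view: the cutoff mask
    guarantees every reachable terminal state is equivalent, so the
    reward can focus purely on the optimization objective."""
    assert truth_table(gates, n_inputs, node) == target
    return -len(gates)  # fewer gates = more compact = higher reward

print(terminal_reward(gates, 2, 5, target))  # -> -4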

@article{li2025_2403.13838,
  title={Circuit Transformer: A Transformer That Preserves Logical Equivalence},
  author={Xihan Li and Xing Li and Lei Chen and Xing Zhang and Mingxuan Yuan and Jun Wang},
  journal={arXiv preprint arXiv:2403.13838},
  year={2025}
}