Training a First-Order Theorem Prover from Synthetic Data

5 March 2021
Vlad Firoiu, Eser Aygün, Ankit Anand, Zafarali Ahmed, Xavier Glorot, Laurent Orseau, Lei Zhang, Doina Precup, Shibl Mourad
arXiv:2103.03798
Abstract

A major challenge in applying machine learning to automated theorem proving is the scarcity of training data, which is a key ingredient in training successful deep learning models. To tackle this problem, we propose an approach that relies on training purely with synthetically generated theorems, without any human data aside from axioms. We use these theorems to train a neurally guided saturation-based prover. Our neural prover outperforms the state-of-the-art E-prover on this synthetic data in both time and search steps, and shows significant transfer to the unseen human-written theorems from the TPTP library, where it solves 72% of first-order problems without equality.
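
To make the setup concrete, the sketch below shows a minimal given-clause saturation loop in Python, with a pluggable score function standing in for the paper's neural guidance and a toy ground resolution rule in place of full first-order inference. The names (saturate, resolvents), the clause encoding, and the length-based heuristic are illustrative assumptions for this sketch, not the authors' implementation.

# A minimal sketch of a given-clause saturation loop with a pluggable
# clause-scoring heuristic standing in for learned clause selection.
# Clause representation and the resolution rule are simplified (ground-only)
# placeholders; they are not taken from the paper's code.

import heapq
import itertools
from typing import Callable, FrozenSet, Iterable, Set, Tuple

# A literal is a signed atom string, e.g. ("p(a)", True) for p(a)
# and ("p(a)", False) for its negation; a clause is a frozenset of literals.
Literal = Tuple[str, bool]
Clause = FrozenSet[Literal]


def resolvents(c1: Clause, c2: Clause) -> Iterable[Clause]:
    """Ground binary resolution: resolve on complementary literals."""
    for (atom, sign) in c1:
        if (atom, not sign) in c2:
            yield frozenset((c1 - {(atom, sign)}) | (c2 - {(atom, not sign)}))


def saturate(axioms: Iterable[Clause],
             negated_goal: Iterable[Clause],
             score: Callable[[Clause], float],
             max_steps: int = 10_000) -> bool:
    """Given-clause loop: repeatedly pick the lowest-scoring unprocessed clause,
    resolve it against all processed clauses, and stop on the empty clause."""
    counter = itertools.count()  # tie-breaker so heapq never compares clauses
    unprocessed = [(score(c), next(counter), c) for c in (*axioms, *negated_goal)]
    heapq.heapify(unprocessed)
    processed: Set[Clause] = set()

    for _ in range(max_steps):
        if not unprocessed:
            return False  # saturated without finding a refutation
        _, _, given = heapq.heappop(unprocessed)
        if given in processed:
            continue
        if not given:
            return True  # empty clause: contradiction found, theorem proved
        processed.add(given)
        for other in list(processed):
            for new in resolvents(given, other):
                if not new:
                    return True
                if new not in processed:
                    heapq.heappush(unprocessed, (score(new), next(counter), new))
    return False  # step budget exhausted


if __name__ == "__main__":
    # Toy problem: from p and p -> q (clausified as ~p | q), refute ~q.
    p, q = ("p", True), ("q", True)
    axioms = [frozenset({p}), frozenset({("p", False), q})]
    negated_goal = [frozenset({("q", False)})]
    # A trained model would replace this heuristic; here we just prefer short clauses.
    print(saturate(axioms, negated_goal, score=len))

In the paper's setting, the scoring heuristic is where the learned model enters: it ranks candidate clauses so that the prover expands promising ones first, which is what the synthetic theorems are used to train.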
