
Large-scale Pretraining Improves Sample Efficiency of Active Learning based Molecule Virtual Screening

20 September 2023
Zhonglin Cao, Simone Sciabola, Ye Wang

Papers citing "Large-scale Pretraining Improves Sample Efficiency of Active Learning based Molecule Virtual Screening"

4 / 4 papers shown

Analysis of Atom-level pretraining with Quantum Mechanics (QM) data for Graph Neural Networks Molecular property models
Jose A. Arjona-Medina, Ramil I. Nugmanov
AI4CE
23 May 2024

Active Learning and Bayesian Optimization: a Unified Perspective to Learn with a Goal
Francesco Di Fiore, Michela Nardelli, L. Mainini
02 Mar 2023

Improving Molecular Contrastive Learning via Faulty Negative Mitigation and Decomposed Fragment Contrast
Yuyang Wang, Rishikesh Magar, Chen Liang, A. Farimani
18 Feb 2022

Accelerating high-throughput virtual screening through molecular pool-based active learning
David E. Graff, E. Shakhnovich, Connor W. Coley
13 Dec 2020