MolX: Enhancing Large Language Models for Molecular Learning with A Multi-Modal Extension

10 June 2024
Khiem Le
Zhichun Guo
Kaiwen Dong
Xiaobao Huang
Bozhao Nan
Roshni G. Iyer
Xiangliang Zhang
Olaf Wiest
Wei Wang
Nitesh V. Chawla
Abstract

Large Language Models (LLMs), with their strong task-handling capabilities, have shown remarkable advancements across a spectrum of fields, moving beyond natural language understanding. However, their proficiency within the chemistry domain remains restricted, especially in solving professional molecule-related tasks. This challenge is attributed to their inherent limitations in comprehending molecules using only common textual representations, i.e., SMILES strings. In this study, we seek to enhance the ability of LLMs to comprehend molecules by equipping them with a multi-modal external module, namely MolX. In particular, instead of directly using a SMILES string to represent a molecule, we utilize specific encoders to extract fine-grained features from both the SMILES string and the 2D molecular graph representation for feeding into an LLM. Moreover, a handcrafted molecular fingerprint is incorporated to leverage its embedded domain knowledge. Then, to establish an alignment between MolX and the LLM's textual input space, the whole model, in which the LLM is frozen, is pre-trained with a versatile strategy that includes a diverse set of tasks. Experimental evaluations show that our proposed method outperforms baselines across four downstream molecule-related tasks ranging from molecule-to-text translation to retrosynthesis, with and without fine-tuning the LLM, while introducing only a small number of trainable parameters (0.53% and 0.82%, respectively).
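To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of a MolX-style multi-modal extension module. It is an illustration under stated assumptions, not the authors' implementation: the encoder stand-ins, all dimensions, the learnable query tokens, and the cross-attention fusion are hypothetical choices; the paper itself defines the actual encoders and pre-training tasks.

import torch
import torch.nn as nn

class MolXSketch(nn.Module):
    """Illustrative MolX-style extension (hypothetical; dimensions and
    fusion mechanism are assumptions, not the paper's implementation)."""

    def __init__(self, smiles_dim=256, graph_dim=256, fp_dim=2048,
                 llm_dim=4096, num_mol_tokens=8):
        super().__init__()
        # Projections standing in for pretrained encoders: a SMILES
        # encoder and a 2D-graph encoder would produce these features.
        self.smiles_proj = nn.Linear(smiles_dim, llm_dim)
        self.graph_proj = nn.Linear(graph_dim, llm_dim)
        # Handcrafted fingerprint (e.g., a 2048-bit Morgan fingerprint)
        # projected into the same space to inject domain knowledge.
        self.fp_proj = nn.Linear(fp_dim, llm_dim)
        # Learnable queries that become "molecule tokens" for the LLM.
        self.queries = nn.Parameter(torch.randn(num_mol_tokens, llm_dim))
        self.fuse = nn.MultiheadAttention(llm_dim, num_heads=8,
                                          batch_first=True)

    def forward(self, smiles_feats, graph_feats, fingerprint):
        # smiles_feats: (B, Ls, smiles_dim); graph_feats: (B, Lg, graph_dim);
        # fingerprint: (B, fp_dim) as a float tensor.
        kv = torch.cat([self.smiles_proj(smiles_feats),
                        self.graph_proj(graph_feats),
                        self.fp_proj(fingerprint).unsqueeze(1)], dim=1)
        q = self.queries.unsqueeze(0).expand(kv.size(0), -1, -1)
        mol_tokens, _ = self.fuse(q, kv, kv)  # (B, num_mol_tokens, llm_dim)
        # These tokens would be prepended to the frozen LLM's text
        # embeddings; only this module's parameters are trained.
        return mol_tokens

Training only such an external module while the LLM stays frozen is what keeps the trainable-parameter fraction small, consistent with the sub-1% figures the abstract reports.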

View on arXiv
@article{le2025_2406.06777,
  title={MolX: Enhancing Large Language Models for Molecular Learning with A Multi-Modal Extension},
  author={Khiem Le and Zhichun Guo and Kaiwen Dong and Xiaobao Huang and Bozhao Nan and Roshni Iyer and Xiangliang Zhang and Olaf Wiest and Wei Wang and Nitesh V. Chawla},
  journal={arXiv preprint arXiv:2406.06777},
  year={2025}
}