ResearchTrend.AI

Injecting Knowledge into Biomedical Pre-trained Models via Polymorphism and Synonymous Substitution

24 May 2023
Hongbo Zhang, Xiang Wan, Benyou Wang
Topic: KELM
Links: arXiv (abs) · PDF · HTML · GitHub (2★)

Papers citing "Injecting Knowledge into Biomedical Pre-trained Models via Polymorphism and Synonymous Substitution"

Pre-trained Language Models in Biomedical Domain: A Systematic Survey
ACM Computing Surveys (CSUR), 2021 (11 Oct 2021)
Benyou Wang, Qianqian Xie, Jiahuan Pei, Zhihong Chen, Prayag Tiwari, Zhao Li, Jie Fu
Topics: LM&MA, AI4CE