INSIGHTBUDDY-AI: Medication Extraction and Entity Linking using Large Language Models and Ensemble Learning

31 December 2024
Pablo Romero, Lifeng Han, Goran Nenadic
LM&MA
arXiv:2409.19467 (abs · PDF · HTML)

Papers citing "INSIGHTBUDDY-AI: Medication Extraction and Entity Linking using Large Language Models and Ensemble Learning"

10 / 10 papers shown
DeIDClinic: A Multi-Layered Framework for De-identification of Clinical Free-text Data
Angel Paul, Dhivin Shaji, Lifeng Han, Warren Del-Pinto, Goran Nenadic
OOD · 02 Oct 2024 · 0 citations

Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing
Yu Gu, Robert Tinn, Hao Cheng, Michael R. Lucas, Naoto Usuyama, Xiaodong Liu, Tristan Naumann, Jianfeng Gao, Hoifung Poon
LM&MA, AI4CE · 31 Jul 2020 · 1,788 citations

Med-BERT: pre-trained contextualized embeddings on large-scale structured electronic health records for disease prediction
L. Rasmy, Yang Xiang, Z. Xie, Cui Tao, Degui Zhi
AI4MH, LM&MA · 22 May 2020 · 697 citations

Don't Stop Pretraining: Adapt Language Models to Domains and Tasks
Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith
VLM, AI4CE, CLL · 23 Apr 2020 · 2,446 citations

RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
AIMat · 26 Jul 2019 · 24,597 citations

ClinicalBERT: Modeling Clinical Notes and Predicting Hospital Readmission
Kexin Huang, Jaan Altosaar, Rajesh Ranganath
OOD · 10 Apr 2019 · 908 citations

Publicly Available Clinical BERT Embeddings
Emily Alsentzer, John R. Murphy, Willie Boag, W. Weng, Di Jin, Tristan Naumann, Matthew B. A. McDermott
AI4MH · 06 Apr 2019 · 1,991 citations

BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang
OOD · 25 Jan 2019 · 5,688 citations

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
VLM, SSL, SSeg · 11 Oct 2018 · 95,407 citations

Explainable Prediction of Medical Codes from Clinical Text
J. Mullenbach, Sarah Wiegreffe, J. Duke, Jimeng Sun, Jacob Eisenstein
FAtt · 15 Feb 2018 · 576 citations