Cited By: arXiv 2407.14076
Domain-Specific Pretraining of Language Models: A Comparative Study in the Medical Field
Tobias Kerner
19 July 2024
ELM, LM&MA
Papers citing "Domain-Specific Pretraining of Language Models: A Comparative Study in the Medical Field" (5 papers)
Task-Adaptive Pretrained Language Models via Clustered-Importance Sampling
David Grangier, Simin Fan, Skyler Seto, Pierre Ablin
30 Sep 2024
BioMedLM: A 2.7B Parameter Language Model Trained On Biomedical Text
Elliot Bolton, Abhinav Venigalla, Michihiro Yasunaga, David Leo Wright Hall, Betty Xiong, ..., R. Daneshjou, Jonathan Frankle, Percy Liang, Michael Carbin, Christopher D. Manning
LM&MA, MedIm
27 Mar 2024
A Continued Pretrained LLM Approach for Automatic Medical Note Generation
Dong Yuan, Eti Rastogi, Gautam Naik, Sree Prasanna Rajagopal, Sagar Goyal, Fen Zhao, Jai Chintagunta, Jeff Ward
LM&MA, AI4MH
14 Mar 2024
The Pile: An 800GB Dataset of Diverse Text for Language Modeling
Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, ..., Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy
AIMat
31 Dec 2020
PubMedQA: A Dataset for Biomedical Research Question Answering
Qiao Jin, Bhuwan Dhingra, Zhengping Liu, William W. Cohen, Xinghua Lu
13 Sep 2019