Harnessing the Power of BERT in the Turkish Clinical Domain: Pretraining Approaches for Limited Data Scenarios
arXiv:2305.03788 · 5 May 2023
Hazal Türkmen, Oğuz Dikenelli, C. Eraslan, Mehmet Cem Çalli, S. Özbek
Papers citing "Harnessing the Power of BERT in the Turkish Clinical Domain: Pretraining Approaches for Limited Data Scenarios" (2 of 2 papers shown)
Investigating Large Language Models and Control Mechanisms to Improve Text Readability of Biomedical Abstracts
Z. Li, Samuel Belkadi, Nicolo Micheletti, Lifeng Han, Matthew Shardlow, Goran Nenadic
22 Sep 2023
Training language models to follow instructions with human feedback
Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
04 Mar 2022