arXiv:2406.00314 (v2, latest)
CASE: Efficient Curricular Data Pre-training for Building Assistive Psychology Expert Models
1 June 2024
Sarthak Harne, Monjoy Narayan Choudhury, Madhav Rao, T. Srikanth, Seema Mehrotra, Apoorva Vashisht, Aarushi Basu, Manjit Sodhi
Community: AI4MH
Papers citing "CASE: Efficient Curricular Data Pre-training for Building Assistive Psychology Expert Models" (4 of 4 shown)
RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
26 Jul 2019 (AIMat)

Publicly Available Clinical BERT Embeddings
Emily Alsentzer, John R. Murphy, Willie Boag, W. Weng, Di Jin, Tristan Naumann, Matthew B. A. McDermott
06 Apr 2019 (AI4MH)

SciBERT: A Pretrained Language Model for Scientific Text
Iz Beltagy, Kyle Lo, Arman Cohan
26 Mar 2019

BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang
25 Jan 2019 (OOD)