CASE: Efficient Curricular Data Pre-training for Building Assistive Psychology Expert Models

1 June 2024 · arXiv: 2406.00314
Sarthak Harne, Monjoy Narayan Choudhury, Madhav Rao, T. Srikanth, Seema Mehrotra, Apoorva Vashisht, Aarushi Basu, Manjit Sodhi
AI4MH

Papers citing "CASE: Efficient Curricular Data Pre-training for Building Assistive Psychology Expert Models"

4 papers shown

1. RoBERTa: A Robustly Optimized BERT Pretraining Approach
   Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
   AIMat · 26 Jul 2019

2. Publicly Available Clinical BERT Embeddings
   Emily Alsentzer, John R. Murphy, Willie Boag, W. Weng, Di Jin, Tristan Naumann, Matthew B. A. McDermott
   AI4MH · 06 Apr 2019

3. SciBERT: A Pretrained Language Model for Scientific Text
   Iz Beltagy, Kyle Lo, Arman Cohan
   26 Mar 2019

4. BioBERT: a pre-trained biomedical language representation model for biomedical text mining
   Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang
   OOD · 25 Jan 2019