ResearchTrend.AI

arXiv:2305.01146

RadAdapt: Radiology Report Summarization via Lightweight Domain Adaptation of Large Language Models

2 May 2023
Dave Van Veen
Cara Van Uden
Maayane Attias
Anuj Pareek
Christian Blüthgen
M. Polacin
Wah Chiu
Jean-Benoit Delbrouck
Juan Manuel Zambrano Chaves
C. Langlotz
Akshay S. Chaudhari
John M. Pauly
Abstract

We systematically investigate lightweight strategies for adapting large language models (LLMs) to the task of radiology report summarization (RRS). Specifically, we focus on domain adaptation via pretraining (on natural language, biomedical text, or clinical text) and via discrete prompting or parameter-efficient fine-tuning. Our results show that the best performance is consistently achieved by maximally adapting to the task: pretraining on clinical text and fine-tuning on RRS examples. Importantly, this method fine-tunes a mere 0.32% of parameters throughout the model, in contrast to end-to-end fine-tuning (100% of parameters). Additionally, we study the effect of in-context examples and of out-of-distribution (OOD) training before concluding with a radiologist reader study and qualitative analysis. Our findings highlight the importance of domain adaptation in RRS and provide valuable insights toward developing effective natural language processing solutions for clinical tasks.
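To see why parameter-efficient fine-tuning can touch well under 1% of a model's weights, consider low-rank adapters of the kind commonly used for this purpose (e.g. LoRA): each adapted weight matrix gains two small trainable matrices of rank r while the original weights stay frozen. The sketch below is a back-of-the-envelope illustration only; the model dimensions, rank, and choice of adapted projections are assumptions for the example, not the paper's exact configuration, so it will not reproduce the 0.32% figure exactly.

```python
def lora_param_fraction(d_model: int, n_layers: int, n_proj_per_layer: int,
                        rank: int, total_params: int) -> float:
    """Fraction of parameters that are trainable when each adapted
    (d_model x d_model) projection is augmented with two low-rank
    matrices A (d_model x r) and B (r x d_model)."""
    lora_params = n_layers * n_proj_per_layer * 2 * d_model * rank
    return lora_params / total_params

# Hypothetical example: a ~770M-parameter encoder-decoder with
# d_model=1024 and 24 attention blocks, adapting the query and value
# projections (2 per block) with rank 8.
frac = lora_param_fraction(d_model=1024, n_layers=24, n_proj_per_layer=2,
                           rank=8, total_params=770_000_000)
print(f"{frac:.2%}")  # a small fraction of 1% of parameters is trainable
```

Only the adapter matrices receive gradients during training, which is what makes the contrast with end-to-end fine-tuning (100% of parameters) so stark.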
