Retrieval-Augmented Generation and Knowledge-Grounded Reasoning for Faithful Patient Discharge Instructions

Abstract

Language models (LMs), such as ChatGPT, have the potential to assist clinicians in generating various clinical notes. However, LMs are prone to producing "hallucinations", i.e., generated content that is not aligned with facts and knowledge. In this paper, we propose the Re³Writer method, which combines retrieval-augmented generation with knowledge-grounded reasoning to enable LMs to generate faithful clinical texts. We demonstrate the effectiveness of our method on the task of generating patient discharge instructions, which requires the LM to understand a patient's long clinical documents, i.e., the health records accumulated during hospitalization, and to generate the critical instructional information given to both carers and the patient at the time of discharge. The proposed Re³Writer imitates the working patterns of physicians: it first retrieves related working experience from historical instructions written by physicians, then reasons over related medical knowledge. Finally, it refines the retrieved working experience and reasoned medical knowledge to extract useful information, which is used to generate discharge instructions for previously unseen patients. Our experiments show that our method substantially boosts the performance of five different LMs across all metrics. We also report human evaluations measuring fluency, faithfulness, and comprehensiveness. The code is available at https://github.com/AI-in-Hospitals/Patient-Instructions
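
To make the retrieve-reason-refine workflow described above concrete, the following is a minimal Python sketch of such a pipeline. It is illustrative only and is not the released implementation: the `embed`, `retrieve`, `reason`, and `refine_and_generate` helpers, the dictionary-based knowledge representation, and the `lm` callable are all assumptions introduced here; see the linked repository for the actual code.

```python
# Hypothetical sketch of a retrieve-reason-refine pipeline for discharge
# instruction generation (illustrative, not the authors' released code).
from typing import Callable, Dict, List
import numpy as np


def embed(text: str) -> np.ndarray:
    """Placeholder text embedding; a real system would use a clinical encoder."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)


def retrieve(record: str, corpus: List[str], k: int = 3) -> List[str]:
    """Step 1: retrieve the k historical instructions most similar to the record."""
    q = embed(record)
    scores = [float(q @ embed(doc)) for doc in corpus]
    top = np.argsort(scores)[::-1][:k]
    return [corpus[i] for i in top]


def reason(record: str, knowledge: Dict[str, str]) -> List[str]:
    """Step 2: collect medical-knowledge relations whose head entity appears in the record."""
    return [f"{head} -> {tail}" for head, tail in knowledge.items()
            if head.lower() in record.lower()]


def refine_and_generate(record: str, experience: List[str],
                        knowledge: List[str], lm: Callable[[str], str]) -> str:
    """Step 3: refine retrieved experience and reasoned knowledge into context for the LM."""
    context = "\n".join(["Similar past instructions:", *experience,
                         "Relevant medical knowledge:", *knowledge])
    prompt = f"{context}\n\nPatient record:\n{record}\n\nWrite the discharge instructions:"
    return lm(prompt)
```

In this sketch, the retrieved exemplars stand in for the "working experience" and the matched relations stand in for the "reasoned medical knowledge"; both are concatenated into the conditioning context so that the LM's output is grounded in prior physician-written instructions rather than generated from the record alone.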
