Adapted large language models can outperform medical experts in clinical text summarization
This is a virtual seminar; for the Zoom link, please see "Venue details". Please consider subscribing to the mailing list.
Analyzing vast textual data and summarizing key information from electronic health records impose a substantial burden on how clinicians allocate their time. Although large language models (LLMs) have shown promise in natural language processing (NLP), their effectiveness across a diverse range of clinical summarization tasks remains unproven. In this study, we apply adaptation methods to eight LLMs, spanning four distinct clinical summarization tasks: radiology reports, patient questions, progress notes, and doctor-patient dialogue. Quantitative assessments with syntactic, semantic, and conceptual NLP metrics reveal trade-offs between models and adaptation methods. A clinical reader study with ten physicians evaluates summary completeness, correctness, and conciseness; in a majority of cases, summaries from our best adapted LLMs are judged either equivalent (45%) or superior (36%) to summaries from medical experts. The ensuing safety analysis highlights challenges faced by both LLMs and medical experts, as we connect errors to potential medical harm and categorize types of fabricated information. Our research provides evidence of LLMs outperforming medical experts in clinical text summarization across multiple tasks. This suggests that integrating LLMs into clinical workflows could alleviate documentation burden, allowing clinicians to focus more on patient care.
Date: 23 April 2024, 15:00 (Tuesday, 1st week, Trinity 2024)
Speaker: Dave Van Veen (Stanford University)
Organising department: Department of Psychiatry
Organiser: Dr Andrey Kormilitzin (University of Oxford)
Host: Dr Andrey Kormilitzin (University of Oxford)
Part of: Artificial Intelligence for Mental Health Seminar Series
Booking required?: Not required
Audience: Public
Editor: Andrey Kormilitzin