This is an overview page with metadata for this scientific work. The full article is available from the publisher.
The Clinical Integration of ChatGPT Through an Augmented Patient Encounter in a Real-World Urological Cohort: A Feasibility Study
Citations: 0
Authors: 7
Year: 2025
Abstract
Background/Objectives: To evaluate the viability of using ChatGPT in a real clinical environment for patient education during informed consent for flexible cystoscopy, assessing its practicality, patient perceptions, and clinician evaluations within a urological cohort.

Methods: A prospective feasibility study was conducted at a single institution involving patients with haematuria who attended an in-person clinic review with access to ChatGPT-4o mini. Using predetermined prompts regarding haematuria, we evaluated the accuracy, consistency, and suitability of the ChatGPT information. Responses were appraised for errors, omission of key information, and suitability for patient education. The functionality, usability, and quality of ChatGPT for patient education were assessed by three urologists using the Patient Education Materials Assessment Tool (PEMAT) and DISCERN tools. Readability was assessed using the Flesch–Kincaid tests. Further clinician questionnaires evaluated ChatGPT's accuracy, reproducibility, and integration potential.

Results: Ten patients were recruited, but one patient was excluded because he refused to use ChatGPT due to language barriers. All patients found ChatGPT to be useful, but most believed it could not entirely replace the doctor, especially for obtaining informed consent. There were no significant errors. The mean PEMAT score for understandability was 77.8%, and actionability was 63.8%. The mean DISCERN score was 57.7, corresponding to a 'good' quality score. The Flesch Reading Ease score was 30.2, with the writing level comparable to US grade level 13.

Conclusions: ChatGPT offers valuable support for patient education, delivering accurate and comprehensive information. However, challenges with readability, contextual understanding, and actionability highlight the need for development and careful integration. Generative artificial intelligence (AI) should augment, not replace, clinician–patient interactions, emphasising ethical considerations and patient trust. This study provides a basis for further exploration of AI's role in healthcare.
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,339 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,211 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,614 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,478 citations