This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Clinician Perspectives on Decision Support and AI-based Decision Support in a Pediatric ED
Citations: 5
Authors: 6
Year: 2024
Abstract
BACKGROUND: Clinical decision support (CDS) systems offer the potential to improve pediatric care through enhanced test ordering, prescribing, and standardization of care. Augmenting CDS with artificial intelligence (AI-CDS) may help address current limitations of CDS implementation, such as alarm fatigue and the accuracy of recommendations. We sought to evaluate the strengths and perceptions of CDS, with a focus on AI-CDS, through semistructured interviews with clinician partners.

METHODS: We conducted a qualitative study using semistructured interviews of physicians, nurse practitioners, and nurses at a single quaternary-care pediatric emergency department to evaluate clinician perceptions of CDS and AI-CDS. We used reflexive thematic analysis to identify themes and purposive sampling to complete recruitment, with the goal of reaching theoretical sufficiency.

RESULTS: We interviewed 20 clinicians. Participants demonstrated variable understanding of CDS and AI, with some lacking a clear definition. Most recognized the potential benefits of AI-CDS in clinical contexts, such as data summarization and interpretation. Identified themes included the potential of AI-CDS to improve diagnostic accuracy, standardize care, and improve efficiency, while also providing educational benefits to clinicians. Participants raised concerns about the ability of AI-based tools to appreciate nuanced pediatric care and to interpret data accurately, as well as about tensions between AI recommendations and clinician autonomy.

CONCLUSIONS: AI-CDS tools have a promising role in pediatric emergency medicine but require careful integration to address clinicians' concerns about autonomy, nuance recognition, and interpretability. A collaborative approach to development and implementation, informed by clinicians' insights and perspectives, will be pivotal to their successful adoption and efficacy in improving patient care.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,560 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,451 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,948 citations
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
2019 · 6,797 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations