This is an overview page with metadata for this scientific work. The full article is available from the publisher.
E-117 Expanding horizons: Viz.AI’s versatile role in neurological care
Citations: 0
Authors: 11
Year: 2024
Abstract
Objective
Viz.AI is an artificial intelligence program designed to screen patients with suspected large vessel occlusions (LVO) and other cerebrovascular emergencies. This study compares the sensitivity and specificity of Viz.AI with the diagnostic performance of radiologists at a Comprehensive Stroke Center. Additionally, an analysis of factors contributing to Viz.AI's predictions was conducted.

Methods
Scans were processed through Viz.AI's LVO detection platform from October 23, 2022, to September 10, 2023. Radiology reads of all scans were reviewed using the electronic medical record system, EPIC. Data were analyzed using Prism 9.0.

Results
Among the 1,409 scans recorded, 238 were flagged as LVO by Viz.AI; 143 were confirmed as LVO by radiologists, and 87 of these cases received endovascular treatment. Of the 95 cases flagged as LVO by Viz.AI but not confirmed on review, 43 (45%) required a Neurosurgery consult for other cerebrovascular pathology (moyamoya, hematoma, hemorrhage, encephalopathy, prior stroke, stenosis, occlusion of a medium-sized vessel, ICAD, and cysts). Statistical analysis revealed a sensitivity of 71.86% (95% CI 65.06–77.99), specificity of 92.12% (95% CI 90.46–93.58), positive predictive value of 60.08% (95% CI 54.92–65.04), and negative predictive value of 95.20% (95% CI 94.08–96.12).

Conclusions
The study underscores the wide-ranging impact of Viz.AI's technology in critical care. Its LVO detection platform, functioning as a versatile tool, demonstrates sensitivity beyond large vessel occlusions, extending to various cerebrovascular emergencies and other pathological conditions. Recognizing this versatility and addressing its limitations can enable healthcare providers to fully utilize Viz.AI as a vital tool in diagnosing and managing neurological disorders.

Disclosures
E. Paulin: None. D. Williams-Stankewicz: None. J. Frank: None. N. Millson: None. M. Campbell: None. L. Wise: None. D. Lukins: None. M. Al-Kawaz: None. D. Dornbos: None. S. Pahwa: None. J. Fraser: None.
Similar works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,545 cit.
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,436 cit.
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,935 cit.
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 cit.
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,589 cit.