This is an overview page with metadata for this scholarly article. The full article is available from the publisher.
“This ground truth is muddy anyway”
Citations: 0
Authors: 1
Year: 2025
Abstract
This article explores assemblages of ground truth datasets for the development of medical artificial intelligence (AI). Drawing on interviews and observations, I examine how AI experts developing medical AI relate to the referential truth basis of their work, their ground truths, as an epistemic concern. By addressing how datasets are assembled from different sources, and produced, augmented and synthesised, this study shows how ground truths are valued based on humanness, the quality of medical expert judgements, temporality and technical qualities. Moreover, this article analyses truth practices as productive moments in AI development, the role of human expertise and the perceived strengths and limits of expert-based annotations. The valuations of ground truths shatter the image of medical classifications, and AI models, as stable, neutral entities. Moreover, this article shows how valuations of ground truths encompass more than alignment with standardised expertise. To better understand the possibilities for medical AI to live up to ideals of accuracy, fairness, trustworthiness and transparency, we need more knowledge of the assumptions, negotiations and epistemic concerns upon which medical AI is built.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,312 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,169 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,564 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,466 citations