This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Reproducibility and explainability in digital pathology: The need to make black-box artificial intelligence systems more transparent
2 citations · 3 authors · 2024
Abstract
Artificial intelligence (AI), and more specifically Machine Learning (ML) and Deep Learning (DL), has permeated the digital pathology field in recent years, with many algorithms successfully applied as advanced tools to analyze pathological tissues. The introduction of high-resolution scanners in histopathology services has represented a real revolution for pathologists, allowing the analysis of digital whole-slide images (WSI) on a screen without a microscope at hand. However, this entails a transition from microscope to algorithms in the absence of specific training for most pathologists involved in clinical practice. The WSI approach represents a major transformation, even from a computational point of view. The multiple ML and DL tools specifically developed for WSI analysis may enhance the diagnostic process in many fields of human pathology. AI-driven models allow the achievement of more consistent results, providing valid support for detecting, from H&E-stained sections, multiple biomarkers, including microsatellite instability, that are missed by expert pathologists.
Related works
A survey on deep learning in medical image analysis
2017 · 13,918 citations
pROC: an open-source package for R and S+ to analyze and compare ROC curves
2011 · 13,769 citations
Dermatologist-level classification of skin cancer with deep neural networks
2017 · 13,468 citations
A survey on Image Data Augmentation for Deep Learning
2019 · 12,061 citations
QuPath: Open source software for digital pathology image analysis
2017 · 8,396 citations