This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Efficient Chest X-Ray Classification via Knowledge Distillation: Realizing Edge-Ready Deep Models
Citations: 0 · Authors: 3 · Year: 2025
Abstract
Deploying deep learning for chest X-ray interpretation in clinical settings faces persistent challenges, including heavy computational requirements, high latency, and limited hardware resources that restrict the use of state-of-the-art models. To address these constraints, knowledge distillation is leveraged to compress a high-capacity DenseNet121 into a streamlined ResNet18 while maintaining clinical performance. The approach is benchmarked using the NIH ChestX-ray14 dataset, targeting multi-label disease classification. Results show that the distilled student achieves a mean ROC-AUC of 0.81, exceeding the teacher’s 0.79, while reducing multiply-add operations by 36%, slashing inference latency by a factor of 4.6 (0.20 ms vs. 0.92 ms per image), and cutting peak GPU memory usage by 60% (85 MB vs. 209 MB). These outcomes confirm that efficient model compression through knowledge distillation enables rapid, reliable, and resource-friendly deployment of diagnostic AI in real-world medical environments.
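The distillation setup described in the abstract can be sketched as a loss function. The exact formulation is not given in the abstract, so the following is a minimal sketch assuming a standard soft-target scheme adapted to the multi-label case of ChestX-ray14: per-label binary cross-entropy against the ground-truth labels, blended with a soft binary cross-entropy toward the temperature-softened teacher probabilities. The temperature `T`, weighting `alpha`, and function name `distillation_loss` are illustrative assumptions, not the paper's reported hyperparameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.7):
    """Hypothetical multi-label KD loss (not the paper's exact formulation).

    Blends a hard-label BCE term with a soft BCE toward the teacher's
    temperature-softened per-label probabilities. The soft term is scaled
    by T**2, as is conventional, so its gradient magnitude stays
    comparable across temperatures.
    """
    eps = 1e-7
    # Hard-label term: standard per-label binary cross-entropy.
    p = np.clip(sigmoid(student_logits), eps, 1 - eps)
    hard = -np.mean(labels * np.log(p) + (1 - labels) * np.log(1 - p))
    # Soft-label term: BCE toward the softened teacher probabilities.
    ps = np.clip(sigmoid(student_logits / T), eps, 1 - eps)
    pt = np.clip(sigmoid(teacher_logits / T), eps, 1 - eps)
    soft = -np.mean(pt * np.log(ps) + (1 - pt) * np.log(1 - ps))
    return alpha * (T ** 2) * soft + (1 - alpha) * hard
```

In this framing, the teacher would be the pretrained DenseNet121 and the student the ResNet18; the soft term transfers the teacher's inter-label confidence structure rather than only its thresholded decisions.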
Similar Works
Epidemiological and clinical characteristics of 99 cases of 2019 novel coronavirus pneumonia in Wuhan, China: a descriptive study
2020 · 22,618 cit.
La certeza de lo impredecible: Cultura Educación y Sociedad en tiempos de COVID19
2020 · 19,271 cit.
A Multi-Modal Distributed Real-Time IoT System for Urban Traffic Control (Invited Paper)
2024 · 14,266 cit.
UNet++: A Nested U-Net Architecture for Medical Image Segmentation
2018 · 8,577 cit.
Review of deep learning: concepts, CNN architectures, challenges, applications, future directions
2021 · 7,194 cit.