OpenAlex · Updated hourly · Last updated: 01.04.2026, 20:04

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Efficient Chest X-Ray Classification via Knowledge Distillation: Realizing Edge-Ready Deep Models

2025 · 0 citations
Open full text at the publisher

Citations: 0 · Authors: 3 · Year: 2025

Abstract

Deploying deep learning for chest X-ray interpretation in clinical settings faces persistent challenges, including heavy computational requirements, high latency, and limited hardware resources that restrict the use of state-of-the-art models. To address these constraints, knowledge distillation is leveraged to compress a high-capacity DenseNet121 into a streamlined ResNet18 while maintaining clinical performance. The approach is benchmarked using the NIH ChestX-ray14 dataset, targeting multi-label disease classification. Results show that the distilled student achieves a mean ROC-AUC of 0.81, exceeding the teacher’s 0.79, while reducing multiply-add operations by 36%, slashing inference latency by a factor of 4.6 (0.20 ms vs. 0.92 ms per image), and cutting peak GPU memory usage by 60% (85 MB vs. 209 MB). These outcomes confirm that efficient model compression through knowledge distillation enables rapid, reliable, and resource-friendly deployment of diagnostic AI in real-world medical environments.
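The abstract describes distilling a DenseNet121 teacher into a ResNet18 student for multi-label classification on ChestX-ray14. The paper's exact training objective is not given here; a minimal NumPy sketch of a common multi-label distillation loss (temperature-scaled per-label soft targets from the teacher, blended with binary cross-entropy against the ground-truth labels; the function name, temperature `T`, and weight `alpha` are illustrative assumptions, not values from the paper) could look like:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def kd_multilabel_loss(student_logits, teacher_logits, targets, T=2.0, alpha=0.5):
    """Hypothetical multi-label distillation loss (not the paper's exact objective).

    Blends per-label BCE against temperature-softened teacher probabilities
    (the "soft" distillation term) with standard BCE against the multi-hot
    ground-truth labels (the "hard" term).
    """
    eps = 1e-7
    # Soft targets: teacher and student sigmoid probabilities at temperature T.
    p_t = sigmoid(np.asarray(teacher_logits, dtype=float) / T)
    p_s = np.clip(sigmoid(np.asarray(student_logits, dtype=float) / T), eps, 1 - eps)
    # Per-label binary cross-entropy toward the teacher's soft targets,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = -(p_t * np.log(p_s) + (1 - p_t) * np.log(1 - p_s)).mean() * T**2
    # Hard loss: BCE against the ground-truth multi-hot label vector.
    q = np.clip(sigmoid(np.asarray(student_logits, dtype=float)), eps, 1 - eps)
    t = np.asarray(targets, dtype=float)
    hard = -(t * np.log(q) + (1 - t) * np.log(1 - q)).mean()
    return alpha * soft + (1 - alpha) * hard
```

For example, with 14 output labels per image (as in ChestX-ray14), `student_logits` and `teacher_logits` are `(batch, 14)` arrays and `targets` is the corresponding multi-hot disease matrix; a student whose logits track the teacher's and the labels yields a small loss, while disagreement inflates both terms.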


Topics

COVID-19 diagnosis using AI · AI in cancer detection · Artificial Intelligence in Healthcare and Education