This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Federated learning with differential privacy for breast cancer diagnosis enabling secure data sharing and model integrity
Citations: 41
Authors: 6
Year: 2025
Abstract
In the digital age, privacy preservation is of paramount importance while processing health-related sensitive information. This paper explores the integration of Federated Learning (FL) and Differential Privacy (DP) for breast cancer detection, leveraging FL's decentralized architecture to enable collaborative model training across healthcare organizations without exposing raw patient data. To enhance privacy, DP injects statistical noise into the updates made by the model. This mitigates adversarial attacks and prevents data leakage. The proposed work uses the Breast Cancer Wisconsin Diagnostic dataset to address critical challenges such as data heterogeneity, privacy-accuracy trade-offs, and computational overhead. From the experimental results, FL combined with DP achieves 96.1% accuracy with a privacy budget of ε = 1.9, ensuring strong privacy preservation with minimal performance trade-offs. In comparison, the traditional non-FL model achieved 96.0% accuracy, but at the cost of requiring centralized data storage, which poses significant privacy risks. These findings validate the feasibility of privacy-preserving artificial intelligence models in real-world clinical applications, effectively balancing data protection with reliable medical predictions.
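The abstract describes adding statistical noise to model updates so that the federated aggregate reveals little about any single client's data. As a rough illustration only (the paper's actual training pipeline and DP accounting are not given here), the following sketch shows one round of federated averaging with a Gaussian mechanism: each client's update is clipped to bound its sensitivity, then noise is added to the average. The function names, clipping norm, and noise multiplier are hypothetical choices, not values from the paper.

```python
import numpy as np

def clip_update(update, clip_norm):
    # Clip a client's model update to L2 norm <= clip_norm,
    # bounding each client's influence (its sensitivity).
    norm = np.linalg.norm(update)
    if norm == 0.0:
        return update
    return update * min(1.0, clip_norm / norm)

def dp_federated_average(client_updates, clip_norm=1.0,
                         noise_multiplier=1.1, rng=None):
    # Average the clipped updates and add Gaussian noise
    # (Gaussian mechanism) scaled to the per-client sensitivity.
    rng = rng or np.random.default_rng(0)
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    mean = np.mean(clipped, axis=0)
    sigma = noise_multiplier * clip_norm / len(client_updates)
    return mean + rng.normal(0.0, sigma, size=mean.shape)

# Simulated round: 6 clients, each with a 5-parameter update.
updates = [np.random.default_rng(i).normal(size=5) for i in range(6)]
global_update = dp_federated_average(updates)
```

The privacy budget ε reported in the abstract would be computed separately from the noise multiplier and number of rounds via a DP accountant; that bookkeeping is omitted here.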
Related works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,402 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,894 citations
Deep Learning with Differential Privacy
2016 · 5,627 citations
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,595 citations
Federated Machine Learning
2019 · 5,579 citations