OpenAlex · Updated hourly · Last updated: 29.03.2026, 12:17

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Inclusive Federated Learning for Healthcare Collaboration: Development and Evaluation of a Compliance-Aware Differential Privacy Framework (Preprint)

2025 · 0 Citations · Open Access
Open full text at the publisher

Citations: 0 · Authors: 9 · Year: 2025

Abstract

<sec> <title>BACKGROUND</title> Federated learning (FL) enables collaborative training of clinical AI models without centralizing sensitive patient data. Despite its promise, adoption in healthcare is limited by privacy concerns, heterogeneous institutional compliance, and resource disparities. Standard differential privacy (DP) methods apply noise uniformly across clients, which can disproportionately penalize highly compliant or under-resourced institutions and reduce model performance. </sec> <sec> <title>OBJECTIVE</title> This study introduces a compliance-aware FL framework that adapts DP mechanisms to institutional compliance. Our goals were to (1) design a compliance scoring tool aligned with healthcare and security standards, (2) integrate adaptive DP noise into FL, and (3) evaluate its effectiveness in balancing privacy, compliance, and accuracy. </sec> <sec> <title>METHODS</title> We developed a compliance scoring system to quantify institutional adherence to key standards and used these scores to guide adaptive DP noise allocation. Higher-compliance institutions received proportionally less noise, while global privacy guarantees were maintained. Experiments on public clinical datasets compared the proposed framework against traditional FL with uniform DP, assessing accuracy, fairness, and robustness across varying compliance distributions. </sec> <sec> <title>RESULTS</title> The compliance-aware FL framework outperformed standard approaches, yielding up to 15% higher accuracy when combining data from both highly regulated and under-resourced institutions. Adaptive noise allocation reduced performance degradation at compliant sites and promoted equitable participation. The compliance scoring tool provided a transparent measure of readiness, supporting secure and inclusive collaboration. 
</sec> <sec> <title>CONCLUSIONS</title> Incorporating compliance into FL offers a practical pathway for advancing privacy-preserving AI in healthcare. Adaptive DP improves accuracy and fairness while maintaining privacy guarantees, making FL more viable for diverse clinical settings. This framework bridges theoretical privacy methods with real-world healthcare needs. </sec>
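The adaptive allocation described in the METHODS section can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names, the compliance-score-to-noise mapping, and the `base_sigma`/`floor` parameters are all assumptions chosen for clarity. The floor on the noise multiplier stands in for the paper's requirement that global privacy guarantees are maintained even for the most compliant client.

```python
import numpy as np

def allocate_noise(scores, base_sigma=1.0, floor=0.6):
    """Map per-client compliance scores in [0, 1] to Gaussian noise
    multipliers: a fully compliant client (score 1.0) gets the floor,
    a non-compliant client (score 0.0) gets base_sigma."""
    s = np.clip(np.asarray(scores, dtype=float), 0.0, 1.0)
    return floor + (base_sigma - floor) * (1.0 - s)

def dp_update(grad, sigma, clip_norm=1.0, rng=None):
    """Clip a client gradient to clip_norm and add Gaussian noise
    scaled by that client's multiplier (standard DP-SGD-style step)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    return clipped + rng.normal(0.0, sigma * clip_norm, size=grad.shape)

# Three hypothetical institutions with descending compliance scores:
sigmas = allocate_noise([0.9, 0.5, 0.2])
```

Here the highest-compliance client receives the smallest multiplier, so its updates are perturbed least; how such per-client multipliers compose into a single global privacy guarantee is the substantive contribution the abstract claims and is not captured by this sketch.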

Topics

Privacy-Preserving Technologies in Data · Machine Learning in Healthcare · Artificial Intelligence in Healthcare and Education