This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
PRICURE: Privacy-Preserving Collaborative Inference in a Multi-Party Setting
0 Citations · 2 Authors · 2021
Abstract
When multiple parties that deal with private data aim for a collaborative prediction task such as medical image classification, they are often constrained by data protection regulations and lack of trust among collaborating parties. If done in a privacy-preserving manner, predictive analytics can benefit from the collective prediction capability of multiple parties holding complementary datasets on the same machine learning task. This paper presents PRICURE, a system that combines complementary strengths of secure multi-party computation (SMPC) and differential privacy (DP) to enable privacy-preserving collaborative prediction among multiple model owners. SMPC enables secret-sharing of private models and client inputs with non-colluding secure servers to compute predictions without leaking model parameters and inputs. DP masks true prediction results via noisy aggregation so as to deter a semi-honest client who may mount membership inference attacks. We evaluate PRICURE on neural networks across four datasets including benchmark medical image classification datasets. Our results suggest PRICURE guarantees privacy for tens of model owners and clients with acceptable accuracy loss. We also show that DP reduces membership inference attack exposure without hurting accuracy.
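The abstract names two building blocks: additive secret sharing across non-colluding servers (SMPC) and noisy aggregation of predictions (DP). The paper's actual protocol is not reproduced here; the following is a minimal illustrative sketch of those two primitives in NumPy, with all function names and parameters chosen for this example, not taken from PRICURE:

```python
import numpy as np

rng = np.random.default_rng(0)
MODULUS = 2**32  # ring size for additive sharing (illustrative choice)

def secret_share(x, n_servers=2):
    """Split an integer vector into additive shares, one per server.

    Any single share is uniformly random and reveals nothing about x;
    only the sum of all shares (mod MODULUS) reconstructs it.
    """
    shares = [rng.integers(0, MODULUS, size=x.shape) for _ in range(n_servers - 1)]
    last = (x - sum(shares)) % MODULUS
    return shares + [last]

def reconstruct(shares):
    """Recombine additive shares into the original vector."""
    return sum(shares) % MODULUS

def noisy_argmax(votes, epsilon=1.0):
    """Release the winning class after adding Laplace noise to the
    per-class vote counts, a standard DP mechanism that blunts
    membership inference on the released prediction."""
    noisy = votes + rng.laplace(0.0, 1.0 / epsilon, size=votes.shape)
    return int(np.argmax(noisy))

# A client's prediction vector, integer-encoded for the sharing ring.
pred = np.array([3, 7, 1], dtype=np.int64)
shares = secret_share(pred)
assert np.array_equal(reconstruct(shares), pred)

# Aggregated votes from several model owners over 3 classes.
votes = np.array([1.0, 3.0, 1.0])
label = noisy_argmax(votes, epsilon=5.0)
```

The noise scale trades privacy for accuracy: a smaller `epsilon` adds more noise, so the released label more often differs from the true plurality vote.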
Similar Works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,402 cit.
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,898 cit.
Deep Learning with Differential Privacy
2016 · 5,629 cit.
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,595 cit.
Federated Machine Learning
2019 · 5,588 cit.