This is an overview page with metadata for this scientific article. The full article is available from the publisher.
A unified method to revoke the private data of patients in intelligent healthcare with audit to forget
13 citations · 8 authors · 2023
Abstract
Revoking personal private data is one of the basic human rights. However, this right is often overlooked or infringed upon due to the increasing collection and use of patient data for model training. To secure patients' right to be forgotten, we proposed a solution that uses auditing to guide the forgetting process, where auditing means determining whether a dataset has been used to train the model, and forgetting requires the information of a query dataset to be forgotten from the target model. We unified these two tasks by introducing an approach called knowledge purification. To implement our solution, we developed audit-to-forget software (AFS), which is able to evaluate and revoke patients' private data from pre-trained deep learning models. Here, we show the usability of AFS and its application potential in real-world intelligent healthcare to enhance privacy protection and data-revocation rights.
Related works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,402 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,892 citations
Deep Learning with Differential Privacy
2016 · 5,620 citations
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,594 citations
Federated Machine Learning
2019 · 5,574 citations