This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Unsupervised pre-training of graph transformers on patient population graphs
Citations: 0
Authors: 3
Year: 2022
Abstract
Pre-training has shown success in different areas of machine learning, such as Computer Vision, Natural Language Processing (NLP), and medical imaging. However, it has not been fully explored for clinical data analysis. Although an immense amount of clinical data is recorded, data and labels can still be scarce when the data is collected in small hospitals or concerns rare diseases. In such scenarios, pre-training on a larger set of unlabelled clinical data could improve performance. In this paper, we propose novel unsupervised pre-training techniques for heterogeneous, multi-modal clinical data and patient outcome prediction, inspired by masked language modeling (MLM) and leveraging graph deep learning over population graphs. To this end, we further propose a graph-transformer-based network designed to handle heterogeneous clinical data. By combining masking-based pre-training with a transformer-based network, we translate the success of masking-based pre-training in other domains to heterogeneous clinical data. We show the benefit of our pre-training method in a self-supervised and a transfer-learning setting, using three medical datasets: TADPOLE, MIMIC-III, and a Sepsis Prediction Dataset. We find that our proposed pre-training methods help to model the data at both the patient and population level and improve performance on different fine-tuning tasks across all datasets.
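To make the masking-based pre-training idea concrete, the following is a minimal PyTorch sketch of an MLM-style objective over a patient population: random feature entries are hidden and a transformer encoder, attending across all patients, is trained to reconstruct them. All class names, hyperparameters, and the zero-masking scheme are illustrative assumptions, not the authors' implementation; in particular, plain self-attention over all patients stands in here for the paper's actual graph transformer over a population graph.

```python
import torch
import torch.nn as nn

class MaskedPopulationPretrainer(nn.Module):
    """Hypothetical sketch: mask random patient features and reconstruct
    them with a transformer encoder applied across the whole population.
    Full self-attention plays the role of a fully connected population
    graph; the paper's actual graph transformer is not reproduced here."""

    def __init__(self, num_features: int, d_model: int = 64, nhead: int = 4):
        super().__init__()
        self.embed = nn.Linear(num_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.reconstruct = nn.Linear(d_model, num_features)

    def forward(self, x: torch.Tensor, mask_ratio: float = 0.15):
        # x: (num_patients, num_features) -- one row per patient (node)
        mask = torch.rand_like(x) < mask_ratio   # entries to hide
        x_masked = x.masked_fill(mask, 0.0)      # simple zero masking (an assumption)
        # attend across all patients: shape (1, num_patients, d_model)
        h = self.encoder(self.embed(x_masked).unsqueeze(0)).squeeze(0)
        x_hat = self.reconstruct(h)
        # MLM-style loss: reconstruction error only on the masked entries
        return ((x_hat - x) ** 2)[mask].mean()

# usage: pre-train on unlabelled patient records (synthetic stand-in data)
model = MaskedPopulationPretrainer(num_features=32)
patients = torch.randn(128, 32)
loss = model(patients)
loss.backward()
```

After pre-training on unlabelled records in this fashion, the encoder weights could be reused and fine-tuned for a labelled outcome-prediction task, which mirrors the self-supervised and transfer-learning settings evaluated in the paper.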
Related Works
"Why Should I Trust You?"
2016 · 14.294 Zit.
A Comprehensive Survey on Graph Neural Networks
2020 · 8.666 Zit.
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8.189 Zit.
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7.588 Zit.
Artificial intelligence in healthcare: past, present and future
2017 · 4.405 Zit.