OpenAlex · Updated hourly · Last update: 28.03.2026, 01:21

This is an overview page with metadata for this scientific work. The full article is available from the publisher.

Unsupervised pre-training of graph transformers on patient population graphs

2022 · 0 citations · arXiv (Cornell University) · Open Access
Open full text at the publisher

Citations: 0

Authors: 3

Year: 2022

Abstract

Pre-training has shown success in different areas of machine learning, such as Computer Vision, Natural Language Processing (NLP), and medical imaging. However, it has not been fully explored for clinical data analysis. An immense amount of clinical records are recorded, but still, data and labels can be scarce for data collected in small hospitals or dealing with rare diseases. In such scenarios, pre-training on a larger set of unlabelled clinical data could improve performance. In this paper, we propose novel unsupervised pre-training techniques designed for heterogeneous, multi-modal clinical data for patient outcome prediction inspired by masked language modeling (MLM), by leveraging graph deep learning over population graphs. To this end, we further propose a graph-transformer-based network, designed to handle heterogeneous clinical data. By combining masking-based pre-training with a transformer-based network, we translate the success of masking-based pre-training in other domains to heterogeneous clinical data. We show the benefit of our pre-training method in a self-supervised and a transfer learning setting, utilizing three medical datasets TADPOLE, MIMIC-III, and a Sepsis Prediction Dataset. We find that our proposed pre-training methods help in modeling the data at a patient and population level and improve performance in different fine-tuning tasks on all datasets.
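The MLM-inspired masking objective mentioned in the abstract can be illustrated for continuous patient features as follows. This is a minimal sketch, not the paper's method: the actual model is a graph transformer over a population graph, and the function names, the mean-predictor stand-in, and the mask rate here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_features(X, mask_rate=0.15, mask_value=0.0, rng=rng):
    """Randomly corrupt a fraction of feature entries, MLM-style.

    Returns the corrupted matrix and a boolean mask marking the
    positions the model is asked to reconstruct."""
    mask = rng.random(X.shape) < mask_rate
    X_corrupt = X.copy()
    X_corrupt[mask] = mask_value
    return X_corrupt, mask

def masked_reconstruction_loss(X_true, X_pred, mask):
    """Mean squared error evaluated only on the masked positions."""
    if not mask.any():
        return 0.0
    diff = (X_true - X_pred)[mask]
    return float(np.mean(diff ** 2))

# Toy patient-by-feature matrix (rows = patients, columns = clinical features).
X = rng.normal(size=(8, 5))
X_corrupt, mask = mask_features(X)

# Stand-in "model": predict each feature by its column mean over the
# corrupted matrix; a real setup would train a network on this loss.
X_pred = np.tile(X_corrupt.mean(axis=0), (X.shape[0], 1))
loss = masked_reconstruction_loss(X, X_pred, mask)
```

Because the loss is computed only at masked positions, the model cannot minimize it by copying visible inputs and must instead infer missing values from context, which is the core idea the paper transfers to clinical population graphs.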


Topics

Machine Learning in Healthcare · Topic Modeling · Artificial Intelligence in Healthcare and Education