This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Foundation Deep Learning Models For Precision Medicine Using Multimodal Big Data
Citations: 0
Authors: 1
Year: 2026
Abstract
Deep learning models pretrained on multimodal Big Data can address many problems in Precision Medicine. Choi et al. (2022) provided background on the core model architectures and training paradigms, on the prevalent types of multimodal medical data, and on preliminary work aimed at combining model frameworks and training paradigms across existing modalities. Multimodal Foundation Models are introduced on the basis of learned medical representations and of multimodal representation transfer and design. The explored areas center on data ecosystems that can support the future development of Deep Learning within Precision Medicine using multimodal Big Data. The section on Data Ecosystems and Governance focuses on data acquisition, curation, and quality assurance, as well as privacy, security, and ethical considerations. Applications include the integration of genomics and transcriptomics; the use of medical image and radiomic data; and the construction and testing of benchmark datasets to alleviate data bias. The methodological challenges addressed include data bias, fairness, and generalizability; interpretability and clinician trust; benchmarking protocols; reproducibility; and open science practices.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,324 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,189 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,588 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,470 citations