This is an overview page with metadata for this scholarly work. The full article is available from the publisher.
XGBoost-based model for predicting PICC occlusion risk in cancer patients: Insights from SHAP analysis
3
Citations
8
Authors
2025
Year
Abstract
Peripherally inserted central catheters (PICCs) are commonly used in cancer patients, but occlusion is a frequent complication. Early prediction of occlusion risk can guide timely interventions and improve patient outcomes. This study develops and validates a machine-learning model to predict PICC occlusion risk in cancer patients using clinical data from electronic medical records. In this retrospective, single-center study, data from cancer patients with PICC lines were analyzed. Three machine-learning algorithms—logistic regression, random forest, and XGBoost—were used to predict occlusion risk, and model performance was evaluated by the area under the receiver operating characteristic curve (AUC). Key risk factors, including patient demographics, clinical conditions, and catheter maintenance practices, were incorporated. XGBoost outperformed the other models, achieving AUC values of 0.909 in the training cohort and 0.759 in the validation cohort. Key predictors of PICC occlusion included catheter duration, electrolyte disturbances, chemotherapy drug type, and insertion length. SHAP analysis provided transparent model interpretation. The XGBoost model effectively predicts PICC occlusion risk and identifies key predictors. While limited by its retrospective design, the study suggests potential for clinical integration to improve patient outcomes. Further prospective studies are needed.

Highlights

- Developed a machine-learning model to predict PICC occlusion risk in cancer patients.
- The XGBoost model demonstrated high predictive accuracy (AUC 0.909 in training, 0.759 in validation).
- Identified key predictors of PICC occlusion: catheter duration, chemotherapy regimen, electrolyte disturbances.
- SHAP analysis provided transparent insights into feature importance, aiding clinical decision-making.
- Statistical validation confirmed the significance of key risk factors, supporting personalized patient management.
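The abstract describes the study's evaluation protocol: train gradient-boosted trees on a training cohort and report AUC on both training and validation cohorts. The paper's actual data and features are not reproducible from an abstract, so the sketch below uses a synthetic dataset and scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost, purely to illustrate the train/validation AUC workflow.

```python
# Hedged sketch of the train/validation AUC evaluation described above.
# Synthetic data and GradientBoostingClassifier stand in for the study's
# clinical dataset and XGBoost model; all names here are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for clinical features (demographics, catheter
# duration, electrolyte status, etc.) and an occlusion outcome label.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# AUC is computed from predicted probabilities, not hard class labels.
auc_tr = roc_auc_score(y_tr, model.predict_proba(X_tr)[:, 1])
auc_va = roc_auc_score(y_va, model.predict_proba(X_va)[:, 1])
print(f"training AUC: {auc_tr:.3f}, validation AUC: {auc_va:.3f}")
```

A gap between training and validation AUC (as in the paper's 0.909 vs. 0.759) is expected and indicates some overfitting; SHAP values would then be computed on the fitted model to rank feature contributions.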
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,336 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,207 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,607 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,476 citations