OpenAlex · Updated hourly · Last updated: 05.05.2026, 06:24

This is an overview page with metadata for this scientific work. The full article is available from the publisher.

A Log-Level Data-Driven Precision Education Tool for Pediatrics Trainees: Human-Centered Development and Validation Study

2026 · 0 Citations · JMIR Human Factors · Open Access
Open full text at the publisher

0 Citations · 4 Authors · Year: 2026

Abstract

Background: Exposure to patients and clinical diagnoses drives learning in graduate medical education (GME). Measuring practice data, that is, how each trainee experiences that exposure, is critical to planned learning processes, including the assessment of trainee needs. We previously developed and validated an automated system to accurately identify resident provider-patient interactions.

Objective: In this follow-up study, we use human-centered design methods to meet two objectives: (1) to understand trainees' planned learning needs and (2) to design, build, and validate the usability and use of a tool based on our automated resident provider-patient interaction system to meet these needs.

Methods: We collected data from 2 institutions new to the American Medical Association's "Advancing Change" initiative, using a mixed methods approach with purposive sampling. First, interviews and formative prototype testing yielded qualitative data that we analyzed in several coding cycles. We built interview guides to collect the data required for a work domain assessment, learning use case elicitation, and, ultimately, design requirement identification. We structured coding efforts within 2 existing theoretical models. Feature prioritization matrix analysis then transformed qualitative analysis outputs into actionable prototype elements, which were refined through formative usability methods. Finally, qualitative data from a summative usability test validated the final prototype with measures of usefulness, usability, and intent to use. We also used quantitative methods (eg, time on task and task completion rate in summative testing).

Results: We collected data from June 2021 through September 2023. Eight resident physicians made up the validation sample: 4 (50%) residents from the Children's Hospital of Philadelphia and 4 (50%) from the University of Rochester Medical Center. We represented the GME work domain assessment through process-map design artifacts that identify target opportunities for intervention. Of the decision-making opportunities identified, trainee-mentor meetings stood out as optimal for delivering reliable practice-area information, and we designed a "midpoint" report for that use case. Formative testing and design iteration produced a final prototype featuring 5 essential visualizations, and summative usability testing showed high performance on both subjective and objective metrics. Insufficient baseline data were captured to support a formal comparative evaluation against existing tools or workarounds for planned learning; however, the prevailing reported absence of such tools, and the ad hoc nature of the approaches that do exist, strongly suggest an unmet need for the type of usable summary method delivered in our tool.

Conclusions: We describe the multisite development, using human-centered design methods, of a tool that visualizes log-level electronic health record data. Delivered at an identified point in GME, the tool is well suited to fostering the development of master adaptive learners. The resulting prototype was validated with high performance on a summative usability test. Additionally, the design, development, and assessment process may be applied to other tools and topics within clinical informatics.


Topics

Innovations in Medical Education · Simulation-Based Education in Healthcare · Artificial Intelligence in Healthcare and Education