How Does Learnability of Primary Care Resident Physicians Increase After Seven Months of Using an Electronic Health Record? A Longitudinal Study

JMIR Hum Factors. 2016 Feb 15;3(1):e9. doi: 10.2196/humanfactors.4601.

Abstract

Background: Electronic health records (EHRs) with poor usability present steep learning curves for new resident physicians, who are already overwhelmed by the demands of learning a new specialty. This can lead to error-prone EHR use in their medical practice.

Objective: The goal of this study was to determine learnability gaps between novice and expert groups of primary care resident physicians by comparing performance measures during EHR use.

Methods: We compared performance measures across two rounds of learnability tests (Round 1: November 12 to December 19, 2013; Round 2: February 12 to April 22, 2014). Ten novice and 6 expert physicians participated in Round 1, and 8 novice and 4 expert physicians participated in Round 2. Laboratory-based learnability tests with video analysis were conducted to examine learnability gaps between the novice and expert groups. Using a think-aloud protocol, physicians completed 19 tasks based on an artificial but typical patient visit note. We collected quantitative performance measures (percent task success, time-on-task, and mouse activities), System Usability Scale (SUS) scores, and qualitative narrative feedback during participant debriefing sessions.
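As an illustration of how these measures can be operationalized, the following Python sketch computes percent task success, cursor travel in pixels, and the standard SUS score from a hypothetical session log. The TaskLog structure and its fields are assumptions made for illustration, not the study's actual instrumentation; only the SUS scoring rule (odd items contribute rating - 1, even items 5 - rating, with the 0-40 sum scaled to 0-100) is standard.

```python
# Hypothetical sketch of deriving the per-task performance measures from
# logged usability-test sessions. Log format and field names are assumed
# for illustration; they are not the study's actual instrumentation.
from dataclasses import dataclass
from math import hypot

@dataclass
class TaskLog:
    success: bool                # did the participant complete the task?
    seconds: float               # time-on-task
    clicks: int                  # mouse clicks during the task
    path: list[tuple[int, int]]  # sampled (x, y) cursor positions

def task_success_rate(logs: list[TaskLog]) -> float:
    """Percent of the 19 tasks completed successfully."""
    return 100.0 * sum(t.success for t in logs) / len(logs)

def mouse_distance_px(task: TaskLog) -> float:
    """Total cursor travel in pixels, summed over consecutive samples."""
    return sum(hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(task.path, task.path[1:]))

def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring: 10 items rated 1-5; odd items contribute
    (rating - 1), even items (5 - rating); the 0-40 sum scales to 0-100."""
    assert len(responses) == 10
    contrib = [(r - 1) if i % 2 == 0 else (5 - r)
               for i, r in enumerate(responses)]
    return sum(contrib) * 2.5
```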

Results: There was a 6-percentage-point increase in novice physicians' task success rate (Round 1: 92%, 95% CI 87-99; Round 2: 98%, 95% CI 95-100) and a 7-percentage-point increase in expert physicians' task success rate (Round 1: 90%, 95% CI 83-97; Round 2: 97%, 95% CI 93-100); a 10% decrease in novice physicians' time-on-task (Round 1: 44 s, 95% CI 32-62; Round 2: 40 s, 95% CI 27-59) and a 21% decrease in expert physicians' time-on-task (Round 1: 39 s, 95% CI 29-51; Round 2: 31 s, 95% CI 22-42); a 20% decrease in novice physicians' mouse clicks (Round 1: 8 clicks, 95% CI 6-13; Round 2: 7 clicks, 95% CI 4-12) and a 39% decrease in expert physicians' mouse clicks (Round 1: 8 clicks, 95% CI 5-11; Round 2: 3 clicks, 95% CI 1-10); and a 14% decrease in novice physicians' mouse movements (Round 1: 9247 pixels, 95% CI 6404-13,353; Round 2: 7991 pixels, 95% CI 5350-11,936) and a 14% decrease in expert physicians' mouse movements (Round 1: 7325 pixels, 95% CI 5237-10,247; Round 2: 6329 pixels, 95% CI 4299-9317). The SUS measure of overall usability showed only minimal change in the novice group (Round 1: 69, high marginal; Round 2: 68, high marginal) and no change in the expert group (74, high marginal, in both rounds).
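Note that task success is reported as a percentage-point change, while time-on-task and mouse activity are reported as relative percent changes. The minimal sketch below illustrates that distinction and, because the abstract does not state how the 95% CIs were derived, adds a percentile-bootstrap CI for a mean purely as an assumed, illustrative method; the sample values are invented, not study data.

```python
# A minimal sketch distinguishing percentage-point change from relative
# percent change, plus an assumed (illustrative) percentile-bootstrap CI.
import random

def pct_point_change(p1: float, p2: float) -> float:
    """Change between two percentages, e.g. 92% -> 98% is +6 points."""
    return p2 - p1

def pct_change(v1: float, v2: float) -> float:
    """Relative change, e.g. 39 s -> 31 s is about a 21% decrease."""
    return 100.0 * (v2 - v1) / v1

def bootstrap_ci_mean(xs: list[float], n_boot: int = 10_000,
                      alpha: float = 0.05) -> tuple[float, float]:
    """Percentile-bootstrap 95% CI for the mean of xs."""
    means = sorted(
        sum(random.choices(xs, k=len(xs))) / len(xs) for _ in range(n_boot)
    )
    return means[int(n_boot * alpha / 2)], means[int(n_boot * (1 - alpha / 2)) - 1]

print(pct_point_change(92, 98))   # 6 (percentage points)
print(round(pct_change(39, 31)))  # -21 (percent)

# Example: illustrative time-on-task samples (seconds), not study data
times = [41.2, 44.8, 39.5, 50.1, 36.7, 48.3, 42.0, 47.6]
print(bootstrap_ci_mean(times))
```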

Conclusions: This study found differences between novice and expert physicians' performance, demonstrating that physicians' proficiency increased with EHR experience. Our findings may serve as a guideline for improving current EHR training programs. Future directions include a more granular task analysis of physicians' EHR use to identify subtle usability issues that would otherwise be overlooked.

Keywords: primary care; physicians; usability; electronic health records; computerized physician order entry; user-computer interface