Introduction: This study examines how improving curricular content and resolving technical interface issues could make maintenance of certification activities more meaningful to American Board of Family Medicine diplomates completing Maintenance of Certification for Family Physicians (MC-FP) Program self-assessment modules (SAMs).
Methods: We used a sequential exploratory design to analyze quantitative and qualitative data from 320,500 surveys of family physicians who completed a SAM between January 2004 and April 2013. The data included numeric rating scales and free-text comments. Basic statistical rankings, template-based automated coding, and emergent coding were used to analyze participants' SAM experience and to identify thematic content.
Results: Across SAMs, numeric ratings were universally high, and positive free-text comments outnumbered negative comments two to one. When feedback on the knowledge assessment and clinical simulation (CS) activities was compared, SAMs were rated less favorably when the ideas participants identified as most prevalent in one activity did not match those identified as most prevalent in the companion activity. Participants were also critical of navigation and technical issues and of a lack of realism in the CS activity.
Discussion: Whether analyzed through quantitative data, qualitative data, or mixed methods, a large majority of participants rated their experience with SAMs highly. When individual SAMs were rated poorly, the cause seemed to be either discordance between the ideas emphasized in the knowledge assessment and those emphasized in the CS component or opinions about the SAM topic that existed independently of the SAM process.