eAssessment: development of an electronic version of the Objective Structured Assessment of Debriefing tool to streamline evaluation of video recorded debriefings

J Am Med Inform Assoc. 2018 Oct 1;25(10):1284-1291. doi: 10.1093/jamia/ocy113.

Abstract

Objective: The Objective Structured Assessment of Debriefing (OSAD) is an evidence-based, 8-item tool that uses a behaviorally anchored rating scale in paper-based form to evaluate the quality of debriefing in medical education. The objective of this project was twofold: 1) to create an easy-to-use electronic format of the OSAD (eOSAD) to streamline data entry; and 2) to pilot its use on video-recorded debriefings.

Materials and methods: The eOSAD was developed in collaboration with the LSU Health New Orleans Epidemiology Data Center using SurveyGizmo (Widgix Software, LLC, Boulder, CO, USA) software. The eOSAD was then piloted by 2 trained evaluators who rated 37 videos of faculty teams conducting pre-briefing and debriefing after a high-fidelity trauma simulation. Inter-rater reliability was assessed, and evaluators' qualitative feedback was obtained.

Results: Inter-rater reliability was good [prebrief, intraclass correlation coefficient, ICC = 0.955 (95% CI, 0.912-0.977), P < .001; debrief, ICC = 0.853 (95% CI, 0.713-0.924), P < .001]. Qualitative feedback indicated that the eOSAD was easy to complete, simple to read, and convenient for adding comments, and that it reliably stored data in a readily retrievable form, enabling the smooth dissemination of the information collected.
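The abstract does not state which ICC model was used, but for two fixed evaluators scoring the same set of videos an absolute-agreement, single-rater ICC(2,1) is a common choice. As an illustration only (the dataset and model here are assumptions, not the authors' analysis), ICC(2,1) can be computed from a subjects-by-raters score matrix using standard two-way ANOVA sums of squares:

```python
def icc_2_1(ratings):
    """Two-way random-effects, absolute-agreement, single-rater ICC(2,1).

    `ratings` is a list of rows, one per subject (e.g. one per video),
    each row holding one score per rater. This is an illustrative
    implementation, not the analysis used in the study.
    """
    n = len(ratings)        # number of subjects (videos)
    k = len(ratings[0])     # number of raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]

    # Two-way ANOVA decomposition of total variability
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)               # between-subjects mean square
    msc = ss_cols / (k - 1)               # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))    # residual mean square

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)


# Hypothetical scores from two raters on four videos
example = [[5, 6], [7, 7], [3, 4], [8, 8]]
print(round(icc_2_1(example), 3))
```

Values near 1 indicate near-perfect agreement; the study's prebrief ICC of 0.955 and debrief ICC of 0.853 would conventionally be interpreted as excellent and good reliability, respectively.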

Discussion: The eOSAD features a secure login, a shareable internet link for remote evaluators, and immediate export of data into a secure database for future analysis. It provided convenience for end-users, produced reliable assessments among independent evaluators, and eliminated multiple sources of possible data corruption.

Conclusion: The eOSAD format advances the post-debriefing evaluation of video-recorded inter-professional team training in high-fidelity simulation.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Clinical Competence
  • Education, Medical*
  • Educational Measurement / methods
  • Feedback*
  • High Fidelity Simulation Training*
  • Humans
  • User-Computer Interface
  • Video Recording*