Background: Oral practice examinations (OPEs) are used in many anesthesiology programs to familiarize residents with the format of the oral qualifying examination given by the American Board of Anesthesiology (ABA). The purpose of this communication is to describe the planning, structure, start-up, administration, growth, and evaluation of a comprehensive oral practice examination program at a large anesthesiology residency program in the Midwest.
Methods and results: A committee of three experienced faculty members was formed to plan the effort. Planning addressed the format and frequency of administration, timing for optimal resident and faculty availability, communication, forms design, clerical support, record keeping, and quality monitoring. To accommodate resident rotation and faculty work schedules, a semiannual administration schedule on 3-4 consecutive Mondays was chosen. The mock oral format was deliberately constructed to resemble that used by the ABA in order to enhance resident familiarity and comfort with ABA-style oral examinations. Continuous quality improvement tools included regular examiner and examinee in-service sessions, feedback from ABA associate examiners to faculty examiners, and review of examinee exit questionnaires. A set of OPE databases was constructed to facilitate quality monitoring and educational research. Ongoing administration of the OPE program required continued development of a pool of guided, case-oriented questions; selection of appropriate questions based on examinee training exposure; advance publication of the examination calendar; and scheduling of recurring examiner and examinee activities. Significant issues that required action by the governing committee were examination timing, avoidance of conflict with clinical demands, use of OPE results, and procurement of training resources. Despite initial skepticism, the OPE program was begun successfully and grew substantially, from 56 examinations in the first year to 120 examinations by year three. The OPE was perceived positively by the majority of residents: 90.2% of exit questionnaires acknowledged specific learning about oral examination technique, whereas only 0.3% indicated a lack of meaningful information exchange at OPE sessions. Fewer than 10% of responses indicated misleading questions or badgering by examiners. Although anxiety remained constant over time, resident preparedness increased with repeated OPE exposure.
Summary: A comprehensive mock oral examination of substantial scope was successfully planned, initiated, and developed at our anesthesiology residency training program. It is well accepted by residents and faculty, and its inception was associated with an increase in resident preparedness. Now in its tenth year, it remains an asset and an essential component of our training program.
Keywords: Education; anesthesiologists; board certification; oral examinations; residents.