BACKGROUND: With the integration of Objective Structured Clinical Examinations (OSCEs) into the Anesthesiology primary board certification process, residency programs may choose to implement OSCEs for resident skill assessment. The aim of this study was to evaluate OSCE-based milestone assessment and compare it with Clinical Competency Committee (CCC) milestone assessment, which is based purely on clinical evaluations.

METHODS: An annual OSCE event was used to obtain milestone assessments of clinical anesthesia year 0 (CA-0) to clinical anesthesia year 3 (CA-3) residents for selected milestones in patient care, professionalism, and interpersonal/communication skills. The OSCE scenarios differed for each training level. The CCC evaluated each resident semiannually based on clinical evaluations of resident performance. The CCC milestone assessments from 2014 to 2016 recorded closest to the OSCE event (±3 months) were compared with the OSCE milestone assessments. A total of 35 residents in 3 training cohorts were included in this analysis: A (graduates 2016, n = 12); B (graduates 2017, n = 10); and C (graduates 2018, n = 13). All residents had participated in OSCEs since their CA-0 year, and CCC milestone data had been reported since December 2014.

RESULTS: Both assessment techniques indicated competency growth proportional to length of training. Despite the limited cumulative statistics in this study, average trends in the OSCE-CCC relationship indicated: (1) good proportionality in reflecting competency growth; (2) grade enhancement associated with CCC assessment, dominated by evaluations of junior residents (CA-0 to CA-1); and (3) an expectation bias in CCC assessment, dominated by evaluations of senior residents (CA-2 to CA-3).

CONCLUSIONS: Our analysis confirms the compatibility of the 2 evaluation methods in reflecting longitudinal growth. The deviation of OSCE assessments from CCC assessments suggests that OSCEs may provide additional or different information on resident performance. Educators might consider using both assessment methods to provide the most reliable and valid competency assessments during residency.
Number of pages: 9
Journal: Anesthesia and Analgesia
State: Published - Jul 1 2019
Bibliographical note
Funding Information:
Funding: The project has been supported by the Foundation for Anesthesia Education and Research. A.R. received a research in education grant (REG-02-15-2015-Rebel; No. 3048112895).
© 2018 International Anesthesia Research Society.
ASJC Scopus subject areas
- Anesthesiology and Pain Medicine