TY - JOUR
T1 - The use of the Objective Structured Clinical Examination (OSCE) for evaluation and instruction in graduate medical education
AU - Sloan, David A.
AU - Donnelly, Michael B.
AU - Schwartz, Richard W.
AU - Felts, Janet L.
AU - Blue, Amy V.
AU - Strodel, William E.
PY - 1996/6
Y1 - 1996/6
AB - This study had two purposes: determining the reliability and validity of the Objective Structured Clinical Examination (OSCE) in assessing performance by trainees at all levels, including medical students and chief residents; and estimating the impact of providing OSCE participants with immediate feedback about their performance. A comprehensive 210-min OSCE was administered to 53 surgical residents and 6 junior medical students. Faculty experts proctored all patient stations and provided immediate feedback to participants after the patient interaction segments (Part A). The participants then answered questions about the patients seen (Part B). The reliability of the OSCE was high (.91), identical to that of a previous resident OSCE with no feedback. The standard error of measurement for both parts was approximately 4%. At the 95% confidence interval, each participant's actual level of clinical performance (Part A) and clinical knowledge (Part B) could be estimated with an error of ±8%. Participants showed significant differences in clinical performance (Part A, P < 0.01) and knowledge (Part B, P < 0.01) by level of training. Most participants (74%) rated the OSCE as an above average or outstanding educational method. The OSCE is a valid and reliable test of residents' clinical skills. Feedback to participants during the OSCE was positively received and did not perturb test reliability.
UR - http://www.scopus.com/inward/record.url?scp=0029977680&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0029977680&partnerID=8YFLogxK
U2 - 10.1006/jsre.1996.0252
DO - 10.1006/jsre.1996.0252
M3 - Article
C2 - 8661202
AN - SCOPUS:0029977680
SN - 0022-4804
VL - 63
SP - 225
EP - 230
JO - Journal of Surgical Research
JF - Journal of Surgical Research
IS - 1
ER -