Evaluating student performance assessment methods in Objective Structured Clinical Exam: perspectives and comparisons
Matthew Godwin, Amy J. Lin, Rahaf Bin Hamdan, Muath Aldosari, Luis Lopez, Sang E. Park

Abstract
Purpose
This study aims to evaluate how student performance and perspectives changed when the Objective Structured Clinical Exam (OSCE) assessment system at the Harvard School of Dental Medicine moved from a composite score to discipline‐specific grading.
Methods
The retrospective study population consisted of all students (n = 349) who completed three OSCEs (OSCE 1, 2, and 3) as part of the predoctoral program during 2014–2023. Students' OSCE scores were obtained from the Office of Dental Education, and their race/ethnicity and gender were obtained from admissions records.
Results
The likelihood of a student failing the OSCE increased significantly after the assessment system change (adjusted odds ratio = 20.12), while the number of failed subjects per student decreased (adjusted mean ratio = 0.48). Students perceived the OSCE as less useful after the change. Independent of the grading change, OSCEs 1 and 2 were perceived as more useful than OSCE 3, which is administered in the final year of the Doctor of Dental Medicine program.
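For readers interested in reproducing this type of analysis, the sketch below shows how an adjusted odds ratio and an adjusted mean ratio are commonly estimated via logistic and Poisson regression. The data file, column names (failed, n_failed_subjects, post_change, gender, race), and the use of Python's statsmodels are illustrative assumptions, not the authors' actual analysis code.

```python
# Minimal sketch, assuming a DataFrame with one row per student and
# hypothetical columns for failure outcomes and covariates; the study's
# actual modeling code is not reported in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("osce_scores.csv")  # hypothetical data file

# Adjusted odds ratio of failing the OSCE after the grading change:
# logistic regression adjusting for gender and race/ethnicity, with the
# exponentiated coefficient giving the odds ratio.
logit_fit = smf.logit("failed ~ post_change + C(gender) + C(race)",
                      data=df).fit()
print(np.exp(logit_fit.params["post_change"]))  # adjusted odds ratio

# Adjusted mean ratio of failed subjects per student: Poisson regression
# on the count outcome, whose exponentiated coefficient is a mean ratio.
pois_fit = smf.poisson("n_failed_subjects ~ post_change + C(gender) + C(race)",
                       data=df).fit()
print(np.exp(pois_fit.params["post_change"]))  # adjusted mean ratio
```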
Conclusion
The discipline‐specific nature of the new assessment system isolates actual areas of deficiency and targets remediation accordingly, rather than applying the blanket remediation used previously, so that students can align their learning efforts with their specific needs. Therefore, although the number of fails identified in the course increased, the assessment change has yielded more directed, actionable information from the OSCE, better preparing students to work toward competency standards.