Introduction: Situational judgment tests (SJTs) are commonly used in admissions to measure skills associated with professionalism. Although open-response SJTs have shown strong psychometric properties, assessors' personal beliefs, experiences, and cultural backgrounds may influence how they perceive, organize, and evaluate information within test takers' diverse responses. Additionally, SJT research typically focuses on reliability and predictive validity, whereas the construct validity of open-response SJTs remains underexplored. This mixed methods study aims to address this gap by exploring the construct-relevant and construct-irrelevant factors that may influence assessors' evaluation of professionalism in open-response SJTs.
Methods: For this study, we used data from Casper, an open-response SJT commonly used in professional program admissions. In Study I, a quantitative content analysis was conducted on 160 responses to identify factors that significantly predicted low and high scores. Correlation coefficients and logistic regression models were used to evaluate the relationship between each factor and response scores. In Study II, think-aloud activities were conducted with 23 Casper assessors to directly observe how they evaluated responses. All interviews were transcribed verbatim, and the transcripts were then thematically analyzed using an inductive coding technique.
Results: Results from both the content analyses and the think-aloud activities revealed that several construct-relevant factors influenced scores. Scores were affected by the extent to which test takers demonstrated the competencies targeted by the SJT, engaged with the context of the presented ethical dilemma, provided in-depth justifications for their responses, considered the various perspectives relevant to the dilemma, and offered creative solutions or insightful arguments for the suggested approach. Mixed results were found for construct-irrelevant factors, such as the flow, cohesion, and kinds of phrases used in a response.
Conclusion: This mixed methods study contributes to the construct validity evidence for SJTs by investigating the construct-relevant and construct-irrelevant factors that may influence assessors' evaluation of open responses. The findings provide evidence that open-response SJTs are valid approaches for measuring professional competencies more broadly, both in terms of what test takers focus on in their responses and in terms of how they construct those responses.
Keywords: admissions; construct validity; open-response scoring; personal skills; professional skills; professionalism; situational judgment tests.
Copyright © 2025 Iqbal, Ivan, Robb and Derby.