Undergraduate research can play a large role in diversifying STEM fields, giving many students from underrepresented groups their first opportunity to explore research careers. However, research programs and opportunities may be biased against underrepresented students; bias can arise from interviewers' implicit biases, or from the format of the selection process and its scoring criteria (or lack thereof). For example, unstructured interviews have been shown to favor white applicants over applicants of color, while structured interviews with predetermined questions are more equitable. Several studies have investigated these biases, typically in the context of medical and graduate school applications. Drawing on the recommendations of these studies, as well as the authors' prior experience interviewing applicants, we have developed a framework that seeks to minimize bias when interviewing and admitting students to STEM undergraduate research opportunities. The proposed framework assesses applicants holistically, evaluating academic potential as well as an applicant's ability to collaborate effectively and creatively with others. A written application serves as a screening tool, followed by an in-person interview. The interview process consists of a multiple mini-interview (MMI) portion, a group activity, and a program overview. In the MMI segment, applicants are given 3 minutes to answer one question before rotating to the next station, with two interviewers assigned to each station. The group activity allows applicants to be evaluated in a more relaxed setting as they collaborate with their peers. Finally, the program overview ensures that all applicants have a baseline understanding of the program's goals and values.
This framework exposes applicants to several different interviewers, mitigating individual biases, while presenting applicants with multiple opportunities to showcase their strengths both individually and in group settings. The process has also been designed to be time- and resource-efficient, which is crucial given the typical mismatch between the number of applicants and the number of reviewers available. This work-in-progress describes the process and the rationale behind its design, and reports preliminary metrics of fairness and efficiency, such as the demographic distribution of accepted applicants and the time required to evaluate an applicant cohort.
The full paper will be available to logged-in and registered conference attendees once the conference starts on June 22, 2025, and to all visitors after the conference ends on June 25, 2025.