Revisiting Assessment Tools Used to Measure the Impact of Summer Program Interventions on Perceptions and Interest in Engineering Among Underrepresented Pre-College Students – A Work in Progress
Pre-college, Race/Ethnicity, Gender, Engineering
Students begin their education in science, technology, engineering, and mathematics (STEM) fields with the aim of pursuing STEM-related careers. However, because of the high demands of these programs, many students drop out, change majors, or ultimately do not enter STEM careers, particularly students who are historically underrepresented in these fields [1]. Attracting and retaining students in STEM therefore requires understanding their experiences in such programs. Expectations for success, sense of belonging, interest, and perceived relevance are critical perceptions for students to develop; interventions that effectively foster these perceptions, particularly early in students' education such as during high school, can significantly enhance the retention and success of underrepresented groups in STEM fields. Four such programs exist at a large, four-year institution in the southeastern United States. This work focuses on revising the programs' assessment tools to ensure they effectively capture the nuances of participants' experiences and on identifying challenges encountered in their implementation. This work-in-progress describes ongoing work that began in Spring 2024.
The assessment of three of the four programs includes a pre-focus group, a pre-survey, week 1 and week 2 activity surveys, a post-focus group, and a post-survey to capture the effect of the programs; the fourth program's assessment includes only a pre-focus group and a post-focus group. Through this longitudinal data collection approach, we aimed to comprehensively capture the evolution of participants' experiences over time, both quantitatively and qualitatively. One challenge discovered so far, for example, is the difficulty of cleaning survey data collected across the various response platforms used (e.g., Google Forms). Additionally, given the age group of the participants, focus group room setup and student comfort are further considerations explored in this work-in-progress. We present lessons learned from a practitioner and assessment standpoint to aid in strategically reconsidering response collection from both quantitative and qualitative perspectives.
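To illustrate the kind of cleaning work involved, the minimal sketch below (our own illustration, not drawn from the paper) shows one way survey exports from different collection platforms might be harmonized before pre/post comparison; the file names and column labels are hypothetical placeholders.

```python
# Illustrative sketch only: harmonizing pre/post survey exports from
# different collection platforms before analysis. File names and column
# labels are hypothetical, not from the actual study instruments.
import pandas as pd

# Map each platform's column labels onto a shared schema (assumed phrasings).
COLUMN_MAP = {
    "What is your participant ID?": "participant_id",  # e.g., Google Forms wording
    "Participant ID": "participant_id",                 # e.g., another platform's wording
    "Timestamp": "submitted_at",
}

def load_survey(path: str, source: str) -> pd.DataFrame:
    """Read one platform's CSV export and normalize it to the shared schema."""
    df = pd.read_csv(path)
    df = df.rename(columns=COLUMN_MAP)
    # Strip stray whitespace and unify case so pre/post records match on ID.
    df["participant_id"] = df["participant_id"].astype(str).str.strip().str.upper()
    df["source"] = source
    return df

# Combine pre- and post-survey exports and drop accidental duplicate submissions.
pre = load_survey("pre_survey_google_forms.csv", source="pre")
post = load_survey("post_survey_other_platform.csv", source="post")
combined = (
    pd.concat([pre, post], ignore_index=True)
      .drop_duplicates(subset=["participant_id", "source"], keep="last")
)
```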