This Complete Evidence-based Practice paper details good design and implementation practices for evaluating engineering outreach programs specifically designed to serve community college students. Its goal is to share practices found helpful for understanding participant outcomes and for assessing the efficacy of the program itself.
Once students are recruited into an engineering outreach program, it is crucial to assess their needs and experiences continuously during the program, and to assess, after the program ends, how effectively the program's activities addressed those needs. Evaluating how the program's activities, resources, and materials affect participants is essential for stakeholders to determine whether students are receiving adequate support, whether the content serves its intended purpose, and how to make informed decisions about program enhancements. This paper examines the design, implementation processes, and materials used for program evaluation through the example of an engineering outreach program for community college students, the Aeronautics and Astronautics Community Research Experience (AACRE) program, implemented in the Department of Aeronautics and Astronautics at Stanford University in the United States.
The function and efficacy of the program's designs, processes, and materials were considered to extract a set of good practices for the evaluation of higher education engineering outreach programs designed for community college students. The work identified how objective evaluation of all program aspects, including the outcomes of participants, stakeholders, and the program itself, uncovers rich sources of improvement that drive program enhancement. Pre-program surveys help identify participants' and stakeholders' needs and tailor program resources to them, and should be designed to be accessible and jargon free. Post-program evaluation should include both formal and informal methods. Formal methods include surveys, department-level and institutional-level meetings, and evaluation of the program and participant outcomes using quantitative metrics. The metrics should be informed by the goals of the program and its participants, such as measuring the change in engineering skills efficacy, and standardized to enable year-to-year program comparisons. Informal methods such as check-ins and 'watercooler chats' yield rich anecdotal insights that may not otherwise be captured by formal feedback mechanisms. The evaluation data are valuable for demonstrating the program's impact and should be used for program promotion and to communicate the value the program offers to potential participants and to stakeholders, including the institution supporting the program.
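As an illustration of the kind of standardized metric described above, the sketch below (not taken from the paper; the item, scale, and data are hypothetical) computes the mean pre-to-post change in self-reported engineering skills efficacy together with a standardized effect size, so cohorts of different sizes and from different years can be compared on a common scale.

```python
# Minimal sketch, assuming paired pre/post Likert-style ratings collected
# from the same participants; all names and values here are hypothetical.
from statistics import mean, stdev

def efficacy_change(pre_scores, post_scores):
    """Return the mean paired change and a standardized effect size
    (mean change divided by the SD of the paired differences)."""
    diffs = [post - pre for pre, post in zip(pre_scores, post_scores)]
    mean_change = mean(diffs)
    # Standardizing by the spread of the differences puts cohorts from
    # different years on a common, comparable scale.
    effect_size = mean_change / stdev(diffs) if len(diffs) > 1 else float("nan")
    return mean_change, effect_size

# Hypothetical 1-5 ratings for "confidence in my engineering skills"
pre = [2, 3, 2, 4, 3, 2]
post = [4, 4, 3, 5, 4, 3]
change, d = efficacy_change(pre, post)
print(f"Mean change: {change:.2f} points; standardized effect size: {d:.2f}")
```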
This work will be of interest to similar engineering outreach programs that support community college students, providing insights that can inform and enhance the design and implementation of their program evaluation.