This evidence-based practice paper is a follow-up to an ASEE 2022 conference proceeding that examined the challenges of developing, and the resulting student perceptions of, a remote iteration (Spring 2021, due to the COVID-19 pandemic) of a conventionally hands-on, active-learning makerspace course. The course emphasizes the integration and application of fundamental engineering skills and is required of all first-year engineering students at the institution. Specifically, this paper focuses on the subsequent iteration of the course (Spring 2022), in which students returned to in-person instruction, and aims to disseminate comparative student perceptions of course topics and features between the remote and in-person iterations.
Because they appeared successful, several pedagogical features developed exclusively for remote course delivery were retained for the subsequent in-person iteration. For example, upon the return to in-person delivery, course administrators offered the engineering design project developed for the remote cohort alongside the design project that had been used in the course before the pandemic, resulting in two design projects employed within the post-pandemic course iteration. At the conclusion of the semester, 314 student participants were surveyed on which of these two design projects they would keep in the course if they could choose only one, and were additionally asked a qualitative follow-up question to specify why they chose that project. Students were further surveyed with three additional queries, two of which were quantitative forced-choice rankings. The first asked students to rank the pedagogical effectiveness of six select course topics (3D modeling, circuitry, engineering design, programming, and teamwork) in the same manner the remote cohort was surveyed (with results reported in the 2022 conference proceeding), while the second asked students to rank the pedagogical usefulness of four course features. These four features (classroom response systems, MS Teams, supplemental videos, and Tinkercad) were introduced and/or significantly augmented during the remote iteration and retained for the in-person iteration. The final survey question was a qualitative inquiry asking students to explain why they ranked feature usefulness as they did.
All related survey data have been collected and compiled. Course administrators are currently assessing the quantitative data: comparing the design project choices, comparing remote versus in-person student responses on the effectiveness rankings of the six course topics, and evaluating the student usefulness rankings of the four course features retained from the remote iteration. Thematic coding and related assessment of the two aforementioned qualitative components of this study are also in progress. The authors are confident that the quantitative analysis and qualitative coding and assessment will be completed prior to the conference deadline for full paper submission. Resulting implications, limitations, and revelations of these findings will be discussed accordingly.