A multi-dimensional survey was created and administered to two different student cohorts to better understand the change in self-perceived and actual student abilities in a CHE laboratory course. One cohort experienced a traditional lab structure with a companion face-to-face lecture course (N=47), and the other cohort included pre-lab modules integrated with in-lab activities that served as intentional scaffolding for the student learning experience (N=18).
Prior work explored student experiences in the laboratory by analyzing survey results from the Self-Assessment and Direct Skills Test [1-4]. These assessments contained primarily closed-ended questions with some open-ended prompts. The overall study was motivated by the desire to understand the impact curriculum revisions have on student experience and abilities, with the goal of improving the educational experience using evidence-based practices. The original guiding research questions that drove this facet of the study were:
What are the perceived objectives and perceived learning experiences of students in our CHE lab? To what extent does it differ between the traditional course and the revised course and over time?
Distinct from prior work, the methodology for this effort followed the six phases for thematic analysis outlined by Braun and Clarke [5] and was applied using a phenomenological lens, through which the authors seek to describe the different ways a group of people (chemical engineering students) understand a phenomenon (a CHE laboratory course). Through this lens, the authors considered student responses to one open-ended question asked both at the beginning and at the end of the course. The question related to student expectations (before) and capabilities (after). Semantic and latent content were considered, and an inductive approach to identifying themes was applied. This work documents the process of applying those six phases, as well as the exploration of initial frameworks for coded thematic elements. We present codes and themes that emerged from the combined cohorts and discuss the extent to which those themes differ and evolve between the two cohorts. In support of these themes, we present "quantitized" data visualized in a variety of ways, as well as selected excerpts of student responses.
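As an illustration of the "quantitizing" step, the sketch below converts coded qualitative responses into per-cohort frequency counts suitable for visualization. The code names and response data are hypothetical placeholders, not the study's actual codebook or dataset; it is a minimal sketch of the general technique.

```python
from collections import Counter

# Hypothetical coded data: each student response was tagged with one or more
# thematic codes during analysis (codes shown here are illustrative only).
coded_responses = {
    "traditional": [
        ["equipment operation", "teamwork"],
        ["data analysis"],
        ["teamwork", "report writing"],
    ],
    "revised": [
        ["data analysis", "equipment operation"],
        ["equipment operation"],
    ],
}

def quantitize(cohort_responses):
    """Count how many responses in a cohort mention each thematic code."""
    counts = Counter()
    for codes in cohort_responses:
        counts.update(set(codes))  # count each code at most once per response
    return counts

for cohort, responses in coded_responses.items():
    counts = quantitize(responses)
    total = len(responses)
    for code, n in counts.most_common():
        print(f"{cohort}: {code}: {n}/{total} responses")
```

The resulting counts (or proportions, given the different cohort sizes) could then feed a bar chart or heat map comparing theme prevalence between cohorts and between the beginning and end of the course.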
In addition to reporting on the research question itself, this paper will serve as a process guide for analysis of a small set of qualitative data in the context of chemical engineering education. The intent is to make thematic analysis more accessible for faculty who might otherwise not consider this approach in pedagogical work.
[1] G. Neumann, D. Anastasio, H. Chenette, and T. Ribera, “Work in Progress: Developing a Multi-dimensional Method for Student Assessment in Chemical Engineering Laboratory Courses,” presented at 2018 ASEE Annual Conference.
[2] G. Neumann, D. Anastasio, and H. Chenette, “Using Multidimensional Metrics to Assess Changes to Student Attitudes and Ability in a Capstone Laboratory Sequence,” presented at 2019 AIChE Annual Meeting, Nov. 2019.
[3] H. Chenette, G. Neumann, and D. Anastasio, “What’s Happening in Lab? Multi-Dimensional Assessment Tools to Track Student Experience through a Unit Operations Laboratory Sequence,” Chemical Engineering Education, vol. 55, no. 3, pp. 147–156, 2021.
[4] H. Chenette, D. Anastasio, and G. Neumann, “Qualitative Analysis of Skills in a CHE Laboratory Course,” in 2021 ASEE Virtual Annual Conference Content Access, 2021.
[5] V. Braun and V. Clarke, “Using thematic analysis in psychology,” Qualitative Research in Psychology, vol. 3, no. 2, pp. 77–101, Jan. 2006, doi: 10.1191/1478088706qp063oa.