2024 ASEE Annual Conference & Exposition

Work in Progress: PEERSIST—An Observational Study of Student Questions to Identify Levels of Cognitive Processing

Presented at Educational Research and Methods Division (ERM) Technical Session 5

At [the institution], the PEERSIST (Peer-led, Student Instructed STudy group) model has been researched and recently fully implemented across all Thermodynamics recitation sections [1]. In the PEERSIST model, students work together in small groups (4 to 5 peers) to solve challenging, course-related problems during weekly 50-minute recitation sections. While a facilitator observes and offers assistance to keep each group on track, the PEERSIST model relies on student-student discussion and interaction as the primary method of learning, as opposed to direct instruction from an instructor or TA. Implementation of the PEERSIST model has improved course grades and pass rates compared to the typical TA-led Recitation (TAR) method, in which approximately 30 students watch a TA solve problems on a board.
Following up on a previous observational study of students working in the PEERSIST model, the research team noted differences in the questions students asked and in the resulting discussions with their peers during the sessions [2]. These differences are hypothesized to be linked to students' cognitive processing and course achievement, leading to the following research questions: 1) What level of discipline-based cognition are students reaching, based on the questions and discussion occurring during the PEERSIST model? and 2) To what extent do GPA and student demographics influence the questions asked and the resulting discussion? To answer these questions, the research team developed an observation protocol to explore student cognition during the study group sessions. Using the protocol, a researcher recorded the questions asked by students working in 6 different study groups over a 20-minute period. Of these groups, 4 consisted of students with an average GPA of at least 3.74; these groups comprised 1) 4 White men, 2) 3 White women and 1 woman of unknown minority status, 3) 3 international students and 2 students of unknown minority status, and 4) 2 Hispanic/Latino students and 2 White students. The remaining 2 groups included 3 out of 5 and 3 out of 4 Hispanic/Latino students, with 3.0+ and 2.0+ average GPAs, respectively. The researchers noted that the questions could be categorized using Bloom's revised taxonomy, a method for classifying learning objectives [3, 4]. Using the taxonomy, the recorded questions were classified based on whether the students were trying to recall basic course information (remember), interpret information (understand), execute a process (apply), or organize and relate material (analyze) [3]. In comparison, observations during a TAR indicate that most questions asked, if any, focus on whether or not the answer is correct, while some explore course definitions.
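The four-level coding scheme described above can be sketched as a simple classifier. This is only an illustrative sketch: the cue phrases, sample questions, and matching logic below are invented assumptions, not part of the published observation protocol, which relies on a human researcher's judgment rather than keyword matching.

```python
# Hypothetical sketch of the four-level Bloom's coding used in the protocol.
# Cue phrases and example questions are invented for illustration only.

BLOOM_CUES = {
    "remember":   ("define", "what is the", "which equation"),               # recall basic information
    "understand": ("why does", "what does this mean", "explain"),            # interpret information
    "apply":      ("how do we solve", "which step", "plug in"),              # execute a process
    "analyze":    ("how does this relate", "compare", "difference between"), # organize and relate material
}

def classify_question(question: str) -> str:
    """Return the first Bloom level whose cue appears in the question,
    or 'uncoded' if none match (heuristic for illustration only)."""
    q = question.lower()
    for level, cues in BLOOM_CUES.items():
        if any(cue in q for cue in cues):
            return level
    return "uncoded"

# Invented example questions:
print(classify_question("What is the definition of enthalpy?"))     # remember
print(classify_question("How does this relate to the first law?"))  # analyze
```

In the actual study, a trained observer assigns these levels by judgment during live sessions; the sketch simply makes the category boundaries of the scheme concrete.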
Preliminary results show that students participating in the PEERSIST model primarily ask questions spanning multiple levels of Bloom's taxonomy, including remember, understand, and analyze. These results imply a higher level of understanding and mastery of the course material; connections to the groups' incoming GPAs and final course achievement will be investigated during the Fall 2023 semester.

[1] R. Milcarek, S. Brunhaver, T. Ta, C. Jenkins, G. Lichtenstein, and K. Smith, “The Impact of Peer-led Study Groups on Student Performance in a Mid-level Discipline-specific Engineering Gateway Course,” unpublished.
[2] C. D. Jenkins, T. N. Y. Ta, R. J. Milcarek, G. Lichtenstein, S. R. Brunhaver, and K. A. Smith, “Work in Progress: PEERSIST – A Formation of Engineers Framework for Understanding Self-Efficacy and Persistence among Transfer Students,” presented at the 2023 ASEE Annual Conference & Exposition, Baltimore, Maryland. https://peer.asee.org/44325
[3] P. Armstrong, “Bloom’s Taxonomy.” https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ (accessed Oct. 10, 2023).
[4] L. W. Anderson and D. R. Krathwohl, Eds., A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York, NY, USA: Longman, 2001.

Authors
  1. Sarah Johnston Arizona State University
  2. Ms. Thien Ngoc Y. Ta Arizona State University, Polytechnic Campus
  3. Dr. Ryan James Milcarek Arizona State University
  4. Dr. Gary Lichtenstein Arizona State University
  5. Dr. Karl A. Smith University of Minnesota, Twin Cities