2024 ASEE Annual Conference & Exposition

Tracking and Predicting Student Performance Across Different Semesters with Matched Action-State Orientation Surveys and Interventions

Presented at Community Building and Student Engagement

This paper presents the second-year results of work supported by the National Science Foundation's Revolutionizing Engineering Departments (IUSE/PFE: RED) Program under the project titled "IUSE/PFE:RED: Breaking Boundaries: An Organized Revolution for the Professional Formation of Electrical Engineers." Specifically, this part of the study examines action-state orientation and its impact on student success. The first-year results were presented at the 2023 ASEE Conference in Baltimore, MD, in a paper whose title is to be included after blind review (reference after blind review, 2023). The objective of the first phase of the study was to determine how survey responses could be used to predict whether a student should be considered at risk of failing academically. The objective of the second phase, discussed in this article, is to analyze and quantify the effects of in-class interventions on student study habits and, ultimately, academic performance, using action-state orientation surveys administered as engineering students progress through their respective curricula.

While these surveys are anonymous, it is crucial to be able to track changes in academic performance for individual students across multiple semesters as they progress through their academic program (in this case, Electrical Engineering). As part of the second phase, we developed a matching method, implemented with Python libraries, that automatically matches self-reported demographic information in the background, keeping the analysis sustainable as the data collected from new and ongoing students grew over the past several years. Ultimately, we matched a total of 840 unique students, based on self-provided information such as gender, month of birth, ethnicity, and high school name, across 2148 unique survey responses collected in five academic semesters: Spring 2021, Fall 2021, Spring 2022, Fall 2022, and Spring 2023. Beyond the scale of the data, which is unprecedented for this kind of survey, we significantly improved the prediction performance of our machine learning algorithms, from 74.4% reported in the previous study on a simpler question (i.e., is this student's GPA below 2.0? - a more apparent anomaly) to 82.6% on a more challenging question (i.e., is this student's GPA above or below 3.33? - a more subtle distinction).
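The matching idea can be sketched as follows: build a composite key from the self-reported demographic fields and treat responses sharing a key as the same anonymous student. This is a minimal illustration using pandas; the column names and values are illustrative assumptions, not the actual survey schema.

```python
import pandas as pd

# Hypothetical survey responses; column names are illustrative assumptions.
responses = pd.DataFrame({
    "semester": ["Spring 2021", "Fall 2021", "Fall 2021", "Spring 2022"],
    "gender": ["F", "F", "M", "F"],
    "birth_month": [3, 3, 7, 3],
    "ethnicity": ["Hispanic", "Hispanic", "Asian", "Hispanic"],
    "high_school": ["Tampa HS", "Tampa HS ", "Orlando HS", "tampa hs"],
})

def normalize(s: pd.Series) -> pd.Series:
    """Lower-case and strip free-text fields so minor typing
    differences do not break the match."""
    return s.astype(str).str.strip().str.lower()

# Build a composite key from the self-reported demographic fields.
key_cols = ["gender", "birth_month", "ethnicity", "high_school"]
key = normalize(responses[key_cols[0]])
for col in key_cols[1:]:
    key = key + "|" + normalize(responses[col])

# Assign one pseudonymous ID per unique key: responses sharing a key
# are treated as the same (anonymous) student across semesters.
responses["student_id"] = pd.factorize(key)[0]

print(responses[["semester", "student_id"]])
```

Because the fields are normalized before keying, the three responses from the same hypothetical student are linked across semesters even though the high school name was typed slightly differently each time.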

To accomplish this, we evaluated several machine learning classifiers before settling on a random forest classifier with feature elimination, made possible by the growing size of the dataset from the newly added surveys. The students in the dataset were split into two groups based on GPA so that the method could learn from survey responses to correctly identify the category (high or low GPA), with 10-fold cross-validation used to obtain robust and repeatable accuracy metrics. The dataset was further split into two partitions by classifying survey responses as pre-intervention (PRE, 1227 unique responses) or post-intervention (POST, 921 unique responses). A new predictor was then trained using only the PRE dataset and tested on both the PRE and POST datasets. The hypothesis was that the new empirical model would perform worse on the POST dataset, with more false positives, due to newly acquired (and, ideally, improved) study habits following the interventions. Our results show a 35% increase in prediction error when the same algorithm is tested on the POST student population and, more importantly, a corresponding 24% increase in the false-positive rate. This indicates that the interventions are working at the population level: students adopt study habits that predict better academic performance than they currently exhibit.
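The training and evaluation protocol above can be sketched with scikit-learn. The data here is synthetic (only the partition sizes match the paper), and recursive feature elimination (RFE) is one plausible reading of "feature elimination"; the paper does not name the exact method, so treat this as a sketch under those assumptions rather than the study's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the survey data: 1227 PRE and 921 POST
# responses with 20 numeric survey features and a binary label
# (GPA above/below 3.33). Sizes match the paper; values do not.
X_pre = rng.normal(size=(1227, 20))
y_pre = (X_pre[:, 0] + rng.normal(scale=0.5, size=1227) > 0).astype(int)
X_post = rng.normal(size=(921, 20))
y_post = (X_post[:, 0] + rng.normal(scale=0.5, size=921) > 0).astype(int)

# Random forest wrapped in recursive feature elimination.
clf = RFE(
    RandomForestClassifier(n_estimators=50, random_state=0),
    n_features_to_select=10,
    step=2,
)

# 10-fold cross-validation on the PRE partition for a robust,
# repeatable accuracy estimate.
cv_acc = cross_val_score(clf, X_pre, y_pre, cv=10).mean()

# Train on PRE only, then score the POST partition; a larger error
# (and false-positive rate) on POST is the effect the study measures.
clf.fit(X_pre, y_pre)
acc_post = clf.score(X_post, y_post)
print(f"10-fold CV accuracy on PRE: {cv_acc:.3f}")
print(f"accuracy on POST:           {acc_post:.3f}")
```

On real data, the quantity of interest is the gap between the two numbers: a model trained on PRE responses that mislabels POST students as high-GPA (false positives) suggests their reported study habits have shifted toward those of stronger students.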

Authors
  1. Prof. Ismail Uysal University of South Florida
  2. Mehmet Bugrahan Ayanoglu University of South Florida
  3. Rifatul Islam University of South Florida
  4. Paul E. Spector University of South Florida
  5. Dr. Chris S. Ferekides University of South Florida
