2024 ASEE Annual Conference & Exposition

A Comparative Analysis of Natural Language Processing Techniques for Analyzing Student Feedback about TA Support

Presented at DSA Technical Session 8

This paper advances the exploration of Natural Language Processing (NLP) for automated coding and analysis of short-answer, text-based data collected from student feedback. Specifically, it assesses student preferences for Teaching Assistant (TA) support in engineering courses at a large public research university. This work complements existing research with an in-depth comparative analysis of NLP approaches to examining qualitative data in engineering education, using survey data (training set = 1359, test set = 341) collected from 2017 to 2022. The paper highlights the challenges posed by the multiple types of classification errors that emerge from five NLP methods: Latent Dirichlet Allocation (LDA), Non-Negative Matrix Factorization (NMF), BERTopic, Latent Semantic Analysis (LSA), and Principal Component Analysis (PCA). These results are compared against a traditional thematic analysis conducted by a domain expert to assess their efficacy. Two principal findings emerged, one for TA teaching practice and one for the use of NLP in education research. First, the conclusions derived from each coding technique are consistent, demonstrating that students want, in order of priority, extensive TA-student interaction, problem-solving support, and experiential/laboratory learning support from TAs. Second, the research offers insight into the effectiveness of NMF and PCA in processing educational survey data. A comparison of the NMF and PCA topic models on accuracy, precision, recall, and F1 scores reveals that while PCA outperforms NMF in precision (identifying truly relevant topics), NMF excels in recall (capturing a broader range of student responses).

Authors
  1. Neha Kardam, University of Washington
