This Work in Progress (WIP) paper investigates the perceptions of biomedical engineering students toward the use of Large Language Models (LLMs), such as ChatGPT, in their academic classes. The growing interest in LLMs has led to their increased use in higher education settings, prompting various educational institutions to develop specific regulations and restrictions for their use. Our study aims to understand how these students view the integration of these sophisticated AI tools into their coursework and how it affects their learning experience.
To address these research questions, we conducted a survey of approximately 60 undergraduate students at a Research 1 (R1) private university in the Northeast region of the United States. The survey, administered online in 2023, measured students' experiences with and perceptions of LLMs, alongside a range of demographic variables. It included a variety of question types, such as Likert-scale questions assessing the perceived utility of LLMs in solving engineering problems.
Our analytical approach incorporated a suite of statistical methods to interpret the data effectively. We used descriptive analyses to provide an overview of the data distribution and central tendencies. T-tests and chi-square tests were employed to examine differences in perceptions and usage across demographic groups. Furthermore, exploratory factor analysis was conducted to identify underlying factors explaining students' self-efficacy and utility value with respect to LLMs.
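As an illustration of this analysis pipeline, the sketch below shows how such tests could be run in Python. It is a minimal, hypothetical example: the file name and column names (gender, class_standing, uses_llm, util_1 through util_5) are assumptions for demonstration, not the survey's actual variables.

```python
# Hypothetical sketch of the analysis pipeline described above.
# Column names (gender, class_standing, uses_llm, util_*) are illustrative
# assumptions, not the survey's actual variable names.
import pandas as pd
from scipy import stats
from factor_analyzer import FactorAnalyzer

df = pd.read_csv("survey_responses.csv")  # assumed file of coded responses

# Descriptive overview of the Likert items
likert_items = [c for c in df.columns if c.startswith("util_")]
print(df[likert_items].describe())

# Welch t-test: mean perceived-utility score by gender
utility = df[likert_items].mean(axis=1)
t, p = stats.ttest_ind(utility[df["gender"] == "F"],
                       utility[df["gender"] == "M"],
                       equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")

# Chi-square test: LLM usage (yes/no) by class standing (lower/upper division)
table = pd.crosstab(df["class_standing"], df["uses_llm"])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

# Exploratory factor analysis on the Likert items
fa = FactorAnalyzer(n_factors=2, rotation="varimax")
fa.fit(df[likert_items])
print(pd.DataFrame(fa.loadings_, index=likert_items))
```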
Preliminary results highlight significant gender differences in LLM usage, with female students reporting lower usage rates at the time of the survey. In addition, students' academic standing influenced their likelihood of using AI language models, with upper-division students reporting higher usage rates than their lower-division counterparts.