This research brief presents initial insights into how AI-powered tools are influencing STEM education and pedagogical research, drawn from surveys we developed and conducted in Fall 2024.
Artificial intelligence (AI) has rapidly emerged as a transformative force in engineering education, particularly in the field of pedagogical research. While numerous studies have explored AI's effects on learning outcomes in early education settings—such as kindergarten, primary, and secondary schools—evidence on its efficacy in higher education is still limited. This project seeks to evaluate the impact of an AI-driven survey tool on STEM learning outcomes, with a specific focus on enhancing collaborative skills that are crucial in today’s educational landscape.
Effective teamwork is a fundamental component of STEM learning, especially given the growing emphasis on collaboration and communication skills in higher education. Yet, many college students face challenges in addressing complex, real-world problems due to insufficient collaborative abilities. To tackle this issue, we have developed an innovative AI-driven open-ended reflective tool that dynamically generates personalized questions based on students' prior responses, instructional context, and educational theories surrounding effective teamwork.
The tool is deployed across diverse student populations at Cornell University, specifically targeting students enrolled in two 1-credit courses: an upper-level biomedical engineering course focusing on design and an introductory physics lab. Both courses feature intensive teamwork, with students actively engaged in collaborative projects. Teamwork skills are explicitly taught through structured activities, lectures, and the use of team contracts. Furthermore, students in these courses participate in weekly reflections on various topics. The AI-powered tool will be deployed six times in the biomedical engineering course and three times in the physics course, reaching 455 students across both courses and gathering over 1,500 individual responses.
Through this research brief, we share our initial findings measuring the impact of the reflection instrument on engineering and physics students' responses using a randomized controlled trial. We hypothesize that students will engage in deeper reflection when prompted with personalized questions. Data will be collected through an AI-integrated Qualtrics platform. We use qualitative methods to analyze the reflection responses, applying an a priori coding scheme based on socially shared regulation of learning as well as a coding scheme of reflective skills derived from work by Wong et al. (1995) and Rogers et al. (2019). We also employ quantitative methods, using keyword matching and response length as metrics. Ultimately, our work seeks to contribute to the ongoing effort to integrate AI into education research, paving the way for more effective teaching and learning strategies in STEM disciplines.
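The keyword-matching and response-length metrics can be sketched minimally as follows. The keyword list and function names here are hypothetical illustrations for a word-count and phrase-match approach, not the study's actual coding scheme or instrument.

```python
# Illustrative sketch of two simple quantitative metrics for reflection
# responses: response length (word count) and keyword matching.
# The keyword set below is a made-up example, not the study's scheme.
REFLECTION_KEYWORDS = {"because", "realized", "next time", "improve", "we decided"}

def response_length(text: str) -> int:
    """Return the word count of a reflection response."""
    return len(text.split())

def keyword_matches(text: str, keywords=REFLECTION_KEYWORDS) -> int:
    """Count how many keywords/phrases appear in the response (case-insensitive)."""
    lower = text.lower()
    return sum(1 for kw in keywords if kw in lower)

sample = ("We realized the timeline slipped because roles were unclear; "
          "next time we will improve our planning.")
print(response_length(sample), keyword_matches(sample))  # prints "16 4"
```

In practice, metrics like these would be computed over the full set of responses and compared between the personalized-question and control conditions.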
The full paper will be available to logged-in, registered conference attendees once the conference starts on June 22, 2025, and to all visitors after the conference ends on June 25, 2025.