This work-in-progress (WIP) paper examines how different artificial intelligence (AI) tool instruction strategies influence undergraduate electrical and computer engineering (ECE) students’ learning experience. The study was conducted across two undergraduate ECE courses at a large R1 university: a junior-level electromagnetics course and a sophomore-level signal processing course. In the junior-level course, a course-specific AI assistant was embedded in the course Learning Management System (LMS). It was trained exclusively on course materials, including the syllabus, lecture slides, homework sets, and lecture recordings. The tool was intentionally designed to support conceptual understanding and navigation of course concepts while limiting step-by-step solution generation. In contrast, the sophomore-level course framed AI as a general-purpose problem-solving tool, emphasizing output verification and reflective evaluation rather than course-specific guidance.
At the end of the Spring 2025 semester, students in both courses completed an extra-credit assignment and a survey reflecting on their experience with AI tools. The survey measured students’ opinions on the AI tool’s usability, helpfulness, and accuracy, among other dimensions. Based on the survey results, more frequent AI use was moderately associated with more positive learning-related perceptions in both courses. The two AI integration strategies did not produce statistically significant differences in students’ learning experience. Results also indicated that instructional emphasis on critical evaluation may shape how students perceive responsible AI use.
http://orcid.org/0000-0002-1239-185X
University of Illinois at Urbana-Champaign
The full paper will be available to logged-in and registered conference attendees once the conference starts on June 21, 2026, and to all visitors after the conference ends on June 24, 2026.