Purpose-built AI tutors embedded in learning management systems (LMS) may enhance learning and engagement, yet evidence from authentic deployments remains limited. We developed an AI Course Companion tool that grounds answers in instructor-uploaded lecture materials and generates practice questions patterned on prior exams and homework to support targeted studying. Pedagogically, the design implements retrieval practice (exam-style generation), worked-example scaffolding (stepwise explanations), and tight content alignment to reduce extraneous load, consistent with constructivist, cognitive-load, and formative-assessment perspectives.
We evaluated the usability and instructional impact of a course-specific, LMS-embedded companion, featuring lecture grounding and prior-assessment-based practice, offered as an optional, ungraded tool in undergraduate computing courses at a large public university. A mixed-methods, one-semester study spanned six courses. Data sources included an end-of-term student survey, an instructor survey, 20-minute interviews on effectiveness and impact, and anonymized usage analytics (feature-level events over time). Primary outcomes were student-reported effectiveness and usability; secondary outcomes were instructor-reported instructional impact (e.g., fewer repetitive questions, alignment); exploratory outcomes were behavioral usage metrics.
Findings indicated broad endorsement alongside clear areas for improvement. On recommendation items, 81% of students and 100% of instructors endorsed the tool for future offerings. Students credited course-aligned practice, stepwise explanations, and targeted summaries with improving study efficiency, especially for exams; the most valued attribute was tight alignment to course artifacts (slides, homework, exam style). Pedagogically, the tool aligned with course objectives and supported asynchronous/online learning by enabling personalized study; with clear guidelines, it augmented rather than replaced student thinking. Requested enhancements included deeper lecture integration, auto-generated practice exams by topic/slide, improved cross-document linking (slides, labs, readings), and a student onboarding assistant; instructors recommended early, planned pre-semester adoption. Interview synthesis reinforced that tight alignment with course materials was the tool's chief strength, yielding focused, on-topic responses, and suggested particular benefits for proactive students, beginners, repeaters, and regular LMS users. Usage analytics showed increased engagement with the practice-questions feature in the days preceding exams.
http://orcid.org/0009-0003-4170-0521
University of South Florida
The full paper will be available to logged-in, registered conference attendees once the conference starts on June 21, 2026, and to all visitors after the conference ends on June 24, 2026.