A cornerstone of modern-day academia is the use of student feedback as a means of obtaining data on the quality of courses. Students, being the recipients of the content, have steadily continued to be involved in both course and faculty evaluation systems. A good deal of research has explored the benefits of these surveys as tools for continuous improvement, and a recurring theme in teaching pedagogy is the value of this feedback as a means to improve learning experiences for students. While many collegiate classrooms try to use feedback as a structured, professional method for adapting and revamping courses, too often the major comments are not formally addressed until the end of the semester. Moreover, students, being non-experts in teaching philosophies and outcome-based learning, often provide feedback that either arrives too late or is irrelevant to the course at hand. This causes a major disconnect and time delay in identifying, adjusting, and implementing potential changes drawn from meaningful insights. The work presented herein discusses how to strengthen a course through the use of Anonymous Informal Mid-semester (AIM) feedback. By exploring how to design the questionnaire to maximize effectiveness, faculty can target specific aspects of their course to adapt for any given student cohort. The manuscript covers both classical research on questionnaire creation and practical implementations that improve the quality of student responses. In addition, by generating a dialogue between learners and instructors, one can explain the pedagogical rationale behind specific decisions and further motivate students. This dialogue is sustained by summarizing general trends from AIM feedback and presenting them back to the class. The work also discusses how to use the feedback to generate customized learning methods for individual learners while retaining anonymity.
Students can then personalize their instruction locally and grow and learn in the manner that suits them best. We then discuss how lecturers should use the surveys to improve their classes, specifically without compromising academic rigor in technical disciplines, and cover some potential drawbacks of these mechanisms. The work concludes with analysis and examples of AIM feedback in multiple first-year electrical and computer engineering courses and how these improvements led to better knowledge retention and overall student satisfaction with the course.
The full paper will be available to logged-in, registered conference attendees once the conference starts on June 22, 2025, and to all visitors after the conference ends on June 25, 2025.