2023 ASEE Annual Conference & Exposition

Board 400: The impact of Oral Exams on Engineering Students’ Learning

Presented at NSF Grantees Poster Session

This project aims to enhance students' learning in engineering courses through oral exams. The adaptive, dialogic nature of oral exams gives instructors an opportunity to better understand students' thought processes, and thus holds promise for improving both the assessment of conceptual mastery and students' learning attitudes and strategies. However, issues of oral exam reliability, validity, and scalability have often been cited as important challenges. As with any assessment format, careful design is needed to increase the benefits to student learning while reducing the potential concerns. Compared to traditional written exams, oral exams have a unique design space spanning a large range of parameters, including the type of oral assessment questions, grading criteria, how oral exams are administered, how questions are communicated and presented to students, how feedback is provided, and logistical considerations such as the weight of the oral exam in the overall course grade and the frequency of oral assessment.

The purpose of our project is to create a framework for integrating oral exams into core undergraduate engineering courses, complementing existing assessment strategies. Because of our focus on these core undergraduate courses, addressing scalability to large class sizes has been one of the primary focal points. Our solution revolves around involving the entire instructional team (instructors and instructional assistants) in the assessment process. However, addressing scalability brings the need to address bias and reliability back into focus. As such, our project is built upon two pillars: (1) creating guidelines for instructors to build oral assessment implementations through a systematic design space exploration that maximizes student learning outcomes; and (2) creating a training program to prepare both faculty and teaching assistants to administer oral exams effectively and fairly. The project implements an iterative design strategy using an evidence-based approach to evaluation. The effectiveness of the oral exams is evaluated by tracking students' performance on conceptual questions across consecutive oral exams in a single course, as well as across other courses.

In last year's poster, we shared our progress in exploring the oral exam design space and developing instructional assistant training materials and project assessment instruments. In this year's poster, we will present our findings from the first two years of the study: (1) an oral exam design guideline generated from a reflective and exploratory dataset covering oral exam implementations in 10 different engineering classes; (2) TA training methods; (3) students' perceptions of oral exams; (4) instructors' and teaching assistants' experiences and perceptions; and (5) how to prepare students for oral exams.

Authors
  1. Dr. Huihui Qi, University of California, San Diego
  2. Dr. Minju Kim (http://orcid.org/0000-0001-5878-7350), University of California, San Diego
  3. Dr. Carolyn L Sandoval, University of California, San Diego
  4. Zongnan Wang, University of California, San Diego
  5. Prof. Curt Schurgers, University of California, San Diego
  6. Dr. Saharnaz Baghdadchi, University of California, San Diego
  7. Dr. Nathan Delson, eGrove Education
  8. Dr. Maziar Ghazinejad, University of California, San Diego
