Engineering faculty are regularly encouraged to improve their courses as part of broader efforts to improve engineering education. Those improvement efforts are often measured by an increase in the use of evidence-based practices. While inarguably important, achieving effective, high-quality implementations of evidence-based practices is not automatic or inherent in adopting their underlying principles. Oftentimes it requires an iterative process of learning design, yet that process is far less visible than implementation metrics. As both practitioners and researchers, we (the authors) see the process, tools, and even the existence of iterative implementations of evidence-based practices as useful to others. While there is a body of work on how to train, resource, or support faculty in changing their courses, far less attention has been paid to the experience faculty have in doing so. What happens when a change does not work as anticipated? What is the faculty experience of engaging in iterative changes to create robust implementations of evidence-based practices? How do evidence-based practices differ once the novelty of a new change fades?
Our paper seeks to make visible the challenge, process, and results of iterating on initial implementations of evidence-based practices. The motivation for this project was the instructor’s (i.e., lead author’s) dissatisfaction with grading in a middle-years rigid body dynamics course. He had implemented a new grading system two years prior, shifting from points-based to specifications-based grading. While happy with the switch to specifications grading, details of the initial implementation were resulting in untenable workloads and frustration for both students and the instructional team. That ongoing frustration led to a process of observing and revising his initial implementation.
The full paper will present an autoethnographic study of the instructor’s experience of, and reasoning behind, revising his implementation of specifications grading. The study will use data from an extended ethnographic interview of the lead author (conducted by the second author) as well as artifacts from the various iterations of the course grading scheme. We will focus on the longitudinal experiences a faculty member undergoes after making an initial large change to a course. Our hope is that the results will be useful to both educators and researchers. We expect that educators will identify with our results in ways that allow them to articulate and make sense of curricular change as an iterative process, as opposed to a one-time event that either works or does not. We expect that researchers will find value in documentation of the experiences of faculty who live with and maintain course changes over the long term. Better understanding that experience will aid in developing resources to support faculty in refining and sustaining major changes to their courses, improving the outcomes of change efforts.