The purpose of this work was to test the inter-rater reliability (IRR) of a rubric used to grade technical reports in a senior-level chemical engineering laboratory course in which multiple instructors grade deliverables. The rubric consisted of fifteen constructs that gave students detailed guidance on instructor expectations with respect to report sections, formatting, and technical writing aspects such as audience, context, and purpose. Four student reports from previous years were scored using the rubric, and IRR was assessed for each construct using a two-way mixed, consistency, average-measures intraclass correlation (ICC). The instructors then met as a group to discuss their scores and reasoning. Multiple revisions were made to the rubric based on instructor feedback and on constructs whose ICCs were rated as poor. When constructs rated fair or poor were combined, the ICCs improved. In addition, the overall-score construct continued to be rated as excellent, indicating that while individual instructors may vary at the construct level, they evaluate the overall quality of a report consistently. A key lesson from this process was the value of instructor discussion of the reasoning behind their scores, and the importance of an ‘instructor orientation’ involving discussion and practice with the rubric whenever multiple instructors grade or the instructors change. The developed rubric has the potential for broad applicability to engineering laboratory courses with technical writing components and could be adapted to other technical writing genres.
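For readers unfamiliar with this ICC variant, the sketch below shows how such an estimate could be computed for a single rubric construct in Python. The data, column names, and choice of the pingouin library are illustrative assumptions for exposition, not the authors' actual dataset or analysis pipeline.

```python
# Minimal sketch of a two-way mixed, consistency, average-measures ICC
# for one rubric construct. Assumes pandas and pingouin are installed;
# all scores below are hypothetical, not data from the study.
import pandas as pd
import pingouin as pg

# Long-format data: each row is one instructor's score for one report
# on a single rubric construct.
scores = pd.DataFrame({
    "report":     ["R1", "R2", "R3", "R4"] * 3,
    "instructor": ["A"] * 4 + ["B"] * 4 + ["C"] * 4,
    "score":      [4, 3, 5, 2,  4, 3, 4, 2,  5, 3, 5, 3],
})

icc = pg.intraclass_corr(
    data=scores, targets="report", raters="instructor", ratings="score"
)
# ICC3k is pingouin's label for the two-way mixed, consistency,
# average-measures estimate described in the abstract; the study would
# compute one such ICC per rubric construct.
print(icc.loc[icc["Type"] == "ICC3k", ["Type", "ICC", "CI95%"]])
```

In a study like this one, the resulting per-construct ICC values would then be mapped to qualitative labels (e.g., poor, fair, good, excellent) using published cutoff guidelines to decide which constructs to revise or combine.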