2026 ASEE Annual Conference & Exposition

Operationalizing Instructional Complexity for the Curriculum Complexity Methodology: Rubric Development

Presented at Analyzing and Evaluating Curricula

This empirically backed theory/methods full paper draws on interviews with undergraduate students to create rubrics for instructional complexity that supplement traditional quantitative curriculum complexity analytics. Literature studying curricular complexity in undergraduate engineering shows that students often face difficulties progressing in and completing their academic programs due in part to complex and highly sequenced curricula. Curriculum complexity analysis is a methodological tool used to visualize and analyze curricula in order to understand the effects of a curriculum's structure (i.e., prerequisites, course sequencing) on student outcomes. Curriculum complexity is theorized to comprise two types of complexity: structural complexity, defined as the manner in which the curriculum is structured, and instructional complexity, defined as the manner in which courses are taught and supported. Currently, curriculum complexity analyses focus primarily on structural complexity, generating an overall complexity score based on the total number of courses and the extent to which those courses can block or delay future courses. Instructional complexity is qualitative in nature, so pass/fail rates are commonly used as a proxy variable. As a result, current uses of the curriculum complexity method do not capture the full nuance of how instructional complexity affects student success, which can lead some stakeholders to assume that high complexity is automatically bad and should be reduced.

The purpose of this study is to develop a tool that accounts for instructional factors and can be used alongside structural complexity analyses to help contextualize complexity scores. This study draws on Reason, Terenzini, and Domingo's (2006) conceptual framework for influences on undergraduate student learning and persistence to frame our data collection and analysis. We draw on 18 semi-structured interviews with undergraduate engineering students at a Midwestern university. Interviews were analyzed using a priori codes derived from the literature and the theoretical framework, followed by emergent coding. We present our findings as two rubrics that capture how a curriculum can be harder or easier to navigate depending on instructional and student factors. Instructional factors include workload, teaching styles, advising, and teaching assistants, while student factors include peer support and interactions, college readiness, and finances.

The key contribution of this work is the creation of two practical rubrics that surface the qualitative factors implicit in curricular complexity. We intend these rubrics to be simple tools that decision makers can use to contextualize the statistics they receive from a curriculum complexity analysis. Ultimately, having a tool to quantify the other factors influencing how students navigate complex curricula can help stakeholders find nuance in curricular complexity and invest in mechanisms that support students through complex curricula.

Note

The full paper will be available to logged-in, registered conference attendees once the conference starts on June 21, 2026, and to all visitors after the conference ends on June 24, 2026.