The use of computers as automated, adaptive instructional tools to support students in STEM education continues to grow. However, these tools often focus on the development of declarative knowledge and procedural skills; it is uncommon and challenging to develop adaptive learning tools that specifically target conceptual understanding. In part, this difficulty stems from limited understanding of how students’ conceptual knowledge emerges through interaction with these adaptive tools. In a previous study, we described the components of a newly developed Crystallography Adaptive Learning Module (CALM) in materials science and quantitatively tested its adaptive logic. In the current study, we use a knowledge-in-pieces framework that views learning as the activation and coordination of resources. We seek to identify and explicate student-tool interactions that promote or hinder the activation of conceptual resources leading to canonical understanding. Using a qualitative think-aloud design, we recorded four students as they completed the CALM and prompted them to explain their thinking. Sessions lasted two to three hours per participant. Audio recordings of students thinking aloud were supplemented by video recordings of their screens as they completed the module. We also collected and analyzed the notes they wrote while completing the CALM. Comparing across the four cases, the activation and coordination of resources were more idiosyncratic than we had previously envisioned. For example, part of the CALM contains three two-part multiple-choice questions used for formative assessment: the initial question poses a conceptually challenging problem, and the follow-up question asks students to select a response that aligns with their reasoning. We constructed the reasoning choices in the second question based on our analysis of students’ free responses in previous terms. While on some questions students found that the follow-up choices aligned with the reasoning behind their initial answer, on most questions they reconsidered their choice based on the reasons provided. There were also instances where students responded based on how they interpreted the tool’s behavior. For example, the summative assessment was designed to be adaptive: students who answered a question correctly received a more difficult question, and those who answered incorrectly received a less difficult one. However, a student who answered correctly sometimes interpreted receiving a similar question as an indication that their first answer had been incorrect. We also describe differences in the ways students negotiated uncertainty and how they engaged with the more extensive instructional tools. This paper contributes to our understanding of both how students conceptually engage with complex materials science content and how student-technology interactions can support or hinder learning.
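
As an illustration only, the following is a minimal Python sketch of the adaptive branching behavior described above (a correct answer routes the student to a more difficult question, an incorrect answer to a less difficult one). The question pool, difficulty scale, and function names are hypothetical and do not represent the CALM’s actual implementation.

# Minimal sketch of difficulty-based adaptive branching.
# All identifiers and content here are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Question:
    prompt: str
    difficulty: int  # e.g., 1 = easiest, 3 = hardest

def next_question(pool: dict[int, Question], current: Question, correct: bool) -> Question:
    """Step difficulty up after a correct answer, down after an incorrect one."""
    step = 1 if correct else -1
    # Clamp to the difficulty levels available in the pool.
    target = max(min(current.difficulty + step, max(pool)), min(pool))
    return pool[target]

# Hypothetical usage:
pool = {
    1: Question("Identify the unit cell shown.", 1),
    2: Question("Determine the Miller indices of the plane.", 2),
    3: Question("Compute the planar density for the (110) plane.", 3),
}
current = pool[2]
print(next_question(pool, current, correct=True).prompt)   # more difficult question
print(next_question(pool, current, correct=False).prompt)  # less difficult question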