In this full empirical research paper, we aim to identify the factors that shape students' experiences using the Concept Warehouse (CW) to answer concept questions in mechanical engineering. Instructional practices centered on active learning have been shown to positively impact student outcomes such as retention, engagement, and learning gains [1]–[6]. However, Freeman et al. [1] call for “second-generation research” that explores the relationship between instructional practices and active learning, the relationship between the intensity of active learning and learning gains, and other measures for understanding active learning and its impacts. The use of educational technology to promote active learning has been evaluated previously; however, work remains to be done on the instructional practices, student perceptions, and ecosystems in which such technology is implemented [3], [7]–[10].
The CW is a free, web-based active learning tool and content repository that helps instructors implement student-centered learning. The CW currently serves over 1,700 faculty and 40,000 students and hosts over 3,000 concept questions across various disciplines. Concept questions, commonly called ConcepTests [11], [12], are multiple-choice questions with a single correct answer and little to no math that ask students about the fundamental concepts they are learning. This abundance of resources and community support gives instructors an accessible gateway to concept-based learning at any point in their instructional journey.
Here, we investigate the factors that influence students' experiences with active learning using the CW. We surveyed 448 students across a diverse set of two- and four-year institutions, asking them about their experiences using the CW in their mechanics classes. We then used exploratory factor analysis (EFA) [13] to describe the dimensions of students' experiences around the usage of the CW. We frame our study with the following research question: What are the underlying factors that influence students' experiences and perceptions around using the CW?
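To make the analysis concrete, the following is a minimal EFA sketch. The paper does not specify its software, the number of retained factors, or the rotation; the input file name, the four-factor choice, and the oblimin rotation below are illustrative assumptions, using the open-source Python factor_analyzer package.

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical input: one row per student, one Likert-scale column per survey item.
responses = pd.read_csv("cw_survey_items.csv")

# Check that the correlation matrix is factorable before running EFA.
chi_square, p_value = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_overall = calculate_kmo(responses)
print(f"Bartlett p = {p_value:.3g}, overall KMO = {kmo_overall:.2f}")

# Inspect eigenvalues (e.g., via a scree plot) to decide how many factors to retain.
eigenvalues, _ = FactorAnalyzer(rotation=None).fit(responses).get_eigenvalues()
print(eigenvalues)

# Fit the retained-factor model with an oblique rotation, since student
# perception factors are plausibly correlated; four factors is an assumption here.
efa = FactorAnalyzer(n_factors=4, rotation="oblimin")
efa.fit(responses)
loadings = pd.DataFrame(efa.loadings_, index=responses.columns)
print(loadings.round(2))

Items that load strongly on a common factor are then interpreted together and named, which is how factor labels such as those reported below are typically derived.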
We found two factors for instructional practices: 1) framing and 2) effort to understand; and four factors for student perceptions: 1) positive affect and engagement, 2) deeper learning, 3) impacts on sensemaking, and 4) negative affect. This work contributes to research evaluating student experiences and perceptions around active learning. Furthermore, it can help inform the practices of instructors who aim to integrate concept-based learning into their classes.
Keywords: Educational technology, Quantitative, Factor analysis, Conceptual learning
References
[1] S. Freeman et al., “Active learning increases student performance in science, engineering, and mathematics,” Proc. Natl. Acad. Sci. U. S. A., vol. 111, no. 23, pp. 8410–8415, Jun. 2014, doi: 10.1073/pnas.1319030111.
[2] M. Prince, “Does active learning work? A review of the research,” J. Eng. Educ., vol. 93, no. 3, pp. 223–231, 2004, doi: 10.1002/j.2168-9830.2004.tb00809.x.
[3] M. D. Koretsky, S. B. Nolen, J. Galisky, H. Auby, and L. S. Grundy, “Progression from the mean: Cultivating instructors’ unique trajectories of practice using educational technology,” J. Eng. Educ., vol. 113, no. 2, pp. 330–359, Feb. 2024, doi: 10.1002/jee.20586.
[4] R. R. Hake, “Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses,” Am. J. Phys., vol. 66, no. 1, pp. 64–74, Jan. 1998, doi: 10.1119/1.18809.
[5] T. Vickrey, K. Rosploch, R. Rahmanian, M. Pilarz, and M. Stains, “Research-based implementation of peer instruction: A literature review,” CBE—Life Sci. Educ., vol. 14, no. 1, p. es3, Mar. 2015, doi: 10.1187/cbe.14-11-0198.
[6] D. C. Haak, J. HilleRisLambers, E. Pitre, and S. Freeman, “Increased structure and active learning reduce the achievement gap in introductory biology,” Science, vol. 332, no. 6034, pp. 1213–1216, 2011.
[7] E. C. Miller, S. Severance, and J. Krajcik, “Motivating teaching, sustaining change in practice: Design principles for teacher learning in project-based learning contexts,” J. Sci. Teach. Educ., vol. 32, no. 7, pp. 757–779, Oct. 2021, doi: 10.1080/1046560X.2020.1864099.
[8] H. Auby, J. Galisky, S. Nolen, and M. D. Koretsky, “WIP: Instances of dynamic pedagogical decision making in the uptake of a technology tool,” in Proceedings of the 2022 American Society for Engineering Education Annual Conference & Exposition, Jun. 2022.
[9] R. Khatri, C. Henderson, R. Cole, J. E. Froyd, D. Friedrichsen, and C. Stanford, “Characteristics of well-propagated teaching innovations in undergraduate STEM,” Int. J. STEM Educ., vol. 4, no. 1, p. 2, Feb. 2017, doi: 10.1186/s40594-017-0056-5.
[10] C. Taylor et al., “Propagating the adoption of CS educational innovations,” in Proceedings Companion of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education, in ITiCSE 2018 Companion. New York, NY, USA: Association for Computing Machinery, Jul. 2018, pp. 217–235. doi: 10.1145/3293881.3295785.
[11] E. Mazur, Peer Instruction: A User’s Manual, Series in Educational Innovation. Prentice Hall, 1997.
[12] C. H. Crouch and E. Mazur, “Peer Instruction: Ten years of experience and results,” Am. J. Phys., vol. 69, no. 9, pp. 970–977, Sep. 2001, doi: 10.1119/1.1374249.
[13] L. R. Fabrigar and D. T. Wegener, Exploratory Factor Analysis. Oxford University Press, 2012.