Cognitive, or “think aloud,” interviewing techniques can help researchers develop new measures and provide validity evidence for the constructs those measures capture [1]. As a methodology, cognitive interviewing is widely used across the social sciences [2], including recent implementations in engineering education research as a piloting strategy (e.g., [3]–[5]). In mixed methods research designs, cognitive interviewing can also improve the triangulation of data sources [6], [7]. We present two studies in which cognitive interviewing (think-aloud) methods were used and in which many of the authors were new to the technique. In the first study, a single interviewer conducted cognitive interviews with 13 graduate engineering students; in the second, two interviewers interviewed 13 undergraduate engineering students. Both studies used an iterative process to revise items between interviews, and both used a field notes technique to modify novel surveys. We present differences between these two cases, including the advantages and disadvantages of using multiple interviewers for cognitive interviews and commentary on the modality of the cognitive interview environment (e.g., in-person vs. virtual, how the questions are presented). We also share lessons learned from this process: the importance of repeating instructions, guidelines for making question language consistent, processes for constructing probing questions and implementing field note methodologies, suggestions for identifying and eliminating idiomatic language and phrases that are confusing to non-native English speakers, removing items during the cognitive interviewing iterations, and using cognitive interviews as a member checking technique in mixed methods studies.
References
[1] Willis, G. B. (2004). Cognitive interviewing: A tool for improving questionnaire design. Sage Publications.
[2] Desimone, L. M., & Le Floch, K. C. (2004). Are we asking the right questions? Using cognitive interviews to improve surveys in education research. Educational Evaluation and Policy Analysis, 26(1), 1–22.
[3] Choe, N. H., & Borrego, M. (2020). Master’s and doctoral engineering students’ interest in industry, academia, and government careers. Journal of Engineering Education, 109(2), 325–346.
[4] Canney, N. E., Bielefeldt, A. R., & Rulifson, G. (2016, June). Exploring interviews as validity evidence for the engineering professional responsibility assessment. Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana.
[5] Fletcher, T. L., Strong, A. C., Jefferson, J. P., Moten, J., Park, S. E., & Adams, D. J. (2021, July). Exploring the excellence of HBCU scientists and engineers: The development of an alumni success instrument linking undergraduate experiences to graduate pathways. Paper presented at the 2021 ASEE Virtual Annual Conference.
[6] Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255–274.
[7] Ouimet, J. A., Bunnage, J. C., Carini, R. M., et al. (2004). Using focus groups, expert advice, and cognitive interviews to establish the validity of a college student survey. Research in Higher Education, 45, 233–250.