Background: Engineering educators regard the ability to find, evaluate, and synthesize technical information as a core competency for engineering undergraduates [1], [2]. However, the content demands of STEM undergraduate curricula often limit instructors' ability to teach students these skills [3]. Engineering librarians frequently partner with biomedical engineering (BME) educators to promote the development of these skills [4], but these partnerships are often limited to guest lectures within either first-year engineering programs or senior design courses [5], [6], [7], [8], [9]. This bimodal implementation strategy suggests a gap in the middle years of the BME curriculum during which students may benefit from additional literacy training.
Purpose: This Work in Progress paper provides an update on a longitudinal assessment of BME students who have matriculated through a scaffolded information literacy training program. This assessment explores whether laboratory courses are an effective context for integrating information and data literacy training into the undergraduate BME curriculum.
Instructional Methods: Students at Vanderbilt University complete a required BME laboratory sequence, taken as a series of one-credit courses in their sophomore (BME 2900W), junior (BME 3900W), and senior (BME 4901W) years. To promote students' skill growth in the areas of information and data literacy, engineering librarians develop and deliver lectures in these courses that introduce students to specialized engineering information sources. These hands-on lectures teach students how to access technical resources efficiently and use them effectively. In BME 2900W, librarians demonstrate how to find methods papers, handbooks, and experimental protocols. In BME 3900W, students learn how to find review articles, patents, and engineering standards. In BME 4901W, librarians provide an overview of managing research data, including best practices for organizing files and designing machine-readable tabular data.
Methods: Beginning in the Spring 2022 semester, we initiated a longitudinal pre-test/post-test assessment of this program. Students completed a pre-test before the instructional intervention in BME 2900 to establish their baseline knowledge entering the training program. Students completed the same pre-test prior to the BME 3900 intervention to assess skills gained and retained since the first intervention. Following the BME 4901 intervention, students completed a post-test to reassess skills gained and retained over the entire training sequence. The assessment instrument, which includes both multiple-choice and open-response questions, is available on the Open Science Framework [10]. Student responses are collected on paper instruments and then transcribed into a spreadsheet containing response identifiers, course information, and answers. This protocol was reviewed by the Vanderbilt University Institutional Review Board and approved as a Quality Improvement project (IRB #232075).
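As an illustration of this layout, each spreadsheet row might record one response to one question in long format; the column names below are hypothetical, and the actual instrument schema is documented on the OSF [10]:

```
response_id,course,q_number,correct
R001,2900,3,1
R001,2900,4,0
R002,3900,3,1
```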
The first portion of the assessment consists of a series of multiple-choice questions (Q3-Q12). We used Analysis of Variance (ANOVA) to compare the proportion of correct answers from each BME class (2900W, 3900W, 4901W). Analyses were performed using the stats package in R 4.3.1.
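A minimal sketch of this comparison in base R follows, assuming the hypothetical long-format layout above; the file name, column names, and aggregation step are illustrative, not the authors' actual analysis script.

```r
# Minimal sketch, assuming the hypothetical long-format layout above;
# an illustration, not the actual analysis script.
responses <- read.csv("responses.csv")
responses$course <- factor(responses$course)

# Proportion of correct answers per respondent per course
scores <- aggregate(correct ~ response_id + course,
                    data = responses, FUN = mean)

# One-way ANOVA comparing mean proportion correct across the three courses
fit <- aov(correct ~ course, data = scores)
summary(fit)
```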
Results: Student performance on the multiple-choice questions at the three time points (2900 Pre-test, 3900 Pre-test, and 4901 Post-test) is reported in Supplementary Table 1. These questions assess students' ability to correctly identify: 1) sources of information to consult when completing different tasks (Q3-Q7); 2) which library-licensed resources to utilize when searching for different types of technical literature documents (Q8-Q11); and 3) what type of document the acronym "IMRaD" applies to (Q12). Because these pre-tests and post-tests were administered on paper to students in attendance in lab, the sample size varies across sections.
Figure 1 shows the proportion of correct responses broken down by class, question, and topic.
The ANOVA results indicate a significant increase in the proportion of correct responses across all test questions with each successive course (mean square = 4.0; F-statistic = 25.9; p < 0.001). The ANOVA results for the course-by-question interaction term indicate that this improvement differed significantly across questions (mean square = 0.43; F-statistic = 2.74; p < 0.001). Since Q3-Q7 address resource types, Q8-Q11 address tool knowledge, and Q12 addresses article structure, these questions were combined into three categories for comparison: Resource, Tool, and Reading. A sketch of this grouping appears below. ANOVA results with the proportion of correct responses for each independent variable are reported in Supplementary Table 2.
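Continuing the hedged sketch above, the interaction model and topic grouping might be expressed as follows; the breakpoints follow the Q3-Q7 / Q8-Q11 / Q12 grouping, and all names remain illustrative.

```r
# Sketch of the course x question interaction model; column names
# carry over from the illustrative layout above.
responses$question <- factor(responses$q_number)
fit2 <- aov(correct ~ course * question, data = responses)
summary(fit2)

# Collapse Q3-Q7, Q8-Q11, and Q12 into the three comparison topics
responses$topic <- cut(responses$q_number,
                       breaks = c(2, 7, 11, 12),
                       labels = c("Resource", "Tool", "Reading"))
fit3 <- aov(correct ~ course * topic, data = responses)
summary(fit3)
```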
Figure 1: Proportion of correct responses for (A) all questions combined, (B) individual multiple-choice questions, and (C) topics covered by the questions (Resource: Q3-Q7; Tool: Q8-Q11; Reading: Q12) for each BME class. Error bars show the standard error for each point.
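A panel of this kind could be sketched in ggplot2 as below, again under the assumed column names; the published figure was not necessarily produced this way.

```r
# Sketch of a Figure 1C-style panel: mean proportion correct with
# standard-error bars per course and topic. Names are illustrative.
library(ggplot2)

summ <- aggregate(correct ~ course + topic, data = responses,
                  FUN = function(x) c(mean = mean(x),
                                      se = sd(x) / sqrt(length(x))))
summ <- do.call(data.frame, summ)  # flatten the matrix column

ggplot(summ, aes(x = course, y = correct.mean,
                 group = topic, color = topic)) +
  geom_point() +
  geom_errorbar(aes(ymin = correct.mean - correct.se,
                    ymax = correct.mean + correct.se),
                width = 0.1) +
  labs(x = "Course", y = "Proportion of correct responses",
       color = "Topic")
```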
Discussion: These results suggest that targeted information and data literacy instruction, offered within laboratory courses, can contribute to science process skill gains for BME students. Students who completed this training sequence showed statistically significant improvement in their ability to identify the best sources of evidence for answering technical questions, as well as in their ability to identify the library-licensed resources that provide access to each type of evidence (Figure 1A). Crucially, these findings also suggest that these skill gains can be sustained over time; students completing the post-test at the conclusion of BME 4901 still demonstrated statistically significant improvement over their baseline scores entering the training sequence. The major exception to this finding is that students across the program showed no improvement in their recognition of the acronym "IMRaD," which may suggest that this mnemonic has limited utility or relevance for undergraduate students (Figure 1B).
There was a substantial improvement in results between the 2900 and 3900 courses. The largest improvements were on Q10 and Q11 (Figure 1B), which assessed students' ability to identify which library-licensed resources can be used to find experimental protocols and engineering handbooks (Cold Spring Harbor Protocols and AccessScience, respectively). This is likely because these tools were newer to the students, while Q8 and Q9 both related to Web of Science, a more widely known platform.
These findings highlight the value of demonstrating specialized engineering information tools to students within a BME laboratory course. These tools provide uniquely useful information for students expected to draft laboratory reports that cite primary and secondary literature sources, yet early-career undergraduate engineering students are unlikely to learn about these specialized tools within an information literacy training session designed for first-year students. These improvements in performance were retained over time, which suggests that for many BME students, a single instructional intervention with a librarian within a laboratory course can promote meaningful information and data literacy skill gains.
Future Work: While these are promising findings, the multiple-choice questions in these tests do not measure how well students can use technical literature; rather, they test recognition of resource types, tools, and article structure. To supplement these multiple-choice questions, students were also presented with three open-response questions (Q13-Q15) that asked them to share their understanding of academic citation practices, methods for reading scientific literature, and approaches to managing research data.
Analysis of these open-response data is ongoing. Responses will be qualitatively analyzed using grounded theory to inductively identify themes and sentiments within the data [11], [12]. This qualitative analysis may reveal changes over the course of the training program in students' understanding of effective strategies for citing evidence, working with primary literature, and managing research data that are not observable in their responses to the multiple-choice questions. Preliminary findings from this qualitative analysis will be shared via a Work in Progress poster at the Annual Meeting in June 2025.