2023 ASEE Annual Conference & Exposition

A data-driven comparison of students’ performance in asynchronous online versus in-person sections of an introductory graduate statistics course

Presented at Graduate Studies Division (GSD) Technical Session 5: Graduate Student Experience and Decision-Making

Many institutions of higher learning have depended on their online programs to survive. In 2006, the Sloan Survey of Online Learning documented the growth of online education and showed that nearly 6 in 10 chief academic officers agreed that e-learning is "critical to the long-term strategy of their institution" [1]. The COVID-19 pandemic energized that wave as educational institutions sent students home and converted their instruction to online delivery. That momentum has been sustained largely because of the flexibility in time and space that online education affords students and faculty. Seventy-one percent of students surveyed in 2021 reported they would continue at least some form of online learning even post-pandemic [2], and the popularity of online degree programs promises to continue. While campuses are returning to pre-pandemic norms, many universities are experimenting in the fully online space; for example, some have begun offering short online sessions between regular semesters, such as in early January before the start of the spring semester, while students are still away from the university. However, online teaching, particularly of quantitative subjects, can be quite challenging. There are teaching strategies and technology resources that can be employed to give online students the same experiences available to their in-person counterparts, but ultimately the faculty must ensure that students enrolled in online courses achieve the same learning outcomes as in-person students. This study explores data from all 55 online and in-person sections of an introductory graduate statistics course taught over a four-year period, covering a total of 724 students. We examine grades on take-home midterm exams and proctored final exams to determine performance differences. Additionally, we compare the performance of sections that open and fill early with that of sections opened later, to test the conventional wisdom that more dedicated, motivated, and better-performing students enroll early and populate the earlier course sections. Finally, we test differences in overall course average between online and in-person sections.
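The paper does not include its analysis code here, but as an illustration of the kind of comparison described above, a test of the difference in overall course average between online and in-person sections might be sketched in Python as follows. The grade records, column names, and the choice of Welch's two-sample t-test are assumptions made for this sketch, not details taken from the study.

    # Illustrative sketch only: the data, column names, and the choice of
    # Welch's t-test are assumptions, not the study's actual analysis code.
    import pandas as pd
    from scipy import stats

    # Hypothetical records: one row per student, with delivery mode and
    # overall course average on a 0-100 scale.
    grades = pd.DataFrame({
        "mode": ["online", "online", "in-person", "in-person", "in-person"],
        "course_avg": [88.5, 91.0, 84.0, 90.5, 87.0],
    })

    online = grades.loc[grades["mode"] == "online", "course_avg"]
    in_person = grades.loc[grades["mode"] == "in-person", "course_avg"]

    # Welch's two-sample t-test (no equal-variance assumption) for a
    # difference in mean course average between delivery modes.
    t_stat, p_value = stats.ttest_ind(online, in_person, equal_var=False)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")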

Authors
  1. Majid Nabavi, University of Nebraska - Lincoln
  2. Dr. Jena Shafai Asgarpoor, University of Nebraska - Lincoln

For those interested in:

  • Computer Science
  • Engineering
  • Faculty
  • Graduate
  • Undergraduate