The purpose of this work-in-progress (WIP) research paper is to understand how computer science (CS) students perceive their department climates. University CS departments have long attributed a lack of diversity to perceived “deficits” in groups that are historically underrepresented by race, gender, socioeconomic status, and/or disability status (e.g., lack of access to physical devices, K-12 computing courses, computational thinking skills, self-efficacy, and interest). However, a growing body of literature calls attention to departmental policies, practices, and climates that shape students’ sense of belonging and their ability to successfully navigate and complete CS majors.
This work was motivated by two challenges with existing computing climate surveys. First, the majority are designed for distribution by individual departments, with no method for cross-organization analysis and comparison. Second, the most widely used climate survey that does provide cross-organization comparison has a long completion time, removes responses from demographic groups with fewer than five responses per item, and does not allow open-ended responses. This practice effectively “erases” the students who are the least represented and most marginalized in a department, and likely in the discipline; eliminates opportunities for data disaggregation; and reinforces hegemonic department cultures that do not value these nuanced and often very different experiences.
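To make the suppression concern concrete, the following minimal sketch (in Python with pandas, using hypothetical column names and toy data rather than the actual survey items) shows how a fewer-than-five-responses rule removes exactly the groups with the smallest representation from everything a department sees.

```python
import pandas as pd

# Hypothetical toy data; column names and values are illustrative only.
responses = pd.DataFrame({
    "gender": ["man"] * 40 + ["woman"] * 12 + ["nonbinary"] * 3,
    "belonging": [4] * 40 + [3] * 12 + [2] * 3,
})

# A "fewer than five responses per group" suppression rule, as used by some
# existing climate surveys before results are reported.
group_sizes = responses.groupby("gender")["belonging"].transform("size")
reported = responses[group_sizes >= 5]

print(responses["gender"].value_counts())  # nonbinary respondents appear in the raw data...
print(reported["gender"].value_counts())   # ...but are absent from the reported results.
```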
This paper addresses the research question “How do postsecondary CS students perceive department cultures?” through the development and distribution of an instrument that measures student experiences in CS departments across seven major themes: 1) perceptions of inclusion efforts by department community members, 2) comfort discussing concerns with community members, 3) experiences of exclusion, 4) confidence in accommodations being provided (if needed), 5) physical presence questioned in computing spaces, 6) thoughts about changing majors, and 7) satisfaction with department inclusion efforts. Additionally, the survey collects demographic data on race, ethnicity, gender, classification, disability status, and first-generation status, allowing for disaggregation across multiple identities.
The survey was first distributed in the spring 2024 semester across 13 institutions in the U.S., yielding a total of 750 responses. Descriptive statistics were computed for closed-ended items, and a thematic analysis was applied to open-ended responses. Participating departments received their own organizational descriptive statistics and individual, deidentified open-ended responses, as well as aggregate descriptive statistics and high-level themes across all participating organizations.
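As an illustration of the disaggregated descriptive statistics described above, the sketch below (again in Python with pandas; the file name, column names, and Likert-style numeric coding are assumptions, not the instrument’s actual items) groups responses by multiple identity dimensions at once rather than summarizing a single axis at a time, and flags small groups instead of dropping them.

```python
import pandas as pd

# Hypothetical file and column names; the actual instrument's items may differ.
df = pd.read_csv("climate_responses.csv")  # one row per respondent

# Disaggregated descriptive statistics for a closed-ended item,
# grouped across intersecting identities rather than one axis at a time.
summary = (
    df.groupby(["race_ethnicity", "gender", "first_generation"])["inclusion_satisfaction"]
      .agg(["count", "mean", "median", "std"])
      .reset_index()
)

# Small groups are flagged rather than suppressed, so the least-represented
# students remain visible in departmental reporting.
summary["small_group"] = summary["count"] < 5
print(summary.sort_values("mean"))
```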
Results indicated that, across almost all identities, students from groups that are overrepresented in computing had the most favorable perceptions of and satisfaction with cross-department inclusion efforts, as well as the greatest comfort discussing departmental concerns. Students from the least-represented groups experienced the most exclusion, the most questioning of their presence in physical computing spaces, the most thoughts of changing majors, and the greatest dissatisfaction with department inclusion efforts.
This novel instrument has several benefits. First, no responses are dropped, which provides departments with some of the most valuable information for broadening participation and allows departmental “pain points” to be triaged based on the concerns of the most marginalized students. Second, its annual distribution allows departments to longitudinally measure the impact of various interventions at both the individual department and discipline levels. Finally, it can be distributed across postsecondary departments, both STEM and non-STEM.