The current landscape of introductory computer science education, from block-based environments like Scratch to text-based languages like Python, typically presumes a one-to-one student-to-device model. This paradigm creates significant financial and logistical barriers for many schools, limiting access to coding education. Furthermore, the model often fosters individual, heads-down work, minimizing opportunities for peer collaboration. Beyond the device requirement itself, a common pedagogical hurdle in introductory computer science education is that students must first learn an extensive vocabulary of code blocks or text-based syntax before they can explore more foundational computational thinking skills such as algorithmic thinking, problem decomposition, and abstraction.
This paper introduces SnapBots, a novel educational platform designed to circumvent these challenges. With SnapBots, students program by drawing flowcharts on paper. A single, teacher-operated computer uses a webcam to snap a photo of a student’s drawing, which is then processed by a computer vision and generative AI system that converts the hand-drawn diagram into executable code. This code then animates a character within a digital environment, providing immediate visual feedback on the student’s creation.
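The capture-to-animation pipeline above can be outlined in a few steps. The sketch below is purely illustrative: every function body is a placeholder, since the source does not name the vision model, the generative AI system, or the format of the code it produces.

```python
# Hypothetical outline of the SnapBots pipeline described in the text.
# All function bodies are placeholders; the actual components are not
# specified in the source.

def detect_flowchart(photo):
    # Placeholder: a real system would use computer vision to segment the
    # circles (states) and labeled arrows (transitions) in the webcam photo.
    return {"diagram": photo}

def generate_code(diagram):
    # Placeholder: a real system would prompt a generative AI model with the
    # recognized diagram and receive executable character logic in return.
    return f"program for {diagram['diagram']}"

def snap_and_run(photo):
    """Teacher snaps a photo; the pipeline returns code that animates
    the student's character in the shared digital environment."""
    diagram = detect_flowchart(photo)
    return generate_code(diagram)
```

The key design property this sketch captures is that all heavy computation happens on the single teacher-operated computer; students only supply paper drawings.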
Each flowchart program is made up of states and transitions that a student describes using natural language within a flowchart diagram. Each state is represented by a circle, and the student writes inside the circle to describe what the character should do, from dancing to saying a piece of dialog to playing a sound. Each transition is an arrow that connects two states and is accompanied by a few words explaining when the character should move from one state to another. For example, an arrow connecting the “Move Forward” state to the “Stop” state might be labeled “When Near the Wall,” instructing the character to stop before walking into a wall.
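A flowchart of this kind corresponds naturally to a finite state machine. The sketch below shows one way such a diagram could be represented and stepped, using the “Move Forward”/“Stop” example from the text; the class, the condition function, and the world snapshot are illustrative assumptions, not the actual code SnapBots generates.

```python
# A minimal state-machine representation of a hand-drawn flowchart:
# circles become states, labeled arrows become condition-guarded transitions.
# This is a sketch; the real generated-code format is not described here.

class FlowchartProgram:
    def __init__(self, start_state):
        self.state = start_state
        self.transitions = {}  # from_state -> list of (condition, to_state)

    def add_transition(self, from_state, condition, to_state):
        self.transitions.setdefault(from_state, []).append((condition, to_state))

    def step(self, world):
        # Follow the first outgoing transition whose condition holds in the
        # current snapshot of the world; otherwise stay in the current state.
        for condition, to_state in self.transitions.get(self.state, []):
            if condition(world):
                self.state = to_state
                break
        return self.state

# The "Move Forward" -> "Stop" example, with the "When Near the Wall" label
# interpreted (hypothetically) as a distance check.
program = FlowchartProgram("Move Forward")
program.add_transition("Move Forward",
                       lambda w: w["distance_to_wall"] < 1.0,
                       "Stop")

program.step({"distance_to_wall": 5.0})  # still "Move Forward"
program.step({"distance_to_wall": 0.5})  # transitions to "Stop"
```

Representing programs as guarded transitions like this is what lets students express behavior in a few natural-language labels rather than in block or text syntax.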
SnapBots is intentionally designed to foster collaborative learning. While each student can design their own program on paper, their resulting characters are all introduced into a shared digital playground projected for the entire class to see. This allows for complex interactions between student creations and encourages students to think about how their individual logic contributes to a larger, dynamic system. Students, each with their own pencil, can work together around a single sheet of paper to design agents that interact, solve group challenges, or create complex scenes, promoting communication and teamwork.
By centralizing the hardware requirements to a single computer and a webcam, SnapBots removes the need for a classroom set of laptops or tablets. This approach makes hands-on computational thinking activities accessible to many classrooms and other learning environments with fewer technological resources. It lowers the financial barrier to entry and simplifies classroom management, allowing the focus to remain on learning core concepts.
To evaluate this approach, an early version of the SnapBots platform was piloted in upper-elementary and middle school classrooms during the Fall 2025 semester. This initial in-person research found that, while there were instances of misalignment between how students imagined their characters would act and how the generative AI system coded them, the platform functioned reliably in a real-world classroom environment. Most students reported that the experience was enjoyable, and classroom observations and teacher interviews indicated that students engaged collaboratively in coding characters with hand-drawn flowcharts.
The full paper will be available to logged-in, registered conference attendees once the conference starts on June 21, 2026, and to all visitors after the conference ends on June 24, 2026.