In a flipped, mastery-based Statics course, the students are tested on a single course problem every other week. The instructor grades each problem using a rubric that scores the mastery objectives for that problem. The eight mastery objectives are the key steps required to solve any statics problem, and each requires the student to include text, equations, or a drawing. Following the in-class assessment, the students complete a self-assessment of their work in which they grade themselves on each objective and comment on how they performed on it. The qualitative data collected from these comments capture the depth and richness of the students' reflections on their work in the course. Each assessment yields around 780 student comments, and assessments occur seven times per semester; over the last four years this has produced a volume of qualitative data that was previously infeasible to analyze for meaningful results by hand. However, machine learning approaches have proven beneficial for understanding written comments, and natural language processing (NLP) is an area of machine learning that allows computers to process written text and begin to identify the ideas it contains. This offers a systematic and efficient way to analyze the student reflections. For this study, the initial analysis of the student comments was completed using sentiment analysis, which determines whether a text is positive, negative, or neutral. The results of the sentiment analysis are used to better understand the students' attitudes toward the different parts of the statics problem as they reflect on their work. This paper presents the sentiment analysis results for different assessments throughout a semester, along with how the polarity of student comments compares to the students' scores.
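The paper does not name a specific sentiment-analysis tool. As a minimal sketch of the kind of polarity classification described, the following assumes a lexicon-based classifier such as VADER from NLTK; the example comments and the score thresholds are illustrative, not taken from the study.

```python
# Minimal sketch: lexicon-based sentiment classification of student comments,
# assuming NLTK's VADER analyzer. Comments below are hypothetical examples.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()

def classify(comment: str) -> str:
    """Label a comment positive, negative, or neutral from its compound score."""
    compound = analyzer.polarity_scores(comment)["compound"]
    if compound >= 0.05:       # conventional VADER thresholds
        return "positive"
    if compound <= -0.05:
        return "negative"
    return "neutral"

# Hypothetical self-assessment comments for one mastery objective.
comments = [
    "I drew the free body diagram correctly and labeled all the forces.",
    "I forgot the moment equation again and lost points.",
    "I wrote the equilibrium equations.",
]
for c in comments:
    print(classify(c), "-", c)
```

Polarity labels (or the underlying compound scores) produced this way could then be compared against the rubric scores for each objective, for example with a simple correlation, which mirrors the comparison of comment polarity to student scores described above.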