As peer mentoring increasingly complements professional advising in academic settings, ensuring effective mentor training remains challenging, particularly because student graduation creates high mentor turnover. This study introduces a Talk-Move Framework that leverages Transformer models, specifically RoBERTa, to automate discourse analysis of peer mentor-mentee interactions. We call this framework PEER HELPER: Peer Engagement for Effective Reflection, Holistic Engineering Learning, Planning, and Encouraging Reflection. Building on established mentoring theories, our analysis framework categorizes dialogues into five key areas: Goal Setting and Planning, Problem Solving and Critical Thinking, Understanding and Clarification, Feedback and Support, and Exploration and Reflection. Using annotated mentoring data from the University of Florida and insights from pre-training on the DSTC7 dataset, the RoBERTa-based model achieved high classification performance, with an accuracy of 94.4%, an F1-score of 0.990, and a precision of 0.991. These results demonstrate the model’s potential to accurately and systematically analyze mentoring dialogues, providing a reliable foundation for further development of AI-powered mentor training tools.
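To make the classification setup concrete, the sketch below shows how a RoBERTa model could be fine-tuned for five-way talk-move classification using the Hugging Face Transformers library. This is a minimal illustration, not the authors' implementation: the "roberta-base" checkpoint, optimizer settings, and example utterances are assumptions made for illustration, and only the five category labels come from the framework described above.

# Minimal sketch (not the authors' implementation) of a five-way talk-move
# classifier built on RoBERTa via Hugging Face Transformers. The checkpoint
# name, hyperparameters, and example utterances are illustrative assumptions;
# only the five category labels are taken from the paper.
import torch
from transformers import RobertaTokenizer, RobertaForSequenceClassification

TALK_MOVES = [
    "Goal Setting and Planning",
    "Problem Solving and Critical Thinking",
    "Understanding and Clarification",
    "Feedback and Support",
    "Exploration and Reflection",
]

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=len(TALK_MOVES)
)

# Hypothetical annotated utterances (text, label index) standing in for the
# University of Florida mentoring transcripts used in the study.
train_examples = [
    ("Let's map out which courses you need before the fall deadline.", 0),
    ("What do you think is causing the error in your circuit analysis?", 1),
]

# One illustrative fine-tuning step on the toy batch.
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
texts, labels = zip(*train_examples)
batch = tokenizer(list(texts), padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=torch.tensor(labels))
outputs.loss.backward()
optimizer.step()

# Inference: assign the most probable talk-move category to a new utterance.
model.eval()
with torch.no_grad():
    enc = tokenizer(["How did that study plan work out for you?"], return_tensors="pt")
    pred = model(**enc).logits.argmax(dim=-1).item()
print(TALK_MOVES[pred])

In practice the model would be trained over the full annotated corpus for several epochs with a held-out evaluation split; the single gradient step above only illustrates the fine-tuning loop's structure.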