Background
This paper assesses students’ English-language writing levels by analysing their written work in a ‘history of Japan’ module, a humanities component in an engineering program at the study authors’ university. We investigate connections between students’ written English levels and their overall academic performance as reflected in exam grades, and we explore how future iterations of the course might be enhanced to increase its effectiveness as a vehicle for developing students’ English writing abilities, creativity, and ‘global mindset’.
The student body for this module, ‘Science and Religion in Japan’, consists entirely of international students, and the course is taught exclusively in English. To advance participants’ logical reasoning capabilities, students are required to write summary-and-response papers as one of their key weekly assignments. With a diverse cohort of students from countries with varying baseline levels of English proficiency, the authors have observed over several years that while most students can articulate their thoughts effectively in verbal discussions, their writing clarity varies substantially. Naturally, we want all international students to gain as much knowledge as they can from the course, without this being wholly contingent on their pre-existing level of English. We therefore continuously look for ways to optimise all aspects of the course format - especially teaching materials and assessment methods - to match student needs as closely as possible, ensuring that the program teaches Japanese history and intercultural skills effectively to students regardless of their background.
To this end, the study also investigates relationships between the CEFR-J levels displayed in students’ writing assignments for individual topics and those seen in the corresponding lecture materials. Does the level of English produced by the student closely ‘mirror’ that of the study material, which could indicate patch writing?
This part of the research provides insights into how course materials and instructions can be better designed to prevent patch writing and instead encourage students to develop their own writing skills to the greatest extent possible.
Methodology
To evaluate students’ written English levels, we utilised the ‘CVLA (v2.0)’ tool and then compared the results with students’ course grades. CVLA stands for ‘Common European Framework of Reference for Languages (CEFR)-based Vocabulary Level Analyzer.’
Results
Using the CVLA, we computed two CEFR-J scores for each of the 22 students taking the course: the first based on their written output in weeks 1-3, and the second on their output in the concluding weeks 11-13. On the CEFR-J scale, ‘Pre-A1’ is the lowest attainable score and ‘C2’ the highest. To simplify quantitative analysis, CEFR-J scores can also be expressed as numbers between 0.5 and 6.0. On this numeric scale, the mean of students’ CEFR-J levels for the first three weeks of our course was 4.955 (B2.2-C1), whereas for the last three weeks the mean was 5.432 (C1-C2); significance testing showed this difference in means to be statistically significant.
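The comparison described above - numeric CEFR-J scores averaged over early and late weeks, then checked for a significant difference - can be sketched as follows. The paper does not name the significance test used; since the same students produce both scores, a paired t test is one standard choice, and that is what the sketch below assumes. All scores in the example are invented for illustration and are not the study’s data.

```python
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for two matched samples
    (e.g. each student's weeks 1-3 score vs their weeks 11-13 score)."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    # t = mean(d) / (sd(d) / sqrt(n)); df = n - 1
    return mean(diffs) / (stdev(diffs) / n ** 0.5), n - 1

# Hypothetical numeric CEFR-J scores (0.5-6.0 scale) for six students,
# early and late in the course -- invented values, not the study's data.
early = [4.5, 5.0, 4.5, 5.5, 5.0, 4.0]
late  = [5.0, 5.5, 5.5, 6.0, 5.0, 5.0]

t, df = paired_t(early, late)
# Compare |t| against the two-tailed critical value for the chosen alpha
# and df (e.g. ~2.080 for alpha = 0.05, df = 21 with 22 students).
```

In practice one would report the exact p-value (e.g. via `scipy.stats.ttest_rel`) rather than a critical-value comparison, but the stdlib-only version keeps the arithmetic visible.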