May 2023 – Volume 27, Number 1
https://doi.org/10.55593/ej.27105a4
Hieu Manh Do
Murray State University, USA
<domanhhieubc@gmail.com>
Abstract
This study investigates Vietnamese students’ perceptions of portfolio assessment and how it affects their writing performance. Furthermore, the study explores problems encountered during the assessment process, which could offer professional support for teachers, particularly novice teachers, when applying this formative assessment to teaching writing. Data were collected through students’ written papers, observations, and semi-structured interviews with thirteen low- and intermediate-level undergraduate students. The findings demonstrate aspects of the assessment’s usefulness, namely validity, authenticity, interactiveness, and impact. Accordingly, the assessment provides students with a comprehensive understanding of academic writing that enhances their writing abilities. In particular, students recognize their common grammatical errors, pay more attention to organizing ideas logically, and develop writing habits throughout the writing process, aspects that had not been carefully addressed in their prior education. Nevertheless, a number of issues need to be carefully considered, including idea development, plagiarism knowledge, self- and peer-assessment, and the order of writing steps. Solutions to these concerns are provided in order to improve the reliability and practicality of the assessment.
Keywords: portfolio assessment, academic writing, assessment usefulness, EFL Vietnamese undergraduates
L2 writing is considered one of the most challenging skills not only for EFL learners in general (Alam & Aktar, 2019; Arslan & Gümüş, 2020; Nezakatgoo, 2011; Tseng, 2019) but also for Vietnamese students in particular (Nguyen, 2021; Nguyen & Truong, 2021; Tonogbanua, 2018). In order to help students develop their writing abilities, assessment plays a salient role in teaching and learning. On the one hand, summative assessment (e.g., a timed paper test) is defined as “a snapshot in that it provides a picture of students’ learning at a single moment in time” (McTighe & Ferrara, 2021, p. 4), so it may be unable to detect students’ writing growth and their academic literacy abilities (Ávila, 2012). Formative assessment such as the writing portfolio, on the other hand, which is “a collection of written work, rather than a single writing sample” (Hamp-Lyons & Condon, 2000), has become popular in educational programs (Abrar-ul-Hassan et al., 2021) because it provides an overview of a student’s development over time, allowing teachers to see how learners have improved or declined (Hamp-Lyons & Condon, 2000; McTighe & Ferrara, 2021).
In Vietnamese education, formative assessment has been overshadowed by summative assessment at all educational levels (Le, 2015; Nguyen & Truong, 2021; Pham & Truong, 2021; Tonogbanua, 2018). Le (2015), for instance, reflected this reality by indicating that high school teachers use summative exams primarily to produce grades for classifying students rather than to measure their learning progress. Students, thus, are familiar with a testing culture in which teachers and curriculum designers typically focus on end products (summative assessment) rather than the process of development (formative assessment). As a result, students are taught solely for the purpose of passing high school exams (Nguyen, 2009; Tonogbanua, 2018). This leads to a common scenario in which students learn sample texts by rote before the exams, which has negative impacts on their writing abilities and learning habits (Nguyen, 2021; Tonogbanua, 2018). Students, for example, are unable to write independently and have no idea where to begin; when they are asked to write anything different, they find it difficult to compose. In other words, students are taught about writing (Schmitt & Rodgers, 2020) and about tests (Le, 2015) through “reading and silence, on sitting and reciting” (Schiro, 2015, p. 106). In this case, students have not experienced writing through practice in order to become writers, even though writing requires much practice (Barnes et al., 2020). This learning habit continues at the tertiary level in Vietnamese education, which has attracted much concern from researchers and educators (Pham & Truong, 2021).
Since writing involves cognitive activities that occur over an extended process of practice (Schmitt & Rodgers, 2020), portfolio assessment is supported in teaching writing to Vietnamese students in order to tackle the above writing problems (Nguyen & Truong, 2021; Tonogbanua, 2018). This study, therefore, investigates students’ perceptions of portfolio assessment and how it affects their writing development. Furthermore, the study explores problems that arise during the application process, since contexts with different ages, genders, learning styles, English proficiency levels, and knowledge backgrounds may yield different outcomes (Dinh, 2016; Tabatabaei & Assefi, 2012). It is hoped that the study provides teachers, especially novice teachers, and stakeholders with theoretical and practical knowledge of portfolio assessment and of the problems related to this type of assessment when it is applied to particular students. The study was guided by the following two research questions:
- What are students’ perceptions of writing portfolio assessment?
- What issues should be taken into consideration in teaching/assessing students through writing portfolio assessment?
Literature Review
Writing Portfolio Framework
A writing portfolio comprises a collection of “students’ work that demonstrates to students and others their efforts, progress, and achievements in given areas” (Genesee & Upshur, 1996, p. 99). According to Ferris and Hedgcock (2014), portfolio assessment has been recognized “as a valid and valuable tool for instruction and measurement in many contexts, including college composition, foreign language education” (p. 215). Through writing portfolio assessment, teachers can make inferences about students’ writing ability, evaluate their effort, and understand their needs (Abhakorn, 2014; Stewart et al., 1993; Weigle, 2002). It also connects writing assessment and the learning process, which are considered equally important (Obeiah & Bataineh, 2016). As for learners, writing portfolios provide them with opportunities to prepare their writing better, such as by revising, editing, or getting tutorial help (Tonogbanua, 2018). In other words, portfolio assessment is the process of collecting, selecting, and reflecting on the content of a portfolio (Hamp-Lyons & Condon, 2000). According to Hamp-Lyons and Condon (2000), a portfolio must embrace a collection of writing that includes a multiplicity of texts. The selection process allows writers to collect their best work. Finally, recognizing writers’ strengths, weaknesses, and needs is the process of reflection. Taken as a whole, this process provides teachers with a tangible picture of students’ writing abilities (Hedge, 2000; Hirvela & Pierson, 2000), arouses students’ learning motivation (Cole et al., 1997), and helps learners become more self-directed (Ballard, 1992; Obeiah & Bataineh, 2016). Nonetheless, this assessment demands time and effort from both teachers and students, which may be viewed as its downside (Weigle, 2002).
Writing Portfolio Assessment in EFL Contexts
Over the years, portfolio assessment in teaching writing has received attention from a plethora of researchers (Alam & Aktar, 2019; Arslan & Gümüş, 2020; Biglari et al., 2021; Chung, 2019; Kalra et al., 2017; Obeiah & Bataineh, 2016; Sulistyo et al., 2020). The following section reviews some of the key studies in detail.
According to Obeiah and Bataineh (2016), portfolio assessment was not widely used in Jordanian language classrooms, where traditional writing strategies and summative tests were dominant. Thus, they conducted a study to examine the effect of portfolio assessment in teaching writing, using a quasi-experimental control/experimental group design with 20 students in each group. The results show that writing portfolio assessment has statistically significant positive effects on learners’ writing development. This assessment also promotes critical thinking and autonomous learning among learners. In the Thai context, Kalra et al. (2017) investigated the effects of writing portfolio assessment on 52 senior undergraduates (26 students in the control group and 26 in the experimental group). The study applied the framework of the writing process proposed by Gottlieb (2000) to the experimental group, starting with writing the first draft, receiving peer and teacher feedback, revising the draft, and reflecting upon the final draft. The control group, meanwhile, received traditional teaching and assessment. The quantitative results show that the portfolio assessment group outperformed the control group in writing skills. Thus, the researchers suggest that writing teachers spend time meeting students occasionally during the course in order to prompt students to see their whole learning process and build their confidence.
In addition, Chung (2019) demonstrated the effectiveness of portfolio assessment on the writing development of EFL students with different L1 backgrounds (Taiwanese, Korean, Chinese, Indonesian, Pakistani, Brazilian, and Puerto Rican). These students took foundational English classes in the United States. Similar to Kalra et al. (2017), the writing practice process encompassed a rough draft, peer feedback and instructor comments, final drafts, and reflection. Chung (2019) found that portfolios allowed students the chance to apply their knowledge throughout the learning process and keep track of the course materials; applying this alternative assessment thus promotes future learning. In the same year, Alam and Aktar (2019) explored the effects of portfolio assessment by comparing two groups (control and experimental) in the Saudi Arabian context, which is similar to the Jordanian context in that summative assessment was more popular than formative assessment in teaching. The control group of 20 students was instructed to write short essays without being asked to do peer review or reflection; they then had a final test at the end of the course. By contrast, 20 other learners in the experimental group wrote a number of essays, received feedback from the teacher and peers, rewrote their papers, and reflected on their learning. As a result, portfolio assessment had a major impact on students’ learning, supporting their autonomy in learning. Notwithstanding, the reliability of the assessment would be an issue if teachers applied this approach without developing a scoring rubric.
By employing the same methodology as the aforementioned researchers, Arslan and Gümüş (2020) and Sulistyo et al. (2020) explored learning outcomes and students’ attitudes towards portfolio assessment in the Turkish and Indonesian contexts, respectively. Both studies demonstrate the success of portfolio assessment in helping students monitor their learning process and increasing their learning motivation. In particular, Arslan and Gümüş (2020) discovered that Turkish students performed better on spelling and sentence structure, while Sulistyo et al. (2020) found that Indonesian students performed better on the sub-skills of content and organization. Most recently, Biglari et al. (2021) applied the same method to explore the effects of portfolio assessment on Iranian EFL students’ autonomy and writing skills. In line with the findings of Sulistyo et al. (2020), students showed greater performance in content and organization than in mechanics and grammar. Moreover, portfolio assessment made students more independent and confident.
In summary, all the above researchers found that writing portfolios have a positive impact on students’ writing performance in different EFL contexts. The portfolio helps learners gain a more in-depth understanding of their writing abilities through the writing process (Sulistyo et al., 2020), in lieu of simply receiving scores on a timed essay. Although the Vietnamese context shares cultural and educational norms with those in the earlier studies, formative portfolio assessment has not been addressed comprehensively in Vietnam (Nguyen & Truong, 2021; Tran & Duong, 2020). According to Tonogbanua (2018), the examination culture is one of the reasons summative assessment has dominated for years. This entrenched situation in Vietnamese education does not help students make significant progress in their writing development. As mentioned by Nguyen (2009), tests could not show “all aspects of the learning process, let alone its hindering students from writing effectively under test conditions” (p. 64). In addition, teachers’ minimal knowledge of portfolio assessment may affect their assessment practice. This issue has been raised by recent researchers who have found that not many second language teachers in general, and Vietnamese teachers in particular, have adequate experience and knowledge of writing assessment (Crusan et al., 2016; Le, 2015), especially portfolio assessment techniques (Ai et al., 2019; Nguyen & Truong, 2021). According to Nguyen and Truong (2021), Vietnamese teachers support alternative assessment in teaching writing, but they show limited knowledge and experience of it; as a consequence, the authors suggest more research into the issues that teachers and students face when using formative assessment. These two concerns, the systemic issue of an exam-oriented culture and the individual issue of teachers’ assessment knowledge and skills (Lam, 2020), encouraged the current study to apply portfolio assessment to teaching writing to Vietnamese learners. As for the research method, since most previous studies applied quantitative methods, the current qualitative study explores students’ perceptions of portfolio assessment and how it affects their writing by examining their writing practices in detail, as well as the problems that emerged during the implementation process, through observations and interviews. In doing so, it hopes to provide writing teachers with theoretical and practical knowledge of portfolio assessment applied to EFL writing classes.
Methodology
Participants
Thirteen college students aged 18 to 22 (5 males and 8 females) participated in this study. These students were chosen and invited to the semi-structured interviews because they attended all the classes throughout the semester and submitted all the assignments; thus, they were qualified for data collection. At the time of data collection, they were taking Writing II (focused on paragraphs) at a private university in Ho Chi Minh City, Vietnam. It is worth noting that the English language program includes four obligatory English writing courses, ranging from low (I), intermediate (II), advanced (III), to high advanced (IV) levels, in the first two years. The 13 students are referred to as Participants One to Thirteen. Because these students had not taken any standardized English language tests, the researcher asked them to complete a pre-writing task to ascertain their writing ability. Based on the results, students’ writing abilities were low in general; the average writing score was around 5 out of 10 (Appendix B, Table 3). Only two students (Participants Eleven and Twelve) received scores of seven or higher. Organization and grammar/spelling were the two sub-skills that students had the most trouble with. With regard to their learning history, the students were familiar with teacher-centered instruction and had studied English at high school for the purpose of exams, with teaching focused mostly on final exams.
Instruments
To answer the research questions on students’ perceptions of portfolio assessment and the problems encountered throughout the process, the researcher collected emic data from the pre- and post-writing tasks, students’ papers written during the portfolio project, observations, and semi-structured interviews. The researcher introduced the project and the course to participants on February 3, 2021, and consent statements were obtained on February 10, 2021, before the project started. Table 1 presents an overview of the process, followed by descriptions of each data collection instrument. It is hoped that such triangulation supports the credibility, transferability, and dependability of the qualitative research (McKay, 2006).
Table 1. Calendar of the Project
Date | Procedure |
Feb 3, 2021 | Project introduction and course introduction |
Feb 10, 2021 | Consent statements; pre-writing task |
Feb 17 – May 5, 2021 | Students’ papers written during the portfolio project; observations (first draft, peer review, teacher feedback, revision) |
May 12, 2021 | Post-writing task |
May 19, 2021 | Interviews |
Pre- and Post-Writing Tasks. To gauge the students’ writing abilities, the researcher conducted a pre-writing task focused on a paragraph at the beginning of the project (week one). The reason for this focus is that students had completed Writing I, which is at a low level, and had not yet studied essay writing (Writing III). The teacher asked students to write a paragraph about their university life, so they would have ideas to write about and could express their thoughts to the reader, the instructor. Students were given 40 minutes to write in class under the teacher’s supervision. Before students began writing, the teacher presented the task and directions to ensure that no one was confused about the writing task. At the end of the term, students were asked to do a post-writing task on a different topic (What does the word “give up” mean to you?) under the same time constraint (40 minutes). The purpose of this task was to help students see what they had learned and achieved after a period of practicing writing, so that they could share their perceptions of portfolio assessment in the interviews. Thus, the researcher did not focus on statistical analysis of these scores. In other words, this was a semi-longitudinal study that used a pre-/post-writing task design but relied mainly on qualitative data (observations, interviews, and students’ written texts).
Students’ Papers Written During the Portfolio Project. From week three to week 12, the study adapted the writing portfolio framework developed by Hamp-Lyons and Condon (2000) for teaching and assessment, with three components: collection, selection, and reflection. In particular, students were asked to brainstorm ideas in class and write drafts at home, do peer review, revise their drafts, and then submit them to the teacher. The teacher then commented on the students’ papers and spent 30 minutes answering students’ concerns in class. The teacher applied the commentary sandwich style described by Ferris (2008) to engage students and avoid grading anxiety (Ferris & Hedgcock, 2014; Weigle, 2007): the teacher began and ended the feedback with positive remarks and provided constructive criticism or suggestions in the middle, without showing grades. The comments addressed both surface-level features (grammar, spelling, and word choice) and deeper levels (content and structure of the writing product). The final task was revision. When this process was completed, the teacher moved on to the second round and continued in this way until the end of the course. In order to keep the writing process efficient and effective (Schmitt & Rodgers, 2020), the teacher gave students one week to write their drafts and another week for feedback and revisions. Students were allowed to select their best paper written during the course as their final test and to provide a reflection on it. The whole writing process is described in Figure 1.
Figure 1. Writing Portfolio Assessment Model
Observation. The teacher observed the classroom and students’ attitudes towards each step of the writing process: for example, how students worked together when asked to do peer review and brainstorm ideas, how they responded and acted when receiving feedback, and how they performed in their papers throughout the course. The observations were noted by the researcher after every class and writing round.
Interviews. In the interviews, students were asked about their perceptions of portfolio assessment, problems encountered during the implementation, and the teacher’s qualities. Semi-structured interviews were conducted in the students’ L1 (Vietnamese) to help respondents express their perceptions openly and naturally and provide more in-depth information. Students did not know the interviewer (she teaches at a different school), so they could feel free to share their responses, which supports the trustworthiness of the answers. The interviewer listened to their responses about the writing portfolio assessment applied throughout the semester and, following their responses, asked further questions for more information. The 13 students were invited to a semi-structured interview (20 to 30 minutes per student) with their agreement, and the schedule was negotiated (two sessions: morning and afternoon) before the interviews. The interviewer recorded and took notes simultaneously while interviewing students for further analysis. The total number of words translated from Vietnamese to English is 1,515. It is important to keep in mind that students’ responses were transcribed and then translated from Vietnamese to English, so the transcripts presented in this study are not the students’ original words.
Data Analysis
To answer the research questions, the results of the pre- and post-writing tasks and the papers written during the project were analyzed. In particular, all papers were marked according to a writing rubric with five criteria, namely focus, content, organization, grammar and spelling, and word usage, developed by Wang and Liao (2008). Table 2 shows an example for the content sub-skill (detailed information can be found in Appendix A). This rubric was adopted because it meets the purposes of portfolio assessment by giving students comprehensive feedback about their writing strengths and weaknesses based on their performance in the five sub-skills. In other words, students become aware of what they have done well and what areas need improvement (McTighe & Ferrara, 2021). Furthermore, using the rubric to assess students’ work increases the reliability and validity of the assessment (Alam & Aktar, 2019; Bachman & Palmer, 1996; Ferris & Hedgcock, 2014; Kalra et al., 2017): all students were evaluated against the same criteria, even though they were evaluated on several occasions (Weigle, 2002). The teacher collected the students’ assignments, returned them with feedback, and advised students to keep their papers until the end of the course.
Table 2. Content Sub-Skill Proposed by Wang and Liao (2008)
Score | Content |
10-9 | Using specific appropriate details to support topics or illustrate ideas |
8-7 | Using appropriate details to support topics or illustrate ideas |
6-5 | Using some details to support topics or illustrate ideas |
4-3 | Using inappropriate or insufficient details to support topics or illustrate ideas |
2-1 | Using few or no details or irrelevant details to support topics or illustrate ideas |
In addition to assessing students’ papers, observations were conducted throughout the project and interviews at the end of the course. When all the data had been gathered, the transcripts and notes were selected and analyzed under two themes: students’ perceptions of portfolio assessment in relation to their writing development, and problems emerging during the process. Importantly, students’ responses in the interviews were analyzed in parallel with their written papers. For instance, if they talked about the benefits of written portfolio assessment regarding organization, the researcher would check their papers on this writing element to see whether the two were consistent. Likewise, the practical concerns related to the writing portfolio assessment shared by students were analyzed and compared with the teacher’s observations.
Results and Discussion
Students’ Perceptions of Writing Portfolio Assessment
Students in this study had positive perceptions of writing portfolio assessment, which is congruent with the findings of Arslan and Gümüş (2020), Obeiah and Bataineh (2016), and Sulistyo et al. (2020) in EFL contexts. In particular, the writing portfolio assessment provides students with comprehensive knowledge of academic writing across the five sub-skills through the provision and consistent use of a writing rubric. All participants reported that they received detailed feedback from the teacher, which helped them improve their writing skills in a holistic way. Importantly, students understood that they needed to consider not only the micro-writing levels (grammar, spelling, and word choice) but also the macro-writing levels (content, organization) when writing academic papers.
Teacher feedback is very objective by following five criteria that were introduced at the beginning of the course. (Participant Two)
The teacher followed five criteria to evaluate our writing. It helped me clearly understand the feedback and professionally improve my writing. Before, I just knew my writing scores without any comments and knowing those writing elements. (Participant Four)
Under the instruction of the teacher, I understand that I need to pay attention to all five sub-skills when writing. (Participant Ten)

The teacher strictly followed five writing criteria when giving feedback on my paragraphs, so this encouraged me to be more responsible for my writing. (Participant Eleven)
In comparison to their first writing, students performed better, as seen in Table 4 (Appendix B). In the first writing task, only two students (Participants Eleven and Twelve) received scores of around seven, whereas eight students received scores of seven or higher at the end of the course (post-writing). Although the thirteen students’ average scores on organization and grammar were the lowest, these two sub-skills also showed the greatest improvement (Tables 3 & 4, Appendix B). This suggests that organization and grammar are two of the most difficult sub-skills for these low-level students, which should be taken into consideration by language teachers. Since EFL teachers in general (Jaliyya & Idrus, 2017; Nguyen & Habók, 2021) and Vietnamese teachers in particular (Pham & Truong, 2021) have focused more on correcting linguistic forms, this finding indicates that the organization of a written text should not be overlooked. It demonstrates that students had trouble structuring and integrating their ideas in writing. In fact, the majority of students used to write without paying attention to organizing their ideas logically. However, after several rounds of practicing writing under the portfolio assessment, students started focusing more on the sub-skill of organization, which matches the findings of Obeiah and Bataineh’s (2016) study.
Before taking this course, I used to write anything that came to mind without properly organizing my thoughts, and I had no idea how to organize and express my ideas in English. (Participant One)
I used to write based on my emotions, which resulted in a lack of consistency in my work. I didn’t know how to put the concepts together in a logical order. This activity has not been addressed carefully before. (Participant Nine)
In particular, the students in this study did not identify or pay much attention to the connections between topic sentences, supporting sentences, and concluding sentences when writing in the first few weeks. This issue has been highlighted by Negari (2011) and Nguyen (2021), who found that students lacked knowledge and skills in organizing ideas. It can be seen that the exam-oriented approach does not provide students with comprehensive knowledge about academic writing, since the students in this study did not pay attention to the organization of a paper or did not know how to organize and develop their ideas. This is the first written performance of Participant Two: “I am studying at X university. This university is a place that teach students a lot of things. There are many clubs help students to study. Students always help each other to achieve best result. Sport club are …. Teachers are friendly…” It is clear that the student listed ideas without organizing and developing them. Participant Two was not alone: the same problem appeared in other students’ pre-writing tasks. According to Schmitt and Rodgers (2020), the entire text must be coherent, which means that “various parts of the text have to work together conceptually in the particular rhetorical context” (p. 284). Considering this problem, the teacher provided students with some writing examples and analyzed the organization and structure of the writing thoroughly. Weeks later, students performed better, as seen in this performance: “Courage is mental or moral strength when people are facing and deciding to do something in dangerous situations. For example, in the Covid 19 pandemic, many doctors, nurses…. They are the frontline to work hard and ready to face the risks from caring for the infected patients…” (Participant Two). It can be seen that the student knew how to clarify and develop the major topic sentence and to explain it by giving the example of doctors who work in dangerous situations. This improvement was also recognized by other students in the following comments:
Organization was one of the sub-skills that I improved most after taking this course. (Participant Four)
After taking this course, I gained a new perspective on how to write an English paragraph. The writing’s coherence and cohesion were also improved. (Participant Five)
What I was concerned about when taking this course was how I could improve the organization and conventions of writing. Now I’m able to write the sentences in a more logical manner. I was taught how to focus on the key points and make the paragraph more attractive to readers. (Participant Eleven)
I have trouble developing the ideas that result in my continuously off-topic essays. Organization is the sub-skill that I improved significantly. I can identify the structure of writing and know how to outline a writing topic. (Participant Twelve)
By the same token, participants’ grammatical ability greatly improved over the course of the study, as seen in their weekly written performances. Students noticed their grammatical errors and reduced them in subsequent writing assignments during the portfolio assessment project. In their early work, for example, students frequently made grammatical errors related to verb agreement, tense, spelling, and adjectives and nouns (e.g., my university life is very difference; when I have difficult). This aligns with Biglari et al.’s (2021) finding that Iranian EFL learners made the same types of errors.
The aspect that I liked most about the portfolio assessment was that I received feedback from my teacher. He spent lots of time figuring out my common mistakes, which prevented me from making them again. (Participant Six)
I understand and am more confident in starting to write now. During this portfolio project, I recognized my common grammatical errors. (Participant Nine)
The teacher’s role during the writing portfolio process is thus critical in helping students develop their writing, particularly low-proficiency students. The support from the teacher gave students more confidence, informed them about the writing routine, and helped them recognize their common writing mistakes. Talking with teachers, in other words, allows students to reflect on their own learning progress (Farr, 1990). As mentioned by Murphy (1992), teachers’ comments not only guide and assist students during the learning process but also help them recognize their writing weaknesses and strengths. In addition, receiving teacher feedback enriches students’ learning with new ideas, perspectives, suggestions, and support (Abhakorn, 2014), allowing them to gain a better understanding of their writing abilities and performance as writers (Hirvela & Pierson, 2000). These roles appear to be especially appropriate for low achievers since they foster learning confidence by allowing students to share their writing difficulties with the teacher. It is also important to give students opportunities to self-regulate, revise, and reflect on their learning after receiving feedback; implicit or indirect feedback, for instance, could be a good option to encourage students’ self-learning. In this way, the teacher under writing portfolio assessment is not only a provider but also a facilitator, which differs from the teacher-centered role in traditional assessment (Schiro, 2013). The following comments demonstrate these perspectives:
I’m more confident in my writing now. I like to receive feedback so that I can notice my errors and reduce making them again in the following writing assignment. (Participant One)
What I liked most about this assessment is that I received feedback on the mistakes I made in my writing. (Participant Eleven)
Now, I’m familiar with the routine of writing process. The teacher was willing to support me at any time. He gave me opportunities to express my opinions on the writing and then advised me to make it better. His feedback on my writing is useful and supportive for the revisions. (Participant Twelve)
All in all, the writing portfolio assessment reveals to the teacher some major writing problems in students’ performances during the process of practice. As for students, it helps them improve their sub-writing skills in a comprehensive way with the teacher’s support during the project and increases their learning motivation. Nevertheless, some practical concerns emerged during the implementation process, which are presented in the following section.
Practical Concerns
There are some issues that need to be taken into consideration by writing teachers, especially if writing portfolio assessment is applied to lower-proficiency students. These are idea development, plagiarism knowledge, self- and peer-assessment, and the writing process.
First and foremost, the lower-level students in this study usually struggled with idea development due to a lack of reading habits and vocabulary. Based on the observations, the instructor realized that students had problems with the initial step, brainstorming ideas, when practicing in class, and that this step took them a long time. Concerning this issue, Participant Eleven, who scored seven on the pre-writing task, shared her learning experience: “We should read more to enrich vocabulary, ideas, and learn how to structure writing better through reading books or articles.” Indeed, the majority of students admitted that they did not read frequently, which is one of the reasons why they struggled to come up with topics and ideas for their writing. Reading, according to Grabe and Zhang (2013), has a positive impact on writing abilities. In the same vein, Hyland (2019) stated that writing skills cannot be developed in isolation and must be supplemented by reading, which provides learners with not only new content knowledge but also rhetorical, structural, grammatical, and vocabulary knowledge. Unfortunately, this issue has not been carefully addressed in the writing classroom. In fact, the English program at the tertiary level offers four courses for the four English language skills independently, and language teachers teach each skill separately (writing teachers teach writing; reading teachers teach reading). Based on this first concern, the connections between skills, particularly reading and writing, should be given more attention.
Second, some students practiced writing at home without being aware of plagiarism, which limits their writing development. Some students used the Internet and Google Translate for their writing when they had difficulty expressing their ideas in English, and some admitted that they borrowed ideas in Vietnamese from websites and then translated them into English. According to Do (2022), this practice should be avoided because it is considered plagiarism. This is an urgent issue in Vietnamese education, as it has also been identified by researchers recently (Do, 2022; Tran, 2021). In order to avoid the influence of outside assistance and resources, some students wanted in-class timed writing under the supervision of the teacher (Participants Three, Four, Five, & Seven). For instance: “The teacher should let students write in class frequently to help students work independently, preventing us from using supportive tools such as Google Translation, the Internet, or spelling check when staying at home” (Participant Three) or “writing from home allows us to become less mindful of the time constraint” (Participant Seven). Indeed, this suggestion is both feasible and essential for students “to be able to recognize, write, and edit a composition in a relatively short amount of time” (Weigle, 2002, p. 174). However, this second concern lies in students’ learning behavior and plagiarism knowledge. As mentioned earlier, students were familiar with traditional learning and teaching methods (learning by rote) (Nguyen, 2009; Nguyen & Truong, 2021; Pham & Truong, 2021; Tonogbanua, 2018), so the shift to a portfolio culture may lead to this phenomenon.
The third concern is related to self- and peer-assessment. Students at lower levels of language proficiency in this study were usually not confident enough to provide feedback on their classmates’ writing. These participants were afraid of giving incorrect comments since they did not know much about their own writing abilities. When the teacher observed the classroom in the first few weeks, students just read the papers written by their peers and had no idea how to comment or provide feedback, even though the rubric was provided. Most of them did not give any comments on their friends’ papers, and they seemed passive in this activity. Along with their low proficiency level, students had received little training in providing feedback. As mentioned in the literature review, students regard teachers as supreme authorities, which prevents them from voicing their opinions, so they were embarrassed when asked to do peer review, which they had never done before. This is an important issue that should be taken into consideration by language teachers, who may investigate their target students’ needs (Do & Cheng, 2021) regarding this problem beforehand in order to provide suitable training. As Coombe et al. (2012) emphasized, teachers should involve students in the process of assessment to improve language teaching and learning. To do this, teaching students how to engage in healthy self-assessment and effective peer assessment is essential. This third problem was shared by some low-proficiency students:
I was not confident in providing feedback to others. I like receiving feedback from the teacher because I can understand what he means easily. (Participant Two)
Doing peer review is not productive. I’m not sure whether my feedback is right or wrong. (Participant Nine)
Last but not least, the steps of the writing process (planning, drafting, and revising) are not always linear. In fact, these steps seem to be suitable for weaker students, who follow them step by step. Nevertheless, students at higher levels (Participants Eleven & Twelve) may not always go through those steps in an orderly fashion. They shared in the interviews that revisions and idea development may have taken place in their minds, so it did not take them much time to follow the prescribed order. As Schmitt and Rodgers (2020) mentioned, “experienced writers writing in a familiar rhetorical situation may be able to rehearse so extensively in their heads that their first drafts require relatively few revisions” (p. 283). As a result, it is important to remain flexible when using portfolio assessment with students at different proficiency levels (Arslan & Gümüş, 2020).
To summarize, writing portfolio assessment should be applied efficiently and effectively, with special attention to idea development, plagiarism knowledge, self- and peer-assessment, and the writing process.
Implications
According to the results of the current study, the application of portfolio assessment can be beneficial in the teaching of EFL writing, especially with lower-level students. Importantly, writing teachers may consider the four thorny issues above when applying this type of assessment to particular learners, along with the following practical suggestions. Group work or pair work is highly recommended as the main activity in writing classes (brainstorming, outlining ideas), as it lets students gain more comfort and ease in building and developing ideas within a learner-centered approach. In addition to writing tasks, read-to-write activities such as reading for pleasure and sharing (Do & Phan, 2021) would benefit the development of ideas, vocabulary, organization, and so on. In other words, reading could be an extra activity that should not be ignored in writing classes. Teachers may provide reading relevant to the writing topics so that students are able to develop their ideas and strengthen their literacy skills (Ferris & Hedgcock, 2014). Besides, selecting popular writing topics for students at low levels is recommended in order to let them incorporate their own personal experiences and perspectives into their writing.
Concerning the use of supporting tools and external resources when writing at home, plagiarism tutorials and guidance on how to use source information appropriately, without violating academic integrity, should be provided. This also strengthens students’ self-learning attitudes. Administrators should consider this problem and provide plagiarism detection tools to support teachers in teaching writing. In-class timed writing could also be included in the writing portfolio process (e.g., as a mid-term) to evaluate students’ writing abilities after several rounds of practice under portfolio assessment. It is important to keep in mind that the purpose of this activity is to let students know what they have done well and what areas they need to improve for the rest of the unit under a time constraint. With regard to self- and peer-assessment, detailed instructions about writing rubrics, together with training, should be provided to help students make suggestions for their revisions and know how to give feedback to their peers. In doing so, the rubric becomes both a teaching and a testing tool, which fosters self-reflection habits in students and helps them understand teachers’ expectations. Peer review can be completed prior to teacher feedback to give students more chances to think about their writing and discuss ideas with their friends. Accordingly, developing learners’ self-reflective abilities should be prioritized (Lam, 2020). These suggestions may appear well known, understandable, and practical in some contexts. However, they are significant in contexts where traditional assessment and teacher-centered approaches predominate, where students receive little training in self- and peer-assessment and have few opportunities to practice writing throughout the portfolio assessment process in order to become competent writers. Hence, these suggestions may help teachers avoid the four aforementioned practical issues when applying portfolio writing assessment in contexts where students were previously familiar with summative assessment.
For the writing process, to manage the workload, which is the downside of portfolio assessment (Ferris & Hedgcock, 2014; Nhung, 2016), and to give students ample time to let their papers “sit for a while” (Schmitt & Rodgers, 2020, p. 283) and reflect carefully on their writing (Chung, 2019; Kalra et al., 2017), the production of a written text could be divided into three tasks (pre-task, task completion, task review) (Christison & Murray, 2014) spread over two weeks. In the pre-task, teachers give students a writing topic and ask them to brainstorm ideas about the assigned task individually or with their peers. Students then write their first draft individually (task completion). In the task review, students send their papers to their peers (making sure the writing rubric and guidance are provided). After that, students revise their drafts and submit them to the teacher. Based on the current study, there are some suggestions for teachers who want to carry out these tasks effectively. The first and second tasks could be done in one week, with the second week reserved for feedback and revisions. It is helpful to ask students to write the first draft in class; the revising step can then be done at home. If teachers do not have much class time for the first draft, a 10- or 15-minute composition is suggested (Hartshorn et al., 2010). Teachers may also consider taking a one-week break after finishing a writing assignment with its three tasks, to refresh students’ minds and reduce their own workload before moving on to the next writing topic or genre.
Conclusion
The current study demonstrates the usefulness of portfolio assessment in teaching academic writing, which is applicable for language teachers in EFL contexts. This assessment improves students’ writing considerably across the five sub-skills of academic writing (writing goal – validity). Importantly, students come to understand the importance of logical organization and recognize some of their common grammatical errors (identifying and improving weaknesses – impact). Furthermore, this form of assessment provides students with a comfortable learning atmosphere, allowing them to understand their writing abilities and engage in the writing process (interactiveness). Students therefore support teachers in applying this approach to future writing courses (students’ needs and wants – authenticity): “I have a strong motivation to take another writing course in which the writing portfolio is integrated” (Participant Four), and “I am looking forward to taking the class using the writing portfolio approach in the upcoming semesters” (Participant Ten). Having said that, the assessment should be applied effectively and efficiently, and its reliability and practicality improved, by considering the important issues discussed in the Implications section. For instance, group work is effective for brainstorming, but peer assessment and self-assessment need more training beforehand. Additionally, lessons on plagiarism ought to be part of writing instruction, as they raise students’ awareness and shape their learning attitudes. Once students are aware of this form of academic dishonesty, they can become autonomous writers; otherwise, practice under portfolio assessment may become meaningless. Last but not least, the time allowed and the order of the writing process should remain flexible for the target students and particular classroom settings. Future researchers might consider enlarging the number of participants and including different levels of English language proficiency (e.g., advanced students) to expand the transferability of the research, which is one of the limitations of this study. Studies investigating the problems of writing portfolio assessment in other contexts where summative assessment is dominant are also recommended. This could provide a tangible picture of the problems of portfolio assessment when it is applied to different EFL learners, and thus help writing teachers forestall those issues with effective strategies and solutions.
About the Authors
Hieu Manh Do is a Doctoral candidate (Doctor of Arts in English pedagogy – TESOL) at Murray State University, USA. He earned his master’s degree in TEFL from National Chung Cheng University, Taiwan. His main interests include Second Language Writing, Methods for Teaching ESL/EFL, English for Specific Purposes (ESP), and English for Academic Purposes (EAP). ORCID ID: 0000-0002-7780-263X
Acknowledgments
I would like to thank Do Mai Chi (Western Hanoi School) who helped me interview students, and the three anonymous reviewers of TESL-EJ for their valuable comments on earlier drafts of this article – they have really helped to shape this paper for the better.
To Cite this Article
Do, H. M. (2023). Pedagogical benefits and practical concerns of writing portfolio assessment: Suggestions for teaching L2 writing. Teaching English as a Second Language Electronic Journal (TESL-EJ), 27(1). https://doi.org/10.55593/ej.27105a4
References
Abhakorn, M. J. (2014). Investigating the use of student portfolios to develop students’ metacognition in English as a foreign language learning. Journal of Language Teaching and Research, 5(1), 46-55. https://doi.org/10.4304/jltr.5.1.46-55
Abrar-ul-Hassan, S., Douglas, D., & Turner, J. (2021). Revisiting second language portfolio assessment in a new age. System, 103, 102652. https://doi.org/10.1016/j.system.2021.102652
Alam, M., & Aktar, T. (2019). Assessment challenges & impact of formative portfolio assessment (FPA) on EFL learners’ writing performance: A case study on the preparatory English language course. English Language Teaching, 12(7), 161-172. https://doi.org/10.5539/elt.v12n7p161
Arslan, R. Ş., & Gümüş, S. N. (2020). The effect of portfolio-keeping on EFL young learners’ writing achievement and their motivation towards writing. Journal of Narrative and Language Studies, 8(14), 130-150. https://hdl.handle.net/11499/37133
Ávila, J. (2012). The fight’s not always fixed: Using literary response to transcend standardized test scores. English Journal, 101-107. https://www.jstor.org/stable/23365405
Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice: Designing and developing useful language tests (Vol. 1). Oxford University Press.
Ballard, L. (1992). Portfolios and self-assessment. English Journal, 81, 46-48. https://www.jstor.org/stable/819972
Barnes, J., Colquhoun, J., Devlin, M., Heels, L., Lord, P., Marshall, L., & Witty, C. (2020). Designing a portfolio-oriented curriculum using problem-based learning. In Proceedings of the 4th Conference on Computing Education Practice 2020 (p. 1-4). https://doi.org/10.1145/3372356.3372367
Biglari, A., Izadpanah, S., & Namaziandost, E. (2021). The effect of portfolio assessment on Iranian EFL learners’ autonomy and writing skills. Education Research International. https://doi.org/10.1155/2021/4106882
Christison, M. & Murray, D. E. (2014). What English language teachers need to know Vol. III: Designing curriculum. Routledge.
Chung, S. J. (2019). The role of self-reflective essays in ESL writing portfolio. Korean Journal of General Education, 13(6). https://j-kagedu.or.kr/upload/pdf/kagedu-13-6-315.pdf
Cole, K. B., Struyk, L. R., Kinder, D., Sheehan, J. K. & Kish, C. K. (1997). Portfolio assessment: Challenges in secondary education. The High School Journal, April/May, 261-272. https://www.jstor.org/stable/40364458
Coombe, C., Davidson, P., O’Sullivan, B., & Stoynoff, S. (Eds.). (2012). The Cambridge guide to second language assessment. Cambridge University Press.
Crusan, D. (2010). Assessment in the second language writing classroom. University of Michigan Press.
Crusan, D., Plakans, L., & Gebril, A. (2016). Writing assessment literacy: Surveying second language teachers’ knowledge, beliefs, and practices. Assessing Writing, 28, 43-56. https://doi.org/10.1016/j.asw.2016.03.001
Dinh, T. N. (2016). A study on the application of writing portfolio technique to second year English majors. Master’s thesis, Vietnam National University, Hanoi. http://repository.vnu.edu.vn/handle/VNU_123/40402
Do, H. M. (2022). Plagiarism tutorials in teaching academic writing for L2 Vietnamese undergraduates. International TESOL Journal & Technology Journal, 17(1), 44-62. https://connect.academics.education/index.php/itj/article/view/132
Do, H. M., & Cheng, Y. H. (2021). Needs analysis of hotel front desk staff: Considerations for ESP course design. International Journal of English for Specific Purposes, 1(1), 69-92. https://connect.academics.education/index.php/ijesp/article/view/149/85
Do, H. M., & Phan, H. L. T. (2021). Metacognitive awareness of reading strategies on second language Vietnamese undergraduates. Arab World English Journal, 12(1), 90-112. https://doi.org/10.24093/awej/vol12no1.7
Ferris, D., & Hedgcock, J. (2014). Teaching L2 composition: Purpose, process, and practice (3rd ed.). Routledge.
Genesee, F., & Upshur, J. A. (1996). Classroom-based evaluation in second language education. Cambridge University Press.
Gholami, H. (2016). Self-assessment and learner autonomy. Theory and Practice in Language Studies, 6(1), 46 – 51. https://doi.org/10.17507/tpls.0601.06
Grabe, W., & Zhang, C. (2013). Reading and writing together: A critical component of English for academic purposes teaching and learning. TESOL Journal, 4(1), 9-24. https://doi.org/10.1002/tesj.65
Hamp-Lyons, L., & Condon, W. (2000). Assessing the portfolio: Principles for practice, theory, and research. Hampton Press.
Hartshorn, K. J., Evans, N. W., Merrill, P. F., Sudweeks, R. R., Strong‐Krause, D., & Anderson, N. J. (2010). Effects of dynamic corrective feedback on ESL writing accuracy. TESOL Quarterly, 44(1), 84-109. https://doi.org/10.5054/tq.2010.213781
Hedge, T. (2000). Teaching and learning in the language classroom. Oxford University Press.
Hirvela, A., & Pierson, H. (2000). Portfolios: Vehicles for authentic self-assessment. Learner-directed assessment in ESL, 105-126.
Hyland, K. (2019). Second language writing. Cambridge University Press.
Jaliyya, F. G., & Idrus, F. (2017). EFL students’ attitudes and perception towards English language learning and their English language proficiency: A study from Assa’adah Islamic boarding school, Indonesia. Journal of Education and Learning, 11(3), 219-228. https://doi.org/10.11591/edulearn.v11i1.4621
Kalra, R., Sundrarajun, C., & Komintarachat, H. (2017). Using portfolio as an alternative assessment tool to enhance Thai EFL students’ writing skill. Arab World English Journal, 8(4). https://doi.org/10.24093/awej/vol8no4.20
Lam, R. (2020). Writing portfolio assessment in practice: Individual, institutional, and systemic issues. Pedagogies: An International Journal, 15(3), 169-182. https://doi.org/10.1080/1554480X.2019.1696197
Le, N. T. (2015). The contexts of assessment in EFL classrooms in two high schools in Vietnam. PhD thesis, School of Education, The University of Queensland. https://doi.org/10.14264/uql.2015.891
McKay, S. L. (2006). Researching second language classrooms. Routledge.
McTighe, J., & Ferrara, S. (2021). Assessing student learning by design: Principles and practices for teachers and school leaders. Teachers College Press.
Murphy, S. (1992). What goes into portfolios? Writing portfolios. Pippin Publishing Limited.
Negari, G. M. (2011). A study on strategy instruction and EFL learners’ writing skill. International Journal of English Linguistics, 1(2), 299–307. https://doi.org/10.5539/ijel.v1n2p299
Nezakatgoo, B. (2011). The effects of portfolio assessment on writing of EFL students. English Language Teaching, 4(2), 231-241. https://doi.org/10.5539/elt.v4n2p231
Nguyen, H. H. T. (2009). Teaching EFL writing in Vietnam: Problems and solutions – a discussion from the outlook of applied linguistics. VNU Journal of Foreign Studies, 25(1). https://js.vnu.edu.vn/FS/article/view/2236
Nguyen, T. H. H., & Truong, A. T. (2021). EFL teachers’ perceptions of classroom writing assessment at high schools in central Vietnam. Theory and Practice in Language Studies, 11(10), 1187-1196. https://doi.org/10.17507/tpls.1110.06
Nguyen, T. T. L. (2021). Learning EFL writing in Vietnam: Voices from an upper-secondary school’s students. The Journal of Asia TEFL, 18(4), 1195-1210. https://doi.org/10.18823/asiatefl.2021.18.4.8.1195
Nguyen, V. S., & Habók, A. (2021). Students’ beliefs about teachers’ roles in Vietnamese classrooms. Electronic Journal of Foreign Language Teaching, 18(1), 38-59. http://publicatio.bibl.u-szeged.hu/id/eprint/23095
Obeiah, S. F., & Bataineh, R. F. (2016). The effect of portfolio-based assessment on Jordanian EFL learners’ writing performance. Bellaterra Journal of Teaching & Learning Language & Literature, 9(1), 32-46. https://raco.cat/index.php/Bellaterra/article/view/306905
Pham, V. P. H., & Truong, M. H. (2021). Teaching writing in Vietnam’s secondary and high schools. Education Sciences, 11(10), 632. https://doi.org/10.3390/educsci11100632
Schiro, M. S. (2013). Curriculum theory: Conflicting visions and enduring concerns. Sage.
Schmitt, N., & Rodgers, M. P. H. (Eds.). (2020). An introduction to applied linguistics (3rd ed.). Routledge.
Stewart, R. A., Aegerter, J., Davis, D., & Walseth, B. (1993). Have you read? Portfolios: Agents of change. The Reading Teacher, 46(6), 522-524. https://www.jstor.org/stable/20201120
Sulistyo, T., Eltris, K. P. N., Mafulah, S., Budianto, S., Saiful, S., & Heriyawati, D. F. (2020). Portfolio assessment: Learning outcomes and students’ attitudes. Studies in English Language and Education, 7(1), 141-153. https://doi.org/10.24815/siele.v7i1.15169
Tabatabaei, O., & Assefi, F. (2012). The effect of portfolio assessment technique on writing performance of EFL learners. English Language Teaching, 5(5), 138-147. https://doi.org/10.5539/elt.v5n5p138
Tonogbanua, J. R. (2018). Exploring collaborative e-portfolio project for teaching and learning academic writing. Asian EFL Journal, 20(12), 173-193.
Tran, M. (2021). Student perceptions of plagiarism: A study of Vietnam- and New Zealand-educated postgraduate students. Doctoral dissertation, Te Herenga Waka-Victoria University of Wellington. https://doi.org/10.26686/wgtn.14159408.v2
Tseng, C. C. (2019). Senior high school teachers’ beliefs about EFL writing instruction. Taiwan Journal of TESOL, 16(1), 1-39. https://doi.org/10.30397/TJTESOL.201904_16(1).0001
Wang, Y. H., & Liao, H. C. (2008). The application of learning portfolio assessment for students in the technological and vocational education system. Asian EFL Journal, 10(2), 132-154.
Weigle, S. C. (2002). Assessing writing. Cambridge University Press.
Weigle, S. C. (2007). Teaching writing teachers about assessment. Journal of Second Language Writing, 16(3), 194-209. https://doi.org/10.1016/j.jslw.2007.07.004
Appendix A. Writing Scoring Rubric (Wang & Liao, 2008)
Criteria | Description | Score |
Focus | Specifically addressing the writing task | 10-9 |
Addressing most of the writing task | 8-7 | |
Addressing the writing task adequately but sometimes straying from the task | 6-5 | |
Inadequately addressing the writing task | 4-3 | |
Having problems with focus or failing to address the writing task | 2-1 | |
Content | Using specific appropriate details to support topics or illustrate ideas | 10-9 |
Using appropriate details to support topics or illustrate ideas | 8-7 | |
Using some details to support topics or illustrate ideas | 6-5 | |
Using inappropriate or insufficient details to support topics or illustrate ideas | 4-3 | |
Using few or no details or irrelevant details to support topics or illustrate ideas | 2-1 | |
Organization | Being specifically well-organized and well-developed | 10-9 |
Being generally well-organized and well-developed | 8-7 | |
Mostly well-organized and well-developed | 6-5 | |
Being inappropriately well-organized and well-developed | 4-3 | |
Being seriously disorganized or underdeveloped | 2-1 | |
Spelling & grammar | Spelling and grammar are perfect or nearly perfect | 10-9 |
Spelling and grammar are almost accurate | 8-7 | |
Spelling and grammar are fair with some minor errors | 6-5 | |
Spelling and grammar are inappropriate with obvious errors | 4-3 | |
Spelling and grammar are poor with frequent errors | 2-1 | |
Word usage | Using specific appropriate words | 10-9 |
Using appropriate words | 8-7 | |
Using adequate but sometimes inappropriate words | 6-5 | |
Using noticeably inappropriate words | 4-3 | |
Containing severe writing errors | 2-1 |
Appendix B. Pre- and Post-Writing Tasks
Table 3. The Scores of the Pre-Writing Task
Participant | Focus | Content | Organization | Grammar & spelling | Word usage | Mean |
1 | 5 | 5 | 3 | 3 | 3 | 3.8/10 |
2 | 4 | 4 | 2 | 3 | 3 | 3.2/10 |
3 | 5 | 5 | 4 | 5 | 5 | 4.8/10 |
4 | 3 | 3 | 2 | 3 | 3 | 2.8/10 |
5 | 5 | 5 | 4 | 3 | 5 | 3.4/10 |
6 | 4 | 4 | 4 | 2 | 5 | 3.8/10 |
7 | 7 | 7 | 7 | 6 | 6 | 6.6/10 |
8 | 6 | 7 | 6 | 6 | 6 | 6.2/10 |
9 | 6 | 6 | 4 | 4 | 5 | 5/10 |
10 | 7 | 6 | 5 | 5 | 6 | 5.8/10 |
11 | 7 | 8 | 6 | 7 | 7 | 7.0/10 |
12 | 7 | 7 | 7 | 8 | 7 | 7.2/10 |
13 | 6 | 6 | 6 | 5 | 6 | 5.8/10 |
Mean | 5.5 | 5.6 | 4.6 | 4.6 | 5.2 | Approx. 5.1/10 |
[back to article]
Table 4. The Scores of the Post-Writing Task
Participant | Focus | Content | Organization | Grammar & spelling | Word usage | Mean |
1 | 6 | 7 | 6 | 6 | 7 | 6.4/10 |
2 | 7 | 7 | 6 | 6 | 7 | 6.6/10 |
3 | 7 | 7 | 8 | 7 | 7 | 7.2/10 |
4 | 7 | 7 | 7 | 6 | 7 | 6.8/10 |
5 | 8 | 8 | 8 | 7 | 8 | 7.8/10 |
6 | 6 | 6 | 5 | 6 | 6 | 5.8/10 |
7 | 7 | 7 | 7 | 7 | 7 | 7/10 |
8 | 8 | 7 | 7 | 7 | 7 | 7.2/10 |
9 | 8 | 8 | 7 | 7 | 7 | 7.4/10 |
10 | 8 | 8 | 7 | 8 | 7 | 7.6/10 |
11 | 9 | 9 | 9 | 8 | 8 | 8.6/10 |
12 | 7 | 8 | 8 | 7 | 7 | 7.4/10 |
13 | 6 | 6 | 6 | 6 | 6 | 6/10 |
Mean | 7.2 | 7.3 | 7.0 | 6.8 | 7.0 | 7.06/10 |
Copyright of articles rests with the authors. Please cite TESL-EJ appropriately. Editor’s Note: The HTML version contains no page numbers. Please use the PDF version of this article for citations.