* * * On the Internet * * *
November 2025 — Volume 29, Number 3
https://doi.org/10.55593/ej.29115int2
Rachel Toncelli
Northeastern University, USA
<r.toncelli@northeastern.edu>
Ilka Kostka
Northeastern University, USA
<i.kostka@northeastern.edu>
Abstract
This article explores the integration of generative AI tools (GenAI) into a writing activity that helps second language (L2) writers critically evaluate AI-generated texts and build their confidence in writing. The activity was implemented in a graduate-level English language course at a university in the United States. In class, students wrote summaries of academic articles, prompted GenAI tools for writing feedback, and then analyzed the AI output to identify suggestions they liked and disliked. Next, students reviewed each other’s AI-generated revisions, which revealed a similar style in terms of word choice and sentence structure across summaries. An analysis of student reflections and survey data demonstrates that students valued GenAI tools for improving grammatical accuracy and academic vocabulary, though they disliked how these tools changed their intended meaning and diminished their individual voice. We share lessons learned to inform future iterations of similar in-class writing activities and discuss the importance of embracing opportunities to innovate writing instruction with GenAI, creating supportive environments for explorations of AI, collaborating with students and fellow educators, and staying open to the potential benefits of GenAI integration in writing instruction.
Keywords: GenAI tools, L2 writing, multilingual writing
Encouraging students to feel confident in their individual writing styles is a critical part of teaching writing, yet students’ use of generative AI (GenAI) can complicate traditional instructional practices. For instance, we have noticed that many of our multilingual students lacked confidence in using academic English before GenAI became popular, but today, there seem to be even more ways that using GenAI tools could discourage them from developing a sense of a unique voice. We often hear students say they use Grammarly and ChatGPT to “fix” their writing and help them “sound better,” and many students have told us that GenAI tools write better than they do. We have also observed that advanced writers seem to trust their own voice over AI, while emerging writers trust AI over their own voice. Sandstead and Kibler (2025) state that it is hard to deny how GenAI tools have considerably impacted what individual voice and identity in writing mean, which has implications for how we teach writing. For these reasons, writing instructors will need to develop new ways to teach writing as GenAI continues advancing.
Students’ perspectives and experiences with these tools, as well as conversations with colleagues about teaching writing, inspired us to experiment with activities in class that integrate GenAI into instruction. As Giray et al. (2025) suggest, educators should focus on teaching students about academic writing rather than focusing on AI detection. In alignment with these recommendations, we were interested in looking at how GenAI can be leveraged to support but not replace learning, especially if we “provide students with comprehensive training and guidance on how to use AI ethically, responsibly, and effectively” (Giray et al., 2025, p. 95). We wanted to develop and implement a pedagogical approach that incorporates GenAI into writing and class discussions and enhances students’ confidence in academic writing.
In this paper, we describe an activity we carried out in an English language research and writing class for graduate students. We begin by highlighting key insights from the literature on GenAI use in academic writing. We then describe an example of an in-class activity that integrates GenAI into an existing scaffolded writing assignment. Using student work samples and survey responses, we shed light on students’ perspectives of GenAI tools’ benefits and limitations, as well as their observations of AI-generated texts. Although this activity focuses on the use of AI in academic writing development, we also aimed to challenge students’ assumptions that AI-generated writing is “better” than their writing and that AI-generated output is always accurate. For these reasons, an important element of this activity involves developing students’ critical AI literacy skills, empowering them to analyze AI-generated writing, and building awareness of its affordances for learning how to write in English.
Review of Literature
Recent research shows that second language (L2) writers are using GenAI to support their academic writing processes (Moorhouse et al., 2025). Their study contributes to a growing body of literature that examines how both instructors and multilingual writers use generative AI. Overall, much of this work advocates for integrating AI tools into L2 writing instruction but emphasizes ethical use, critical thinking about AI output, and engagement with GenAI tools that balances human agency with their affordances (e.g., Giray et al., 2025; Zaheer et al., 2025). Empirical research and pedagogically-oriented scholarship have also examined both the opportunities and concerns inherent to GenAI use for writing purposes (e.g., Hwang et al., 2025; Nañola et al., 2025; Su et al., 2023; Yan, 2023). In this section, we offer an overview of emerging themes in current scholarship that center on the use of generative AI in L2 writing instruction.
Affordances of GenAI for Teaching Writing
Some empirical work has illuminated how multilingual writers are using generative AI tools for support during the writing process. These uses include improving grammatical accuracy and sentence variety, identifying vocabulary words with translation tools, and comparing first drafts to ones translated by machines (Godwin-Jones, 2022), as well as using AI translation tools for writing support (Crompton et al., 2024) and linguistic support (Moorhouse et al., 2025). Zaheer et al. (2025) conducted a systematic literature review to explore how university-level English as a Foreign Language (EFL) and English as a Second Language (ESL) students and faculty use GenAI tools to support writing. They found that students are using tools that provide automated writing evaluation (AWE), automated writing feedback (AWF), and paraphrasing, as well as hybrid tools. In another study, Hwang et al. (2025) examined how L2 writers use AI during the writing process and found they use ChatGPT primarily for generating ideas for writing, a finding that aligns with a study conducted by Jiang et al. (2024). Hwang et al. (2025) also found that writers perceived ChatGPT to be useful for writing tasks (e.g., grammar support, creating ideas). Their findings align with those of other studies showing that second language writers seem to have mostly positive attitudes towards using GenAI in writing (e.g., Toncelli et al., 2025; Wang, 2024; Zaheer et al., 2025).
Other work has shown how multilingual writers use GenAI to receive feedback on their writing. For instance, in a study of doctoral student writers, many of whom were multilingual, Khuder (2025) found that students sought feedback on argumentation, research positioning, and revisions; in addition, the use of GenAI tools for personalized feedback helped them refine their understanding of writing norms. Similarly, Abduljawad (2025) conducted a study about the impact and experience of AI feedback on English-language writing at a university in Saudi Arabia. The author found that students appreciated receiving personalized feedback in real time, which led to improvements in language and mechanics in their writing. While this research demonstrates that multilingual writers are actively leveraging GenAI to enhance their writing in myriad ways, their use of these tools also highlights the need to consider its challenges and limitations.
Considerations
The emergence of generative AI tools has introduced unique considerations for teaching writing. As Chapelle (2025) points out, no previous technologies “produced novel, contingent grammatical sentences in connected discourse in real-time conversation with the user” (p. 2). This technological shift complicates the process of developing academic skills and poses challenges for L2 writers in particular, as they are learning both academic writing and academic English (Zaheer et al., 2025). This challenge may be further aggravated for lower proficiency students who may trust AI output more fully than higher proficiency students (Sandstead & Kibler, 2025). Indeed, when sophisticated text-generation tools evolve past basic spelling and syntax support to provide polished translations, revisions, or complete composition of lengthy passages, language instructors may be concerned about authenticity and learning in student work (Godwin-Jones, 2022).
These tools have raised other concerns about teaching and learning. For instance, academic integrity is one of educators’ (Crompton & Burke, 2024; Moorhouse et al., 2023; Toncelli & Kostka, 2024; Ulla et al., 2023) and students’ primary worries (Lund et al., 2025). In response, AI detection tools have been adopted by schools and universities to uphold academic integrity and discourage students from excessive AI use. However, these tools are generally perceived to be problematic (e.g., Coffey, 2024) and biased against multilingual writers (Liang et al., 2023). Additionally, Sandstead and Kibler (2025, p. 6) discuss the ethical implications of using GenAI for teaching writing due to the lack of attribution of sources and the possibility of GenAI replacing students’ own thinking. These concerns have been addressed by scholars who suggest that focusing on teaching students to use AI tools is a better way forward than focusing on punishment and policing (Giray et al., 2025; Liu & Bates, 2025). As Giray et al. (2025) state, “AI tools should not be viewed as policing mechanisms but as educational tools that can enhance the learning and writing process” (p. 94). Thus, a pedagogical shift towards integration of GenAI may provide a useful way forward in writing instruction.
Another issue that AI creates is the potential for overreliance on AI, which can prevent students from thinking critically (Alm & Ohashi, 2024; Hwang et al., 2025; Moorhouse, 2024; Wang, 2024). As Wang (2024) cautions, “Reliance on AI could lead to superficial engagement with writing tasks, discouraging deeper, reflective thought processes essential for original creative expression” (p. 15). Of note, many scholars have pointed out that students share educators’ concerns about the overuse of GenAI reducing their critical thinking (Moorhouse et al., 2025; Toncelli et al., 2025; Wang, 2024; Yan, 2023), undermining their writing voice and identity (Wang, 2024), and leading to a potential loss of individuality and creativity (Abduljawad, 2025). To address these challenges, writing teachers should carefully integrate GenAI tools in ways that preserve the critical thinking and creative benefits of writing. Engaging students in the writing process and helping them value learning can in turn help motivate them to do their own work (Bowen & Watson, 2024). Lastly, instructors must ensure that they know how these tools work so they can integrate them into teaching in ethical and pedagogically-sound ways (Pack & Maloney, 2024).
Activity Rationale
Ouyang and Jiao (2021) identified three approaches for the use of AI to enhance learning: AI-directed learning, in which students follow learning sequences that are determined by the AI; AI-supported learning, in which students and AI are collaborators and the AI tool offers support; and AI-empowered learning, in which students are leaders in “integrating advanced AI techniques and human decision-making” (p. 4). Ouyang and Jiao suggest that AI-empowered writing should be educators’ objective, which is the approach that informed the design of our activity as we wanted to encourage students to make decisions about GenAI feedback. Similarly, Sandstead and Kibler (2025) encourage the use of AI-empowered writing practices in which students solicit AI feedback, revisions, and edits on their writing, and then evaluate these suggestions to recognize that AI-generated alternatives may not improve their original work. Indeed, one of the goals in this project was to guide students through using GenAI to support but not replace their writing and thinking about their topics. We also wanted to develop an activity that added a GenAI component to an existing assignment to boost students’ critical AI literacy skills (Bowen & Watson, 2024) and allow them to explore “responsible writing practices” in which AI use is transparent and ethical (Giray et al., 2025, p. 106). In this way, we can build upon assignments we have given students and help them see why AI literacy skills matter (Bowen & Watson, 2024).
Teaching Setting
This activity was designed by both authors and carried out by one author in a graduate-level research and writing course in an intensive English language program at a U.S. university. In this program, students strengthen their academic English and intercultural communication skills while simultaneously enrolled in master’s-level courses in their degree programs, such as business administration, marketing, data analytics, and law. This research and writing course included ten international students from China, Taiwan, Thailand, Vietnam, and Azerbaijan who had either intermediate or advanced English language proficiency. We had access to reliable high-speed internet, a classroom computer and projector, and laptops for each student. In areas with low Wi-Fi bandwidth or limited access to technology, this activity could be adjusted by having students work in small groups to share devices, using instructor demonstrations on a single device, or pre-downloading GenAI outputs for offline critical analysis. Free versions of AI tools can also be used, so students do not need to pay for access.
To reach an agreement about the role of GenAI in the course, the instructor and students created guidelines for appropriate AI use at the beginning of the semester. Co-creating guidelines with students can clarify expectations for appropriate and inappropriate use and foster a pedagogical rather than a punitive approach to GenAI integration (Bowen & Watson, 2024; Kostka et al., 2025). For example, the guidelines that were negotiated between the instructor and students in this course allowed students to use GenAI tools to improve grammatical accuracy and sentence structure (e.g., clarity checks on final drafts) but did not permit students to produce fully AI-composed writing. The class discussed appropriate AI use in other in-class activities so students would learn when AI use would either help or hinder their learning. The combination of a negotiated framework for AI use and a variety of activities throughout the course provided a foundation for students to feel comfortable reflecting on AI tools.
Our pedagogical approach targets three key objectives: students will 1) summarize an academic article; 2) compare and contrast AI-generated writing with their own; and 3) critically evaluate AI output. As scholars have noted, critically analyzing AI-generated writing in class can foster AI literacy skills and help students draw boundaries between AI as a supportive tool and its overuse (e.g., Bozkurt et al., 2024). Here, students were encouraged to both agree and disagree with AI-generated versions of their own writing to raise their awareness that AI output is not necessarily better than their writing. We also aimed to create a classroom culture where we explore how GenAI might enhance but not replace writers’ engagement with ideas. We received approval from the university’s institutional review board to carry out this activity, and students provided written consent to share their perspectives and work.
Description of the Activity
Throughout the semester, students engaged in a multi-stage project which culminated in a research paper and presentation on a topic they chose within their field of study. The project aimed to develop students’ ability to take a research-informed position on a current issue in their professional field; this included identifying, summarizing, and eventually synthesizing credible sources. As part of a larger effort to support students in their writing development, the teaching practice described here involved students writing a summary about an academic article that they had read and annotated as a homework assignment. The activity aimed to engage students in exploring how AI could be a supportive writing tool while also strengthening their ability to evaluate AI-generated content, particularly to ensure that AI suggestions aligned with their intended meaning and style. Here, we are reminded that developing process-based assignments that invite students to explain their work can encourage critical thinking and offer a valuable opportunity for learning in an AI age (Bowen & Watson, 2024).
Before the start of this class, the instructor created a shared Google Doc that all students could access; the Doc included a table with six sections for each student to complete during the in-class activity. As outlined in Figure 1, students first completed an in-class summary, prompted GenAI for corrections in grammar and mechanics, and asked for writing tips. Next, students analyzed the GenAI output to reflect on feedback they liked and disliked and then explained their rationale (Jiang et al., 2024). The final part of the activity involved students reading each other’s GenAI-revised summaries in preparation for discussion and written reflection on their observations.

Figure 1. Overview of the In-Class Summary Activity
Table 1 shows how the Google Doc was formatted according to six sequential steps. Before class, each student was assigned to read and annotate one article for their research projects. Once in class, students completed a 20-minute writing assignment in Part 1 of the Google Doc. When the summaries were completed, the instructor entered them into Copilot for suggested improvements. The instructor also reminded students that they should always be cautious about what information they input into AI. Because the instructor had access to an enterprise Copilot account, students opted to use her account so their work would not be collected for AI training. The AI-generated output was then copied into Part 2 of the shared Google Doc. Students read and evaluated the AI-generated corrections and writing tips for their own summaries and made general observations in Part 3. Next, students completed Parts 4 and 5 in which they highlighted the AI-generated text they liked in italics and text they disliked in bold, respectively. Students were asked to explain their rationale for any highlighted text, and they were reminded that they could like or dislike AI-generated text for any reason. As a class, students skimmed all of the AI-generated summaries in the Google Doc, shared their observations in a whole-class discussion, and then reflected on them in Part 6.
Table 1. Format of the shared Google Doc for in-class writing and analysis
| Your name: Type your name here | |
| Part 1. Write a four-to-five sentence summary of the article you will be presenting next week. | |
| Part 2. Upload your summary to ChatGPT or Copilot and then ask for advice and/or corrections to improve your summary. | |
| Part 3. What do you notice about the differences in language? | |
| Part 4. Go back to number 2 and highlight the words/phrases that you like in italics. Explain your thinking here. | |
| Part 5. Go back to number 2 and highlight the words/phrases that you dislike in bold. Explain your thinking. | |
| Part 6. Present your responses to the class. Afterwards, reflect on what you notice after looking at many AI responses. | |
We took several measures to create a supportive learning environment in which the class could openly discuss GenAI use. First, we assured students that all ideas about GenAI were welcome. Furthermore, we tried to frame the activity as an exploration of writing, which allowed us to identify the strengths and limitations of AI tools. To make this exploration clear to students, we emphasized that they could disagree with language in the AI-generated output for any reason. In this way, we reduced prescriptive notions that grammatical accuracy was the main determinant in academic writing and gave students agency over what language best conveyed their meaning. Lastly, we offered students multiple opportunities to engage in the activity through reading, writing, talking with each other, and sharing ideas with the whole class, which allowed them to participate in the activity in ways that fit their learning preferences.
As with any activity that involves GenAI, we also took special care to protect students’ intellectual property. Since we began experimenting with AI in class, we have purposefully used tools that protect student privacy (e.g., Microsoft Copilot). At the time of this activity, the university did not provide institutional access to any AI tools that do not store students’ data, and only the instructor had access to a protected account for Copilot. For this reason, the instructor offered use of her account during the activity. After the discussion, all students opted to continue using the instructor’s account. While this workaround helped us, obtaining institutional access to accounts that protect student data would allow students to use GenAI more efficiently and safely. Nevertheless, instructors who do not have access to protected accounts will need to explore other ways to use GenAI safely and responsibly with their students, such as doing a teacher-led demonstration with prompts generated by students. Lastly, we had been acquainted with our student population for several weeks at the time this activity was conducted and observed that they did not seem resistant to GenAI use. Nonetheless, we aimed to minimize coercion to participate in this activity by collecting informed consent and explicitly stating that this ungraded voluntary activity would not affect course performance. While this activity was part of a regular class session, students were able to opt out of sharing their written work for our analysis of this project.
Table 2 shows an example of a completed Google Doc of a student in the class (Henry, pseudonym). In Part 1, the student drafted an in-class summary of a reading about marketing strategy considerations for Generation Z consumers. In this first draft, we can see the student’s effort to use logical transitions (e.g., however, additionally), as well as in-text APA citations. These transitions had been areas of focus in the class, so it was encouraging to see the student practicing these writing skills. In Part 2, we can see suggestions from Copilot to check the in-text citations, include more specific details, and organize the flow of ideas from more general information to more specific information. Of note, in Part 3, the student appreciated the “more natural” and more direct writing style of the summary revisions from Copilot. In Part 4, he identified phrases from the GenAI output that felt more “professional” to him. Importantly, in Part 5, the student noted that the AI-generated output did not quite capture his feelings. Taken together, Parts 4 and 5 show the student reflecting on language suggestions from Copilot that seem to be improvements to him while also staying true to his initial intentions when writing. Finally, in Part 6, after reviewing his classmates’ AI-generated summary revisions, the student recognized a common AI voice across summaries, which he reported sounded alike despite being originally drafted by different students on different topics. In recognizing that use of GenAI to revise writing runs the risk of losing some of each person’s uniqueness, the student appears to show a growing understanding of the importance of writing as a reflection of an individual writer’s voice.
Table 2. Sample student work showing GenAI integration into writing
| Your name: Henry (pseudonym) | |
| Part 1. Write a four-to-five sentence summary of the article you will be presenting next week. | Nadanyiova & Sujanska (2023) researched about the impact of influencer marketing on the decision, focusing on generationZ, which is a big target using the internet and the generation that have never known the time without the internet. However, it can be seen that how people use the internet has changed over a decade (Nadanyiova & Sujanska, 2023). Nadanyiova & Sujanska (2023) also mentioned interesting information that customers these days tend to believe influencers more than well-known celebrities. This is because the customers can access the influencer’s private life and they have similar things that they can relate to influencers. Additionally, Nadanyiova & Sujanska (2023) provided a specific platform of social media, specific numbers, making it very clear and useful for markers and those who are interested in this information. |
| Part 2. Upload your summary to ChatGPT or Copilot and then ask for advice and/or corrections to improve your summary. | Nadanyiova & Sujanska (2023) researched the impact of influencer marketing on decision-making, focusing on Generation Z, a demographic that has grown up with the internet. Their study highlights how internet usage has evolved over the past decade. They found that today’s consumers tend to trust influencers more than well-known celebrities because they can relate to influencers’ private lives and shared experiences. Additionally, Nadanyiova & Sujanska (2023) provided specific social media platforms and detailed statistics, making their findings particularly clear and useful for marketers and those interested in this information. Writing Tips: Citations: Ensure proper citation of sources to maintain credibility. Details: 1. Include specific details about the social media platforms and statistics mentioned. 2. Flow: Maintain a logical flow of ideas, moving from general impact of influencer marketing to specific findings and implications. |
| Part 3. Please answer the following question: What do you notice about the differences in language? | The language seems more natural. It combined some sentences that I was trying to explain together, so it might be good in terms of being direct to the point. |
| Part 4. Go back to number 2 and note the AI-generated words/phrases that you like in italics. Explain your thinking here. | today’s consumers tend to trust influencers more than well-known celebrities : I like the overall sentence compared with my old one (These days =Today’s; believe = trust). particularly clear > sound a professional word. |
| Part 5. Go back to number 2 and note the AI-generated words/phrases that you dislike in bold. Explain your thinking here. | a demographic that has grown up with the internet. I don’t like it because I feel my sentences can explain what I want to say more than this sentence, so I don’t feel it from my writing and my feelings. |
| Part 6. Present your responses to the class. Afterwards, reflect on what you notice after looking at many AI-generated responses. | Everyone sounds the same. We lose some words that can identify ourselves and our uniqueness. |
Project Reflections
Overall, this activity aligns with our experiences teaching writing, as we have observed that students generally tend to be more focused on accuracy and formal academic language than global writing issues such as content and organization. When reflecting on students’ written work, their survey responses, and our impressions after implementing the activity, we believe we met the objectives we set out to achieve. First, while we did not directly measure students’ confidence in their writing style, we observed that students valued their individual expression and intended meaning. As one student noted, “Although AI can improve student [communication] by changing your words or structure, students need to be careful it also changes the meaning.” We also believe the activity was an effective means of developing students’ ability to critically evaluate AI output as they made judgments about AI-generated language, reinforcing the notion that students must question and verify AI output. Lastly, we wanted to take a pedagogical rather than a punitive approach to using GenAI and focus more on learning than on preventing cheating. By including GenAI as a tool within the writing process, we opened the door to conversations about its strengths and limitations for learning and permitted students to disclose their use. We describe other reflections on the project below.
To gather students’ perspectives, we gave them an anonymous survey that included multiple-choice and open response questions about both the activity and the benefits and challenges of GenAI use. Students reported that the main benefit was that GenAI helped them improve their vocabulary and grammar, and they liked how they could use these tools as a “grammar checker.” In Table 2, Henry wrote that he liked “the overall sentence compared with [his] old one (i.e., These days = Today’s; believe = trust).” Other students’ comments were about the formality of language; one student wrote that an affordance of AI was “using academic words in order to make my writings formal.” Another student commented that “with this activity, I can see how our voice is unique though there are some grammar mistakes or the flow doesn’t very smooth, but that all better than the same structures from AI.” These comments suggest that students began to understand how AI could enhance their writing skills without replacing the unique perspectives and voice that characterize their individual writing style.
As for limitations of GenAI, we were pleased to find that students were concerned that AI could change their original meaning. As one student said, “I have to check because AI cannot be trusted 100%.” One student wanted to still retain their voice and said “sometimes AI may change the meaning of my writing, giving me too complex words.” In the Google Doc, we were also pleased to see additional comments about preserving voice. As one student noted (Table 2), “I don’t like it because I feel my sentences can explain what I want to say more than this sentence, so I don’t feel it from my writing and my feelings.” In a study of doctoral students learning to use GenAI to support their academic writing, Ou et al. (2024) found that students “recognised the importance of human oversight on GAI outputs and retaining ownership of their texts” (p. 12). Ultimately, we felt that students gained a more nuanced understanding of benefits and risks of using GenAI in their writing processes because their survey responses articulated the tension between use and overuse.
In response to the question, “How effective was today’s activity for helping you think about the difference between ‘AI voice’ and your individual voice in writing?” 100% of students said it was either “effective” or “very effective.” Their written answers shed light on why they may have responded in this way. First, many students wrote that they noticed how similar AI-generated output was among students’ summaries. One student stated that “AI voice gave similar advice to everyone. For example, it always has some words and phrases that AI gives.” Because students shared their AI-generated summaries and read several examples that their peers generated, they were able to quickly spot similarities in language across multiple texts. Another student wrote that “the sentence structures of AI voice look so similar. With this activity, I can see how our voice is unique though there are some grammar mistakes or the flow doesn’t very smooth, but that all better than the same structures from AI.” This echoes findings by Abduljawad (2025), whose study of 130 university-level ESL students in Saudi Arabia found that students using AI tools to support their writing development were concerned that such use could lead to a “reduction in individuality and creativity” (p. 13). First and second language writers in another study recognized that without critical engagement with AI output, they risked homogenizing their writing and diminishing their individuality (Wang, 2024). We believe a tremendous benefit of this activity is that it helped students look closely at language use, vocabulary, and writing style, which built both their critical thinking skills and awareness of how AI generates output. They also seemed to understand that AI-generated writing may be formal but not necessarily sound better than their voice.
Interestingly, when asked whether this activity enhanced or changed their thinking about AI use and academic writing, 100% of students said it did. Based on in-class discussion, we believe this strong shift can be attributed to two factors. First, students had “permission” to dislike aspects of AI-generated language, which they may have previously assumed was perfect. Second, they had time to review several AI-generated summaries across a range of topics, which revealed a common voice and writing style. In their written comments, they gave additional insights about how their thinking changed. One student noted, “I learnt more about AI, that is the benefits and disadvantages. I also learnt that relying on AI is not helpful because it will not help me improve my writing skills.” A similar comment echoed the idea of avoiding overuse; the student noted, “After I joined this activity, I think that it can be better to use AI to check your grammar but you have to recheck it again.” These comments align with Wang’s (2024) study of first-year writing students who identified both benefits and limitations to using ChatGPT in writing and noted that relying too much on AI tools may lead to learning loss. Another student wrote that “I know when do I need to use AI or not,” which shows the student’s agency in making decisions about AI use. We hope that this activity planted seeds about the importance of critically analyzing output, using human judgment, and deciding whether AI is a helpful tool to enhance learning and writing. Table 3 summarizes the common themes we found in students’ reflections.
Table 3. Summary of students’ survey responses
| Common Themes | Student Perspectives |
| --- | --- |
| Usefulness of GenAI tools | ● “Learning, words, phrases, transitions, and feedbacks.” ● “It provides me better phrases, words, and corrects the grammar of sentences.” ● “Using academic words in order to make my writings formal.” |
| Concerns about changes to intended meaning | ● “I have to check because AI cannot be trusted 100%.” ● “Sometimes AI may change the meaning of my writing, giving me too complex words.” ● “AI can change the meaning of terms as compared to what was initially placed.” |
| Similarities in AI-generated language across summaries | ● “AI voice gave similar advice to everyone. For example, it always has some words and phrases that AI gives.” ● “The sentence structures of AI voice look so similar. With this activity, I can see how our voice is unique though there are some grammar mistakes or the flow doesn’t very smooth, but that all better than the same structures from AI.” ● “[AI voice] is very professional but it seems like create by the same person. Sometimes, it’s too formal to us, not like a human thinking.” |
| Necessity for critical use of GenAI tools | ● “I learnt more about AI, that is the benefits and disadvantages. I also learnt that relying on AI is not helpful because it will not help me improve my writing skills.” ● “After I joined this activity, I think that it can be a better to use AI to check your grammar but you have to recheck it again.” ● “Sometimes we still need some informations or knowledge from AI. However, it just can be a tool to help us not replace us.” |
| Value of activity for developing confidence in individual voice and AI literacy | ● “I think we can see it clearly how different between using AI and my own voice.” ● “I know when do I need to use AI or not” ● “I think my voice in writing is unique though there might be some mistakes, it still look like what I am thinking. I need to improve my English skills of writing while I still can not lose my own voice and thoughts.” |
Lessons Learned and Future Iterations
Overall, we believe this activity fostered collaboration among students and helped build a community of learners that included both the instructors and students. Students were invited to share their views about GenAI’s benefits and limitations, which helped minimize a top-down dynamic between instructors and students and allowed for more equal collaboration within the course. As educators, we were able to co-develop an activity and try it out in class to explore what teaching writing with GenAI looks like. We echo Zaheer et al.’s (2025) advocacy for “guided engagement” (p. 19) in which students are taught to think critically about AI tools and learn how to balance their affordances and limitations. The activity also allowed us to guide students in critically analyzing AI-generated output, which is a skill students will need in the future (United Nations Educational, Scientific, and Cultural Organization, 2024). Students appeared to be highly motivated to compare their writing to AI-generated versions, and recognizing the linguistic similarities across all of the AI-generated summaries required them to engage in a close reading of all texts. As we reflect on this activity, we identify four lessons learned that may inform future iterations of this project.
1. Embrace innovation and student perspectives
We can think about innovation as “fresh ways of meeting outstanding challenges in a spirit of openness to disciplined experimentation” (Organisation for Economic Co-operation and Development, 2017, p. 17). One way to embrace innovation is to reframe challenges as learning opportunities. Indeed, many in-class activities, such as the one we describe in this article, have come from challenges and problems that we have faced in class. For instance, when we began questioning why some students occasionally submit work that appears to rely too heavily on AI despite ongoing discussions about responsible AI use, we realized we needed to better understand students’ actual experiences with these tools. The activity we describe here was inspired by our curiosity about how to integrate GenAI into teaching writing. This curiosity encouraged us to ask questions, to better understand students’ approaches to writing with GenAI tools, and to offer opportunities for them to explain their work. In this way, we approach challenges with a spirit of inquiry rather than a punitive mindset, and we are in a better position to learn and develop new pedagogical strategies. There are many unanswered questions related to GenAI, yet a good way forward is to remain responsive to students’ evolving needs and adapt accordingly.
In this activity, we also saw tremendous value in soliciting students’ feedback. As Chan and Hu (2023) note, “Understanding students on their willingness and concerns regarding the use of GenAI tools can help educators to better integrate these technologies into the learning process, ensuring they complement and enhance traditional teaching methods” (p. 14). Collecting their perspectives on new teaching approaches can also familiarize instructors with students’ needs and identify recurring issues to address (e.g., questions about how to disclose AI use, technical difficulties). Importantly, collecting student feedback does not have to be time-consuming; instructors may use weekly exit tickets, include reflections as part of course assignments, and observe in-class discussions. These techniques can be seamlessly woven into courses. Whatever approach instructors prefer, involving students in this practice is key to understanding how they are thinking about and using GenAI tools.
2. Create a community among students and educators
As Chiu et al. (2023) suggest, “teachers can develop warm, caring, and positive learning environments by providing personal praises and comments to students, eliciting and valuing their feedback, and facilitating collaborative activities” (p. 3). We were reminded of how crucial it is to create a supportive environment to carry out activities related to AI in particular, as this topic may be sensitive or anxiety-inducing. Because many students are often afraid of being wrongly accused of academic misconduct (Luo, 2024), building a community of learners was necessary before AI could be integrated into student work. Within this community, establishing “reciprocal trust” (Giray et al., 2025) between the students and the instructor provided a foundation for learning. It was also important to create a supportive class environment because instructors in this language program have different policies about AI use, ranging from full bans to regular integration of AI for learning support. We needed to be mindful of the messages that students received in other courses about GenAI use, as these messages could have impacted their engagement in the activity described here. As such, we believe that regular use and discussion of GenAI tools in class is a better way to build trusting relationships with students than focusing on detection and punitive measures (Luo, 2024).
3. Explore GenAI with students and fellow educators
Educators play a vital role in guiding students’ use of GenAI and preparing them to study, work, and live in an AI-enhanced world (Moorhouse, 2024; U.S. Department of Education Office of Educational Technology, 2023). While educators should develop their own AI literacy skills (Kostka & Toncelli, 2023), we have found that much learning about GenAI can occur through in-class experimentation and dialogue with students (Kostka & Toncelli, 2025). In this activity, educator expertise was not necessarily required, as the class was able to explore the strengths and limitations of GenAI together. The class was framed as a community in which all experiences with AI were valued, and the instructor was not the sole keeper of knowledge. In this way, we can view “students as partners” (Liu & Bates, 2025, p. 6) and better understand their sense of GenAI’s affordances and ethical concerns (Chan & Hu, 2023). This kind of collaboration also positions instructors to be responsive to students’ learning and AI literacy in their education and beyond.
Collaboration with peers is another valuable means of learning about GenAI integration into teaching. Developing this activity, and several others that we have implemented in class, involved co-planning and a shared iterative process of analysis and reflection. As we found in a previous study (Toncelli & Kostka, 2024), working closely with fellow educators in open exchange can greatly enhance supportive professional learning. As GenAI continues changing, collaboration of this nature takes on increasing importance, as does the value of sharing new pedagogical practices through scholarship, workshops, presentations, and discussions. Furthermore, engaging with fellow educators can provide a form of ground-up, low-cost, and accessible professional learning. Instructors can work together to co-plan lessons and activities, conduct classroom experiments, visit each other’s classrooms to observe teaching, and reflect on what is working well in their classrooms, none of which requires much time away from teaching or funding to attend professional events. This project allowed us to not only co-design the activity but also to look at student work together and reflect on the effectiveness of the activity and the broader implications of GenAI integration into writing instruction.
4. Remain open to integrating AI into the writing process when appropriate
As Wang (2024) notes, “when used properly, AI has the potential to introduce a new avenue for humanizing writing and education” (p. 16). We began this project questioning whether GenAI had any productive place within the writing process, and now we believe there is great potential for GenAI in writing instruction. From the survey data, we learned that students will likely continue to lean on AI tools for writing support throughout their academic studies. Eighty percent of students stated that they “prefer writing with AI support” rather than writing on their own with no AI help. This activity confirms for us that GenAI tools are becoming an integral part of students’ academic writing process (Moorhouse et al., 2025). It also aligns with scholarship showing that students view AI as a tool that can offer linguistic support during the writing process (e.g., Jiang et al., 2024). As such, we plan to continue building even more low-stakes drafting opportunities into writing activities that include critically analyzing AI-generated output. We will also continue thinking about ways to de-emphasize grammar and focus more on ideas and organization, which may help students resist the urge to outsource their writing and thinking to GenAI.
In-class discussions with students during the activity also revealed that the GenAI support helped them feel confident about sounding more professional and fluent, a finding which aligns with Zaheer et al. (2025), who noted AI writing support impacted students’ “affective experiences, particularly by boosting confidence and reducing writing anxiety” (p. 15). Similarly, in a study by Kim et al. (2024), university-level students stated that the use of AI tools enhanced the entire writing process, from ideation to final revisions, while also improving both the written product and their experience of writing. However, as Hwang et al. (2025) caution, students must “recognize that generative AI, despite its benefits, can be flawed and unreliable” (p. 12). Because GenAI tools can both empower and mislead students, they will need continued practice in critically analyzing AI-generated writing (Bozkurt et al., 2024) and in exploring the line between appropriate and inappropriate GenAI use in future academic writing tasks. A major implication for writing instructors is to explore productive uses of GenAI and support students as they learn to work with these tools rather than against them.
Conclusion
As Chapelle (2025) notes, “The question is how to help students use GenAI tools without losing their opportunity to learn how to write” (p. 4). This project motivated us to think about how we will continue to address GenAI in writing instruction and build on the solid practices we have always used to teach writing, such as scaffolding assignments and peer review. We are also reminded that good teaching, rather than cutting-edge technology, should remain at the core of our work. We echo Bowen and Watson’s (2024) assertion that “good pedagogy should always be our first consideration. Combining high standards with high care, building trust and community, focusing on equity and inclusion, increasing motivation, and creating better, clearer, and more relevant assignments can both increase learning and reduce cheating” (p. 129). As we move forward, we will all need to continue reflecting on how GenAI integration is influenced by other elements of teaching, such as students’ proficiency levels, their motivation to learn, access to AI tools, and assignment and rubric design. There is much work to do, but collaborating with students and other educators will position us to meet this current moment with fresh ideas for innovating teaching together.
Acknowledgements
We would like to thank Dr. Cristine McMartin-Miller for her thought leadership in AI and second language writing and Dr. Michelle Kassorla for her insightful support and valuable feedback on this article.
About the Authors
Rachel Toncelli, EdD is Associate Director of the Center for Advancing Teaching and Learning Through Research at Northeastern University in Boston, Massachusetts, United States. She has taught a range of English language courses and worked as an ELL program director and TESOL teacher educator. ORCID ID: 0000-0001-5977-990X
Ilka Kostka, PhD is Teaching Professor and Academic Director of the NU Immerse and Global Pathways Programs in the College of Professional Studies at Northeastern University in Boston, Massachusetts, United States. She teaches English language courses to international students and oversees academic and faculty affairs. With Rachel Toncelli, she is the co-recipient of the 2024 Ron Chang Lee Award for Excellence in Classroom Technology, given by TESOL International Association. ORCID ID: 0000-0001-8920-3178
To Cite this Article
Toncelli, R. & Kostka, I. (2025). “Our voice is unique:” Integrating generative AI into multilingual writing instruction. Teaching English as a Second Language Electronic Journal (TESL-EJ), 29(3). https://doi.org/10.55593/ej.29115int
References
Abduljawad, S. (2025). Investigating the impact of ChatGPT as an AI tool on ESL writing. International Journal of Computer-Assisted Language Learning and Teaching, 14(1), 1-19. https://doi.org/10.4018/ijcallt.367276
Alm, A. & Ohashi, L. (2024). A worldwide study on language educators’ initial response to ChatGPT. Technology in Language Teaching & Learning, 6(1), 1-23. https://doi.org/10.29140/tltl.v6n1.1141
Bowen, J.A., & Watson, C.E. (2024). Teaching with AI: A practical guide to a new era of human learning. Johns Hopkins University Press.
Bozkurt, A., Xiao, J., Farrow, R., Bai, J. Y. H., Nerantzi, C., Moore, S., Dron, J., Stracke, C. M., Singh, L., Crompton, H., Koutropoulos, A., Terentev, E., Pazurek, A., Nichols, M., Sidorkin, A. M., Costello, E., Watson, S., Mulligan, D., Honeychurch, … & Asino, T. I. (2024). The manifesto for teaching and learning in a time of generative AI: A critical collective stance to better navigate the future. Open Praxis, 16(4), 487–513. https://doi.org/10.55982/openpraxis.16.4.777
Chan, C. K. Y., & Hu, W. (2023). Students’ voices on generative AI: Perceptions, benefits, and challenges in higher education. International Journal of Educational Technology in Higher Education, 20(1), Article 43. https://doi.org/10.1186/s41239-023-00411-8
Chapelle, C. A. (2025). Generative AI as game changer: Implications for language education. System (Linköping), 132, Article 103672. https://doi.org/10.1016/j.system.2025.103672
Chiu, T. K. F., Moorhouse, B. L., Chai, C. S., & Ismailov, M. (2023). Teacher support and student motivation to learn with Artificial Intelligence (AI) based chatbot. Interactive Learning Environments, 2(7), 3240-3256. https://doi.org/10.1080/10494820.2023.2172044
Coffey, L. (2024, February 9). Professors proceed with caution using AI-detection Tools. Inside Higher Ed. https://www.insidehighered.com/news/tech-innovation/artificial-intelligence/2024/02/09/professors-proceed-caution-using-ai#
Crompton, H., & Burke, D. (2024). The educational affordances and challenges of ChatGPT: State of the field. Tech Trends, 68, 380–392. https://doi.org/10.1007/s11528-024-00939-0
Crompton, H., Edmett, A., Ichaporia, N., & Burke, D. (2024). AI and English language teaching: Affordances and challenges. British Journal of Educational Technology, 00, 1-27. https://doi.org/10.1111/bjet.13460
Godwin-Jones, R. (2022). Partnering with AI: Intelligent writing assistance and instructed language learning. Language Learning & Technology, 26(2), 5-24. http://doi.org/10125/73474
Giray, L., Sevnarayan, K., & Ranjbaran Madiseh, F. (2025). Beyond policing: AI writing detection tools, trust, academic integrity, and their implications for college writing. Internet Reference Services Quarterly, 29(1), 83–116. https://doi.org/10.1080/10875301.2024.2437174
Hwang, H., Chang, X., & Sun, J. (2025). Generative AI is useful for second language writing, but when, why, and for how long do learners use it? Journal of Second Language Writing, 69, 1-14. https://doi.org/10.1016/j.jslw.2025.101230
Jiang, J., Vetter, M.A., & Lucia, B. (2024). Toward a ‘more-than-digital’ AI literacy: Reimagining agency and authorship in the postdigital era with ChatGPT. Postdigital Science and Education, 6, 922-939. https://doi.org/10.1007/s42438-024-00477-1
Khuder, B. (2025). Enhancing disciplinary voice through feedback-seeking in AI-assisted doctoral writing for publication. Applied Linguistics, 1-16. https://doi.org/10.1093/applin/amaf022
Kim, J., Yu, S., Detrick, R., & Li, N. (2024). Exploring students’ perspectives on generative AI-assisted academic writing. Education and Information Technologies, 30(1), 1265-1300. https://doi.org/10.1007/s10639-024-12878-7
Kostka, I. & Toncelli, R. (2023). Exploring applications of ChatGPT to English language teaching: Opportunities, challenges, and recommendations. The Electronic Journal for English as a Second Language, 27(3), 1-19. https://doi.org/10.55593/ej.27107int
Kostka, I. & Toncelli, R. (2025, May). Maintaining student-teacher relationships in the age of AI. EduVerse Newsletter. 12-16. https://www.proed.com.vn/_files/ugd/4f4dcf_f81ec3a15b4440b990915eb33961004d.pdf
Kostka, I., Toncelli, R., & Fairfield, C. (2025, March 3). Red means stop and green means go: Creating AI guidelines with students. The FLT Magazine. https://www.doi.org/10.69732/WNWY9538
Liang, W., Yuksekgonul, M., Mao, Y., Wu, E., & Zou, J. (2023). GPT detectors are biased against non-native English writers. Patterns, 4(7), Article 100779. https://doi.org/10.1016/j.patter.2023.100779
Liu, D.Y.T., & Bates, S. (2025). Generative AI in higher education: Current practices and ways forward. [White paper]. ‘Generative AI in education: Opportunities, challenges and future directions in Asia and the Pacific’ project. https://www.apru.org/resources_report/whitepaper-generative-ai-in-higher-education-current-practices-and-ways-forward/
Lund, B.D., Lee, T.H., Mannuru, N.R., & Arutla, N. (2025). AI and academic integrity: Exploring student perceptions and implications for higher education. Journal of Academic Ethics, 23, 1545-1565. https://doi.org/10.1007/s10805-025-09613-3
Luo, J. (2024). How does GenAI affect trust in teacher-student relationships? Insights from students’ assessment experiences. Teaching in Higher Education, 1–16. https://doi.org/10.1080/13562517.2024.2341005
Moorhouse, B.L., Wan, Y., Wu, C., Wu, M., & Ho, T.Y. (2025). Generative AI tools and empowerment in L2 writing. System, 133, Article 103779, 1-13. https://doi.org/10.1016/j.system.2025.103779.
Moorhouse, B.L., Yeo, M.A., & Wan, Y. (2023). Generative AI tools and assessment: Guidelines of the world’s top-ranking universities. Computers and Education Open, 5, 1-10. https://doi.org/10.1016/j.caeo.2023.100151
Moorhouse, B. L. (2024). Beginning and first-year language teachers’ readiness for the generative AI age. Computers and Education. Artificial Intelligence, 6, Article 100201, 1-8. https://doi.org/10.1016/j.caeai.2024.100201
Nañola, E. L., Arroyo, R. L., Hermosura, N. J. T., Ragil, M., Sabanal, J. N. U., & Mendoza, H. B. (2025). Recognizing the artificial: A comparative voice analysis of AI-Generated and L2 undergraduate student-authored academic essays. System (Linköping), 130, Article 103611, 1-12. https://doi.org/10.1016/j.system.2025.103611
Organisation for Economic Co-operation and Development. (2017). The OECD handbook for innovative learning environments. Paris.
Ou, A. W., Khuder, B., Franzetti, S., & Negretti, R. (2024). Conceptualising and cultivating Critical GAI Literacy in doctoral academic writing. Journal of Second Language Writing, 66, Article 101156. https://doi.org/10.1016/j.jslw.2024.101156
Ouyang, F., & Jiao, P. (2021). Artificial intelligence in education: The three paradigms. Computers and Education: Artificial Intelligence, 2, Article 100020. https://doi.org/10.1016/j.caeai.2021.100020
Pack., A., & Maloney, J. (2024). Using artificial intelligence in TESOL: Some ethical and pedagogical considerations. TESOL Quarterly, 58(2), 1007-1018. https://doi.org/10.1002/tesq.3320
Sandstead, M., & Kibler, A. (2025). Voice in L2 writing in the age of AI. Journal of Second Language Writing, 69, Article 101212. https://doi.org/10.1016/j.jslw.2025.101212
Su, Y., Lin, Y., & Lai, C. (2023). Collaborating with ChatGPT in argumentative writing classrooms. Assessing Writing, 57, Article 100752. https://doi.org/10.1016/j.asw.2023.100752
Toncelli, R., & Kostka, I. (2024). A love-hate relationship: Exploring faculty attitudes towards GenAI and its integration into teaching. International Journal of TESOL Studies, 6(3), 77-94. https://doi.org/10.58304/ijts.20240306
Toncelli, R., Kostka, I., McMartin-Miller, C., Zhou, L., & Szelenyi, B. (2025). Charting new waters: Exploring the role of ChatGPT in post-secondary pathways programs. In C. Wang & Z. Tian (Eds.), Rethinking language education in the age of generative AI (pp. 153-168). Routledge.
Ulla, M. B., Perales, W. F., & Busbus, S. O. (2023). “To generate or stop generating response”: Exploring EFL teachers’ perspectives on ChatGPT in English language teaching in Thailand. Learning Research and Practice, 9(2), 168–182. https://doi.org/10.1080/23735082.2023.2257252
United Nations Educational, Scientific, and Cultural Organization. (2024). AI competency framework for students. https://doi.org/10.54675/JKJB9835
U.S. Department of Education, Office of Educational Technology (2023). Artificial intelligence and future of teaching and learning: Insights and recommendations. https://tech.ed.gov/ai-future-of-teaching-and-learning/
Wang, C. (2024). Exploring students’ generative AI-assisted writing processes: Perceptions and experiences from native and nonnative English speakers. Technology, Knowledge and Learning, 30, 1825-1846, https://doi.org/10.1007/s10758-024-09744-3
Yan, D. (2023). Impact of ChatGPT on learners in a L2 writing practicum: An exploratory investigation. Education and Information Technologies, 28, 13943–13967. https://doi.org/10.1007/s10639-023-11742-4
Zaheer, S., Ma, C., Zhu, Y., & Vasinda, S. (2025). GenAI in academic writing: Empowering learners or redefining traditional pedagogical practices? A systematic review from 2019-2023. International Journal of Artificial Intelligence, 1(1), 1–34. https://doi.org/10.4018/IJAITL.373582
Copyright of articles rests with the authors. Please cite TESL-EJ appropriately. Editor’s Note: The HTML version contains no page numbers. Please use the PDF version of this article for citations.