- AI tools hold potential for teaching and learning, but they must not replace students’ original thoughts and methods of communication
- A love of learning can help students remain committed to original work, while still using AI as an educational tool
- See related article: How AI-based learning is changing higher education
Since OpenAI launched ChatGPT in November 2022, discussion of the technology's influence on education hasn't stopped. Teachers continue to weigh the pros and cons, given the tool's striking ability to craft text that reads as if a human wrote it.
On the one hand, they understand that AI is a game-changer in the classroom and worth embracing as a new educational tool. On the other hand, its potential harm to learning can't be ignored.
As a teacher with a degree in linguistics, I'm not speaking only about academic integrity here. While some teachers worry about how to detect whether a student used AI to cheat on assignments, others' concerns go deeper:
We know that technologies affect how people read, write, and think. Reading has turned into scanning, and social media has changed our communication skills. Now AI tools may threaten writing itself: able to generate coherent essays, they negate the value of writing as a process that drives critical thinking.
To be sure, ChatGPT isn't uniquely to blame. It's simply the latest in a line of AI programs for generating and editing text, and it has revived the discussion with fresh vigor. We can already see how AI affects students' essay writing, especially in tech colleges, undermining the development of other essential skills.
Essays are a powerful educational tool. They promote critical thinking, teaching students to absorb ideas and build arguments. More than that, essay writing teaches students to consider different perspectives, form positions, and draw conclusions. If younger students rely on AI tools while learning to write, both their critical thinking and their motivation to learn could suffer.
Yes, sophisticated AI tools like Grammarly, which help with spelling and grammar and offer alternative wording, are great. But their adverse effects on younger students are also apparent: relying on predictive text, students lose spelling skills and become less curious about finding creative writing solutions of their own.
As one of my students said, “AI writing tools are for lazy people looking for easier ways out.”
When students rely too heavily on AI tools, they lose the skill of expressing thoughts and ideas in their own words. As a result, they struggle to build arguments and communicate their positions logically and coherently.
Another form of critical thinking AI can erode is judgment. When using ChatGPT for research, many students trust whatever information it provides. They may form a habit of filtering information through AI replies, even though AI tools are not that accurate:
Because these tools build answers from pre-existing online data, they can't evaluate its relevance or value. AI content generators can present false claims from the internet as fact and give biased or inaccurate answers to some questions.
Given that, teachers shouldn't encourage students' dependence on AI for fact-checking; it may lead them to base decisions on incorrect information.
If students become over-reliant on ChatGPT, they won't develop independent thinking and problem-solving abilities. They won't learn to evaluate which sources of information to use (or not) as essay references. Over time, this may also hurt their ability to think outside the box, which is essential for learning both in school and in out-of-classroom activities.
One more issue I’d mention here:
AI tools destroy a student's writing voice. Rather than developing a unique style, students lean on a tool's suggestions, which are, again, products of predictive text rather than creativity. While such texts are free of grammar mistakes, they read as "dry," "formal," and "emotionless."
That's what philosopher Evan Selinger worried about eight years ago when discussing predictive text's influence on how people would interact. Selinger warned that the technology "reduced the power of writing as a form of mental activity" and could "stop us thinking."
In plain English, texts and conversations become "more algorithm and less of ourselves." Tools like ChatGPT deliver a finished product: a grammatically perfect text lacking personal expression. When younger students complete writing assignments with AI, they don't learn to communicate; they miss practicing writing as a mental process, which is exactly what school writing assignments should teach. Worse, some students browse reviews of writing services, hoping to find someone to complete assignments for them.
Essay writing assignments should be an ongoing dialogue between teacher and student: they discuss the topic, the thesis, and first drafts, and the student rethinks concepts and revises arguments. The problem is that many teachers skip this dialogue:
They assign an essay, provide requirements, and that's it. While conscientious students may work through all of the above on their own, the temptation to rely on AI tools is high.
Our students are ready to trade opportunities to think and learn for the instant answers these technologies provide. Even if schools start teaching about AI tools' pros and cons, students will hardly "forget" that Grammarly or ChatGPT can polish writing assignments for them.
What should we do instead? Instill a love of learning in primary- and secondary-school students and help them find the curiosity and motivation to keep studying. AI isn't going anywhere, so let's recognize the opportunities it brings to education instead of blaming students for using it. It's time to re-imagine what we teach, how, and why.