Artificial intelligence (AI) is certainly not a new concept in education, but recent advances in AI capability, exemplified by OpenAI’s ChatGPT, raise important questions for educators. Given ChatGPT’s advanced abilities, many higher-ed educators are concerned about academic integrity and wonder how best to integrate the tool into their instruction without banning it outright.
A panel of academic leaders and faculty, moderated by Anthology academics, discussed these very issues and examined ChatGPT’s impact on instruction, academic integrity, research, and student support.
Should higher ed view ChatGPT and other AI chatbots as cheating? Are there other approaches to the issue?
“I think [higher-ed faculty] are going to be very worried about students plagiarizing,” said Bryan Alexander, Senior Scholar at Georgetown University. “I don’t think this is the entire population, but I think the plagiarism arms race is definitely off and running, and will continue to run for some time.”
“My experience has been that faculty come from one of three perspectives,” said Suzanne Tapp, Assistant Vice Provost of Faculty Success and Executive Director of the Teaching, Learning, and Professional Development Center at Texas Tech University. “Maybe it’s a ‘fight it’ perspective where they’re concerned, with good reason, about what happens to academic dishonesty with the entry of such easily accessed AI tools. Maybe it’s the ‘come use it’ perspective and they’re ready to jump in, or maybe they’re somewhere in the open-minded middle, watching to see what happens. And I think it’s fair to approach it from any of those perspectives.”
There are some amazing and innovative AI-related practices emerging from faculty. What have you seen from your colleagues in higher ed?
“It may seem simple, but it’s not. Using ChatGPT as an umbrella term for a wide range of tools, it suggests a paradigm shift and reminds us to ask students to be critical thinkers and risk takers,” Tapp said. “We’ve had a shift in education, away from teacher-centered pedagogy toward student-centered pedagogy, and AI is going to push us further that way. I’m observing faculty on campus doing this supposedly small thing that’s actually quite big—they’re asking students what they think about ChatGPT, what they want to do with it, and how it changes our world.”
“There are lots of different ways to approach this, and we’re seeing this reflected in faculty responses,” Alexander said. “If we think about ChatGPT and similar chatbots, they produce text. In higher education, we tend to think of that as writing papers. Do we have students use ChatGPT as co-authors? Another approach is to turn things around and have ChatGPT write a paper for us and have students critique it—what’s wrong with it? What’s right? A third is to be even more creative and press the limits of what ChatGPT can do.”
Pushing ChatGPT’s limits, Alexander said, might look like this:
- Write a paper about the impact of the greenhouse effect on economies in sub-Saharan Africa, then write it in Shakespearean, then make it rhyme.
- Think about ChatGPT as a more multi-capacity tool, not just for writing. For example, have it perform work as a simulation tool. Create a teaching simulation, simulate students, set up a classroom, assess how the instructor did, reproduce how they respond, and show the effects of that.
“I think we see quite a few different options—as a personal trainer, as a tutor, as a co-creator of something other than writing. There’s a lot of room for exploration,” he added.
AI can certainly play a role in automating the “easy things,” freeing up instructors’ time for other tasks, said Szymon Machajewski, Assistant Director of Learning Technologies & Instructional Innovation at the Center for the Advancement of Teaching Excellence at the University of Illinois Chicago. For example, an instructor could ask ChatGPT to create a five-question multiple-choice quiz for students or to personalize certain assignments and tasks based on students’ needs. But where it really shines, he said, is in accessibility.
“The biggest opportunity I see so far is in accessibility. Think about a student using JAWS or NVDA and browsing your course page takes a long time. Now, the student can have AI process the page and ask questions about [elements such as] images,” Machajewski said. “For a student with disabilities to be able to have a dialog with a textbook and with content the instructor has deployed—it’s tremendous progress. People with dyslexia are probably going to shine—the step of reading and comprehending will be turned upside down; you can immediately follow up with questions.”
How do we find a balance between AI and human intelligence, and how do we work in harmony with these systems moving forward?
“When I think about what I want to do with my students, I think about critical reflection, collaboration, and critical thinking,” Tapp said. “These are the skills I want to hone in my students. There are other tasks AI can do for me, such as checking knowledge and providing routine feedback. I’d rather use AI to do some of the ‘busywork’ of teaching [to let me] emphasize those deeper skills with my students.”
“I think it’s abundantly clear that one of the huge things AI will do is tutor,” Machajewski said. “Looking back at Bloom’s two-sigma problem, the most productive way you can raise a grade by two letter grades is one-on-one tutoring. How can we [ensure] that AI is mature enough so that when students explore it, they’re actually benefiting? The idea of tutoring one-on-one, digging deep, and having interactive experiences with a course will be huge.”
“Tutoring has long been a dream of people in tech. This is something we’re already starting to see unfold,” Alexander said. “I think we’ll definitely see a simple grading replacement—a swap of software for human time. That may lead to reducing work for faculty.
“We have to, in theory, teach spelling and basic grammar. A tool like ChatGPT can help students get going, and we can focus on other things like student voice and creative writing,” Alexander added. “We really have to rethink assessment as a whole. There’s potential for a great deal of creativity. I’d love to have students share their opinions and be involved in the design process.”
Will ChatGPT replace faculty? What’s the role of faculty as AI chatbots progress?
“Education is very much a human process,” Machajewski said. “Academics don’t need to be afraid of replacement, but we do have to learn who is teaching the AI. That’s going to be a main predictor of how the models work. Certainly, bias will be a big part of that. The teachers of AI, and who will assume that role, are more important factors than if AI will replace faculty.”
How do we make sure that what we are leveraging these tools to do isn’t inherently biased?
“There are several ways. One is to have more diverse representation among the teams constructing these tools,” Alexander said. “Another is to be conscious of the issue. The third is to work on the source material we get and make sure it’s equitable. That can take some time. I’m heartened that ChatGPT and others have demonstrated guardrails that try to reduce that kind of thing.”
“We have a responsibility as educators to help our students understand that we’re working with something that in most cases is predicting text, not generating knowledge,” Tapp said. “And we have to ask our students to ask questions, to identify how that text was generated.”