Key points:
- Ample human oversight is required to navigate the use of AI
Diversity, equity, and inclusion (DEI) efforts are having a moment, and not in a good way.
Corporate America is grappling with how to execute its commitments to DEI initiatives while staying out of the crosshairs of those seeking to restrict such programs. Some companies are doubling down and fortifying their DEI efforts, while others are redirecting their path forward as they navigate this delicate environment.
Unfortunately, higher education is feeling the impact of shifting attitudes around DEI, too. Just last year, the Supreme Court struck down affirmative action in college admissions, and as recently as March of this year, DEI programs and positions have again come under fire at universities and colleges across the country.
Efforts to roll back DEI initiatives in higher education are especially concerning. The pursuit of higher education should be filled with opportunities–to hear and learn from people of varied backgrounds, to share new experiences with fellow students, and to gain knowledge and skills that empower and prepare students for their chosen futures.
While course offerings, events, and programs focused on DEI are under threat in some parts of the country, serving all students should be paramount. Colleges and universities have a responsibility to ensure that all students’ needs are being met, all students feel seen and heard, and all students–regardless of cultural background, identity, primary spoken language, or vision, hearing, or mobility differences–can access and participate in the learning experience, especially when so much of that experience is now taking place virtually.
Fortunately, technology is making some DEI commitments easier to achieve through artificial intelligence (AI). AI tech solutions have made impressive strides in connecting students with learning opportunities, and the tools at our disposal can meet students where they are, acting as a bridge to lectures, coursework, group projects, and much more. But we should not be too quick to adopt the latest tools for adoption’s sake, without vetting whether they genuinely deliver on their promises. And we must keep educators very much in the mix–to trust that any tool can replace human involvement is to gamble far too much with the college experience.
AI is a supplement to, not a replacement for, humans
The use of AI in education is not new; tools have already been deployed, for example, to speed up administrative tasks, identify trends in data reports, and customize student learning journeys. Used correctly, AI designed to empower and assist both administrators and students can be transformative, especially when the tools enhance diversity, equity, inclusion, and accessibility.
However, institutions grounded in helping students grow and learn must commit to using these tools ethically and with full transparency. Weaving AI into education carries a substantial responsibility: educators and administrators must become deeply familiar with not only its benefits but also its limitations–first and foremost, its inherent biases. That’s why, in tandem with incorporating a new AI tool, there must be a thorough understanding of the training (both training in how to use the tool, and the training of the tool itself), the data collected, and how that data is subsequently protected. And only by spending time with a new AI tool can educators determine whether it truly delivers what they need.
Any technology provided by a college or university as a tool that students will engage with cannot exist in a vacuum, and this is where the importance of human oversight cannot be overstated. Only a real live educator, facilitator, or administrator can watch for missed nuances and step in to right the ship when technology is hindering a student’s progress instead of helping.
People are complex beings with utterly unique personal needs, quirks, backgrounds, and expectations, and no AI tool can replace human interaction and intervention.
How AI can, and can’t, improve DEI in higher education
Let’s look at an example: a tool that helps educators identify where students are struggling by detecting emotion.
Emotion AI, also known as affective computing or sentiment analysis, uses video, audio, or text platforms to identify a student’s emotional signals while the student participates in a learning activity, and matches those signals with known emotions, such as stress, frustration, or confusion.
An educator might use Emotion AI in an online course to gauge how students are faring. Here’s how each modality works, with a simplified sketch of the text-based approach after the list:
- Video: Determines emotional states based on facial expressions, gestures, and other movements
- Audio: Listens for vocal characteristics (speed, tone, and volume) to determine emotional states
- Text: Analyzes written language to understand the sentiment and emotional tone of the content
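To make the text-based mode more concrete, here is a minimal, hypothetical sketch of lexicon-based sentiment detection in Python. The emotion keyword lists, threshold, and function names are illustrative assumptions invented for this article; commercial Emotion AI relies on trained machine-learning classifiers, not hand-written word lists.

```python
import re
from collections import Counter

# Hypothetical emotion lexicons for illustration only -- production
# Emotion AI uses trained classifiers, not keyword lists like these.
EMOTION_KEYWORDS = {
    "frustration": ["stuck", "annoying", "impossible", "frustrated", "give up"],
    "confusion": ["confused", "unclear", "lost", "don't understand"],
    "stress": ["overwhelmed", "deadline", "anxious", "stressed"],
}

def detect_emotions(post: str) -> Counter:
    """Count keyword matches for each emotion in a student's written post."""
    text = post.lower()
    scores = Counter()
    for emotion, keywords in EMOTION_KEYWORDS.items():
        # Whole-word matching so "lost" doesn't match inside longer words.
        count = sum(
            len(re.findall(r"\b" + re.escape(kw) + r"\b", text))
            for kw in keywords
        )
        if count:
            scores[emotion] = count
    return scores

def flag_for_follow_up(post: str, threshold: int = 2) -> list:
    """Return emotions strong enough to prompt a human check-in.

    The output is a cue for an educator, never an automatic judgment.
    """
    return [e for e, n in detect_emotions(post).items() if n >= threshold]

if __name__ == "__main__":
    post = ("I'm so confused by this assignment. The instructions are "
            "unclear and I feel completely lost before the deadline.")
    print(detect_emotions(post))     # Counter({'confusion': 3, 'stress': 1})
    print(flag_for_follow_up(post))  # ['confusion']
```

Even in this toy version, the division of labor matters: the code surfaces a signal, and a human educator decides whether and how to follow up.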
Emotion AI can be helpful in detecting when a student is struggling, so that student can get the additional support they need through human intervention. This can make a world of difference for someone whose background makes self-advocacy difficult.
The problem is that everyone expresses emotions differently. Emotion AI can gather insights, but like all technology, it is far from foolproof. A wide range of factors can affect its accuracy, including:
- A physical disability can affect facial expressions and body movements, which could cause Video Emotion AI to misinterpret emotions.
- Cultural differences play an important role in speech, behavior, gestures, and more. For example, eye contact is encouraged in some cultures, while avoiding eye contact is a sign of respect in others. Video Emotion AI could wrongly interpret a lack of eye contact as a red flag.
- Audio Emotion AI could misinterpret dialects and accents that alter speech patterns and pronunciation and, in some cases, change the meaning of words altogether.
Take an informed approach
So, given all these factors, how do we use and trust these and other AI tools when there are so many potential pitfalls?
First and foremost, ample human oversight is required to navigate the use of AI, both to minimize biases and to pick up on the nuance, context, and complexity of human beings as only a real person can. But other steps are helpful, too:
- Look for AI tools that have been trained on deep and diverse datasets to help the system be as adaptable, accurate, and inclusive as possible.
- Ask for feedback at every opportunity. Use anonymous and confidential surveys to gather student input and information about their experiences, accessibility needs, what’s working well for them, what needs to be improved, and when they would prefer to work directly with a person.
- Practice, practice, practice. Don’t blindly launch an AI tool without spending time with it and developing a genuine understanding of what it does, how it will be used, and how much human oversight it requires.
DEI is an ongoing journey–AI helps clear paths, educators lead the way
Higher education exists to help students grow, learn, and thrive. But when roadblocks make it difficult or impossible for a student to fully participate in lectures, test-taking, conversations, and activities, the college or university is failing that student.
AI clears many roadblocks to DEI, helping ensure that institutions work for everyone–but it isn’t perfect. It can manage data, but not nuance. It can predict academic success, but it cannot inspire learners. And it can help educators, but it cannot replace them.