World AI Week represents a unique moment to reflect on how AI has changed the way we work, live, and learn. Many brilliant minds and progressive companies are gathering for the World Summit AI (WSAI) in early October to discuss AI’s potential, and among the many topics of discussion, the role of AI in education will certainly be a hot topic.
The global COVID-19 pandemic greatly shifted the landscape of technology in education. Overnight, learning worldwide became remote and distributed. Once-simple tasks like distributing and collecting homework or administering an in-class midterm became complex technological challenges for which our collective tools fell short. However, we also made great strides in closing the gap, and along the way developed, improved, and scaled our technology and products in ways we never thought possible. We’re proud of how our AI helped instructors by grouping similar answers to accelerate feedback, how our state-of-the-art handwriting recognition brings a new level of offline-to-online accessibility to our platform, and how our writing AI helps students improve the way they write and use citations.
To add a new perspective to this conversation, I asked a number of experts on AI in education, as well as two faculty members, for their thoughts about AI’s role and potential in education.
Diving deeper into employing AI as an assistive technology, I first asked Rich Ross, PhD, an assistant professor at the University of Virginia, for his thoughts. He told me what I’ve heard time and time again from faculty: “While many are developing ways to have AI replace interactions, I think that AI should be viewed through the lens of maximizing each interaction: students interacting with each other, and instructors interacting with students.” When I asked him about points of caution, he added, “To me, the biggest danger with AI and related tools is that we’ll reduce the quality and frequency of these interactions without reducing the tedium of many tasks that we should automate. Ideally, AI would automate most or all of the ‘low-impact’ activities and allow me to maximize time spent helping students understand and synthesize the concepts in my courses.”
To get a business leader’s perspective, one with a long list of credentials in AI, I asked Nathan Thompson, PhD, the CEO and Co-Founder of Assessment Systems, to chime in. “The role of artificial intelligence in higher education is still not fully explored and realized, even though the pandemic certainly hastened the adoption of many technologies,” he said. “The greatest opportunity is that it can help bring quality education to more people by improving relevant components such as placement, instruction, and assessment. For example, adaptive placement testing and adaptive learning could drastically reduce the time needed for a student to achieve a certain level of skills or an educational degree.”
True to his insightful nature, Nate anchored his comments with a personal connection. He said, “The greatest danger, or perhaps remaining opportunity, is how to best leverage the human element of higher education. Personally, the greatest value I received was not what I learned in textbooks, but the mentorship from key professors that guided me into my future profession that I now love, and changed the way that I think. How can AI make faculty more efficient so that they continue to make such an impact?”
Another faculty member, Jenny Amos, PhD, a teaching professor and Laura Hahn Faculty Fellow at the University of Illinois Urbana-Champaign, gave me very specific guidance on where she thought AI had the greatest potential. “Artificial intelligence can be used to help us determine student pathways to success that were not readily apparent,” she said. “For instance, is a particular pre-requisite course really needed to proceed in the curriculum? What other potential pathways exist? This type of thinking could have huge advantages to rethink pre-requisites in terms of whole courses and, instead, see them as a list of skills and knowledge needed with more flexible pathways to achievement.”
Jenny, who has a unique ability to connect data acumen with human insight, added this: “A potential danger of AI in education is an overreliance on algorithms to predict final course scores early in a course. These algorithms do not take into account the growth potential in the human mind to change behaviors and change the course moving forward. While input from algorithms like these can be used to notify faculty of student progress as early warning signs, they also represent confounding sources of information and are not deterministic on their own, just a starting point for conversation.”
And finally, I asked Melvin Hines, the CEO and Founder of Upswing, a platform that connects at-risk students with campus services. What he said is an even stronger testament to our collective hope that AI enhances human relations rather than supplants them. “As colleges are struggling to do more with less, and particularly as community colleges find themselves competing increasingly with well-moneyed national institutions for students in their local neighborhoods, AI tools will move from being a nice luxury to a survival necessity. However, AI can’t be used to replace the human touch.”
Melvin added this important note: “As within other industries, forcing students to interact with inadequate AI tools will only serve to increase student dropouts. My dream would be for AI to be used to help students navigate the myriad of problems they must overcome in order to be a student. It would also allow administrators to recognize where students face barriers and how they can decrease them.”
Reflecting on these insights, I think the fulcrum here is timing. At this moment in time, the use of AI should be to reduce repetitive administrative tasks and streamline whatever hinders the opportunity for faculty and students to meaningfully interact with each other.
That is the heart of our technology and the new applications we are developing. My caution is that AI is not infallible when asked to perform complex human judgments, especially ones where a human decision-maker would blend intellect with empathy and intuition. AI models are trained on datasets of human behavior, and if biases exist in the data, the models will build on those biases to make predictions, because that is literally what they are incentivized to do. AI can therefore amplify underlying bias when it is used without care. Institutions can help mitigate these issues by working with companies and partners committed to building fairness, accessibility, and inclusivity into every aspect of product design, rather than treating them as an afterthought.
Recent events have brought conversations around the role of AI in human societies to the forefront, and I’m sure this will be a hotly discussed topic at the World Summit AI. A question I’ve been asked repeatedly is whether the AI at the center of the controversy is broken and needs to be fixed. In truth, that AI, like many large, fully automated AI systems, is doing exactly what it was created to do: drive engagement. Thus, my advice is this: think deeply about how we are incentivizing our AIs to behave through the cost functions and mathematical architectures we design. What might AI designed to enhance human potential, rather than automate away human interaction or choice, look like? What would a world in which AI uplifts all people and levels historically uneven playing fields look like?
I believe that such a world is not only possible, but just around the corner; however, to get there we must recognize that unlike traditional software, where people program computers, AI without human oversight tends to program people. We should focus our attention on building AI that works with and for people to help every person in the world fulfill their own enormous potential and improve learning outcomes.