Key points:
- AI should amplify curiosity rather than erode it
Higher education stands at a defining moment. For decades, universities have been shaped by a dual mission: to disseminate knowledge through "teaching" and to create it through "research." Too often, these functions have been treated as separate enterprises, teaching for undergraduates and research for graduate students and faculty. But this division misses a deeper truth. At its core, research is not just the creation of new knowledge through deriving and interpreting data and discerning trends; it is the most powerful way of learning we have ever known.
Discovery and inquiry embody the human spirit of curiosity: the drive to ask, to test, and to fail, and the joy of making connections and developing understanding. Discovery and inquiry, the basis of research, transform education from rote memorization and the "sage on the stage" model into curiosity and creativity, enabling critical thinking at its best. The thrill of asking a question and the satisfaction of answering it oneself, the frustration of false starts and failed experiments, and the exhilaration of insight: these are the experiences that produce deep and lasting learning. When students at any level are invited to engage in research, not simply to learn what others have discovered but to inquire, hypothesize, test, and reflect, they learn in deeper, more enriching, and more enduring ways.
Much has been written about AI as a tutor, a note taker, and a generator of summaries, or even about its perceived role as a catalyst for cheating and a loss of creativity. But its deepest implications may not lie in these areas. They lie in how AI changes the very nature of inquiry: how questions are asked, how evidence is tested, and how insights are generated and communicated. In short, AI could well be the best means we have of enhancing learning at scale. It has the potential to reposition discovery as the center of education itself, making inquiry a lived experience for all learners while also transforming the practice of advanced scholarship. To harness this potential, we must rethink what it means to learn, to research, and to teach in an AI-enabled world. If we seize this moment wisely, AI can catalyze the re-envisioning of higher education, recentering discovery as the core of learning for all students while accelerating the creation of new knowledge. If we do not, we risk producing graduates trained for a world that no longer exists and research cultures built on expedience rather than integrity.
From repetition to insight
Much of research training mirrors traditional education: rote memorization, repetition of problems until mastered, then small steps forward and variations on a theme. Graduate students are often tasked with running the same analysis dozens of times, recalibrating parameters, and combing through the literature to extract relevant citations. These tasks, while necessary, can consume disproportionate amounts of intellectual energy and time, replacing innovation and creativity with forced incrementalism. Students do learn patience, persistence, and even resilience, but they rarely engage in creativity or critical thinking for any sustained period.

AI shifts this balance. It can run the repetitive analyses, scan terabytes of information in moments, and identify patterns across massive datasets. This does not trivialize research or learning; it liberates it. When a machine takes on repetition, humans can focus on interpretation, hypothesis refinement, and the next step of discovery. This does not diminish the value of the routine work but allows a student to refocus effort on what is truly important. For students, this transition can be profound. Instead of equating learning with drudgery, they can engage more quickly in the joy of conceptual exploration and understanding through "what-if" scenarios.

The challenge for universities is to redesign curricula and mentoring so that education remains rigorous: not less demanding, but differently demanding. Students must learn how to use AI critically, when to trust its output, and how to interrogate its assumptions. Training must focus less on process execution and more on judgment, creativity, and ethics.
If this balance is not struck, two dangers emerge. Students may become overly dependent on AI, accepting its outputs uncritically, or they may be excluded altogether, unprepared for an increasingly AI-driven world. Both outcomes undermine the essence of education and learning. The opportunity is to prepare a new generation of learners and scholars who engage as collaborators, human to human, human to group, and human to machine, while retaining direction and ownership of discovery and of the extent and process of learning.
From silos to synthesis
One of the perennial challenges in education and research is disciplinary siloing. Universities celebrate interdisciplinarity, yet real collaboration is constrained by disciplinary bounds, from programs and departments to colleges and centers. Often explained as requirements of accreditation or the job market, these constraints have more to do with maintaining archaic institutional structures, and with the revenue needed to keep faculty employed in programs that may be obsolete and unresponsive to the current and future workplace. Similar structures exist in research. Both hamper the advancement of learning and discovery by blocking learners' ability to cross disciplinary bounds.

AI, however, opens new vistas for inter- and multi-disciplinary engagement, moving away from artificial constraints and enabling learners to benefit from the synthesis of information pertinent to the question or hypothesis under consideration. Progress is also often hampered by fundamental misunderstandings: terms mean different things across fields. A model in political science is not the same as a model in structural engineering. AI can mediate these divides by drawing on vast cross-disciplinary corpora, mapping the semantic terrain, disaggregating meanings, and identifying where concepts overlap or diverge. More than translation, this is synthesis, connecting methods, frameworks, and insights across fields that would otherwise struggle to collaborate.

Imagine a research team working on urban resilience, or a group of students studying the resilience of cities. Civil engineers model infrastructure performance, sociologists study community adaptation, economists rationalize resource allocation, and public health scholars assess well-being. Traditionally, each discipline "teaches" within narrow bounds and contributes its perspective, with integration happening, if at all, at the margins.
With AI as synthesizer and interpreter, connections can be drawn more systematically: how infrastructure failure affects social cohesion, how resource constraints interact with health outcomes, or how design decisions cascade across systems. The result is not simply additive. It is transdisciplinary learning and discovery at its best, where systems-level understanding, and even new knowledge, arises at these intersections. AI makes the synthesis more feasible, but it does not do it alone. Human scholars must still exercise judgment, ensuring that connections are meaningful rather than superficial. The opportunity is to reimagine collaboration as genuinely integrative, with AI as the catalyst.
From assumption to critique
All learners benefit from constructive criticism and critique. In fact, the most effective learning often takes place when a student receives guidance on a first attempt or draft and is encouraged to improve the report or assignment. Similarly, in research, the peer review process, while imperfect, serves as a safeguard, testing claims and strengthening arguments. Yet peer review is slow, uneven, and limited by the availability of qualified reviewers. Learning through repeated rounds of review requires significant time and qualified faculty, so this mode is rarely used as a means of learning. AI changes that, enabling critique earlier and more consistently: testing the coherence of arguments, identifying logical gaps, flagging inconsistencies between data and claims, and comparing drafts against the existing literature to highlight overlaps or omissions. This is not about replacing people, whose expertise and judgment remain irreplaceable, nor about reducing the work or thought process of the learner. Rather, it is about embedding critique into the learning and research process itself for faster improvement and refinement. For students, this introduces a valuable learning loop: immediate, constructive feedback that teaches not only what works but why, and that helps them build arguments to support their direction, choices, and hypotheses. The risk, of course, is that AI critique may become formulaic or overly conservative, reinforcing existing patterns rather than enabling creativity and innovation. Safeguards are needed to ensure that AI challenges assumptions without constraining creativity. But the potential is immense: AI can normalize a culture where critique is not dreaded but welcomed as part of inquiry and learning.
From scarcity to exploration
Historically, discovery and learning have been constrained by scarcity. Time, resources, and even computing power limit how many hypotheses can be tested, how many scenarios explored, and how many models simulated. Scholars must choose carefully which questions to pursue, often abandoning promising leads for lack of capacity. Similarly, learners must restrict themselves to one or two scenarios, extrapolating from these to an entire body of knowledge. AI transforms this scarcity into abundance. It can simulate dozens of scenarios in parallel, generate variations of hypotheses, and analyze multiple streams of data simultaneously. In climate modeling, this means testing innumerable "what-if" cases to understand potential futures. In structural engineering, it means simulating multiple collapse scenarios to learn not only what failed but what could fail under different conditions. In genomics, it means scanning vast datasets to identify patterns no human could detect unaided. Consider the power of learners at any stage having access to such tools. This abundance does not make human researchers obsolete; it makes their judgment more essential. Faced with a flood of possibilities, scholars must decide which pathways merit pursuit, which are distractions, and which open new frontiers. For students, this abundance offers something equally important: the freedom to fail safely. AI-enabled simulations allow learners to test ideas, pursue dead ends, and learn from failure without the high cost of physical experiments or years lost to unproductive paths. In this sense, AI can democratize the joy of discovery, allowing inquiry to flourish not only in research laboratories but in classrooms and learning spaces.
From isolation to communication
Research does not end with discovery; it must be communicated. Yet academia has long struggled to translate specialized findings into accessible knowledge. Dense prose, disciplinary jargon, and limited outreach leave much scholarship disconnected from the public. AI can help bridge this gap. It can generate summaries for different audiences, adapt writing styles, and create visualizations that make complex concepts comprehensible. Learning to communicate discovery is part of learning itself. By working with AI tools that generate drafts, visuals, and translations, students can practice refining and contextualizing their work. AI becomes a partner in teaching not only how to discover but how to share discovery responsibly. Again, responsibility and agency are key. Simplification must not become distortion, and accessibility must not come at the cost of accuracy. Scholars must guide AI outputs, ensuring that they enhance clarity while preserving integrity. Done well, AI can help reconnect academia with society, rebuilding trust in the value of higher education and research and ensuring that true learning through discovery and inquiry flourishes.
Research as the core of learning
Perhaps the most important shift AI enables is not about research, but about education. If discovery is the most powerful form of learning, then research should be at the heart of the student experience. Too often, students are trained to absorb knowledge rather than to generate it. They memorize facts but rarely learn to ask questions. They reproduce answers but seldom test hypotheses. This is not because they lack curiosity, but because the structures of education do not prioritize inquiry. AI offers a chance to change this by lowering the barriers to discovery and inquiry, providing tools for simulation, analysis, critique, and communication. AI makes discovery accessible at the earliest stages of education. Undergraduates can engage in inquiry not only in a mandatory capstone project or the rarer structured summer undergraduate research experience, but as a mode of learning from the outset. Even high school students can test ideas, simulate outcomes, and learn through exploration. This democratization of inquiry does not diminish the rigor of learning; rather, it strengthens it. When discovery becomes the norm of education, students enter graduate study and professional life with habits of curiosity, critical thinking, and resilience. They see learning not as passive consumption but as active participation in the creation of knowledge.
The implications for institutions are profound if research is recentered as the core of learning. Universities must reenvision their structures. This means training students not just to conduct research, but to lead inquiry in an AI-driven world. It means embedding inquiry into undergraduate education, making learning by doing and through active discovery the default mode of learning. It requires a shift from lecturing to learning, and it requires equipping faculty to guide discovery with AI as a collaborator rather than a competitor. It also means redefining outputs, recognizing that knowledge should not be measured only in final exams completed or papers published, but in questions pursued, hypotheses tested, and insights generated collaboratively. Assessment then rests not on a single grade at the end of the semester but on outcomes and the demonstration of learning. In short, the research university of the future must become an inquiry university, where learning and discovery are inseparable and where AI is integrated as a catalyst, not as the end point.
If we allow AI to become a substitute for curiosity, automating answers without cultivating questions, we will destroy the essence of inquiry. But if we harness AI platforms as collaborator, interpreter, critic, and simulator, we can recenter research and discovery as the foundation of learning for all students. The joy of discovery, the moment of insight when patterns emerge, when hypotheses hold, when ideas crystallize: all of this remains profoundly human, and it is what drives scholars across all disciplines. AI can accelerate that process, but it cannot replace the human experience of wonder. The task of higher education is to ensure that AI amplifies curiosity rather than erodes it, expands vistas rather than narrows them, builds critical thinking rather than stymies it, and strengthens the integrity of learning rather than undermines it. Learning, rather than teaching, must once again become the heartbeat of education. AI, guided by human judgment and responsibility, can serve as its most powerful catalyst.
