For universities, adopting AI means returning to their core purpose
True value will come from universities looking back to their origins: prioritizing student-teacher-scholar connections, nurturing cross-disciplinary learning and independent thinking, while leveraging the new technology
[Part of this post was a draft letter to the FT written in late Dec 2025]
Facing the inevitable, artificial intelligence (AI) in higher education is met either with absolute fear of the destruction of the entire system or with an idyllic calm about a smooth transition into a blissful future. And there is no lack of informed (and less informed) commentary on the subject (including one of my co-authored articles, which studied the economic impacts of AI on human flourishing, engaging with the works of classical economists and connecting them with more contemporary trends).
The concerns are real. Students are said to rely too heavily on AI tools in their homework (some call that cheating). Within seconds, generative AI can produce almost anything, from short notes to long, complex essays to slide presentations, and can come to the rescue in answering exam questions. All of it happens instantaneously, without requiring even a minute of the user's intellectual effort.

Live AI prompts have also helped job seekers wrestle through challenging virtual interviews. Examples are many and will continue to multiply. The fear, though, is tangible: defaulting to AI for assistance with even the smallest of tasks seems to be eroding our ability to think freely.
And it is increasingly becoming not just a "student problem." The generous use of generative AI has crept into academic research as well. Last year, for example, I received at least THREE referee requests where it quickly became apparent that the entirety of each paper was AI-generated, even preserving ChatGPT's odd formatting. Mind, those requests came from respected journals.
Books and book proposals are probably next. So it is going to get worse, as the saying goes, before it gets better. For behind the veil of convenience of AI-generated responses hides an invisible influence that rigidly frames and directs our thinking as scholars.
But let’s come back to the question of AI and higher education.
The two contrasting viewpoints above, destruction and bliss, are well captured in Andrew Jack's recent opinion piece ("Students embrace AI as schools tread carefully", FT.com, December 3, 2025), reflecting on business schools' attempts to adapt to the AI push. The challenge is urgent: just what should the AI adoption model be in today's business schools and, more generally, universities?
The answer seems to be twofold. First, there can be no expectation of a single universal model "that works," despite some high-level common trends. This is due to the multifaceted diversity and distinctiveness of each higher-education institution in the modern competitive space. Lest we forget, higher education (especially in business fields) is an industry as well, filled with self-awareness, ambition, material rewards, and, naturally, competition.
A wide range of outcomes is more likely over the next few years, probably much sooner than the two decades some predictions suggest. The aftermath of this individualistic approach across universities and business schools will be reflected in the changing content and quality of educational offerings, as well as in future graduates' job-market preparedness. It is a dynamic process.
And it is this inherent dynamism of the emerging technology's onset on higher-education systems that leads to the second part of the answer. In sprinting for the ultimate AI adoption strategy, universities would do well to remember their original core purpose as communities of scholars, charged with the preservation, development, and transmission of greater knowledge, strengthening each student's intellectual creativity and capacity for independent thinking, both in general and within specific academic fields.
To that end, bringing technical advancements into classrooms; encouraging students to rely on AI tools for background research or for running empirical tests helpful in complex systems analyses; promoting experiential learning practices; and other such initiatives are the necessary steps forward. At the same time, simply attaching "…and AI" to the end of a new [or existing] course name won't do the trick.
The true value, and the game-changing AI-adoption practice, will instead come from universities looking back to their origins: prioritizing real-life student-teacher-scholar connections; nurturing empathy, creativity, and broad cross-disciplinary learning; and emphasizing independent, informed critical analysis, while leveraging the full potential of the emerging technology.
For today's standardized curricula this approach is going to be difficult, complex, and energy-demanding for faculty, students, and administrators alike. It would require attention to detail, and it would place much trust back in faculty hands to mold a well-rounded, socially conscientious, and relevant educational proposition (as opposed to automating pedagogy).
Such an approach would also require maximizing direct student-teacher engagement (smaller class sizes that require new faculty hiring?) and expanding the diversity of engagement techniques. Examples are many and may include (but are certainly not limited to) frequent live classroom debates and audience-engaging presentations (e.g., student-led thematic interview podcasts) instead of true/false quizzes in business classes; and introducing oral examinations, which seem to be gaining support (though how to administer them is a puzzle that needs another conversation).
Elsewhere, one could return to in-class handwritten essays (where the motions of writing stimulate the brain's neurons, aiding memory and learning) instead of multiple-choice exams.
The return to the analog is actually a step forward: it is pedagogically more effective in its broad developmental impacts than the narrow acquisition of specialized knowledge. Other examples could include moving away from a points-based grading system to a no-letter-grade model, as is already being tried in some places, and more. The aim of such programs is to train eventual job-market candidates capable of withstanding the idiosyncrasies of professional ups and downs over time, with a versatile skill set that opens paths to meaningful careers.
And while some readers may agree with the above (though, perhaps, not all), as recent commentators have written at greater length in suggesting various pedagogical improvements in the face of AI pressures, there is only so much that universities can do on their own. Perhaps the key to multi-modal AI's organic integration into our education systems lies in pre-university experiences. And in that, universities and [high] schools should find each other as mutually interdependent partners (but that, too, is a separate conversation).
Artificial intelligence is a net positive for modern society and the economy. Yet it risks deskilling today's education systems. The next few years will define just how the current phase of AI adoption in education evolves, with obvious structural ramifications over the longer term.
The solution is, in some ways, to go back to the analog world: back to the university's conceptual core as an open forum of scholarship. The key is to center its educational and pedagogical potential on achieving real-life, informed, open-minded, and analytical human interactions, cultivating people capable of independent thinking even while increasingly relying on the new technology.

