AI in Education: Key Takeaways from the ASU GSV Summit

April 22, 2026

The ASU GSV Summit brought together education leaders, edtech companies, and researchers to discuss one of the most pressing questions of our time: how is AI reshaping learning, enrollment, and the future of work? Here are some of the main themes that emerged.

1. AI in the Classroom: From Experimentation to Practical Application

The higher education industry is facing a demographic reality: there are more colleges and universities than there are students to fill them. This enrollment cliff, more than a decade in the making, is pushing institutions to rethink how they talk to prospective students. Universities now have to be explicit about the outcomes students can achieve by attending, and tailor their pitch accordingly.

Then there is AI in the classroom itself. For the past three to four years, much of the conversation around AI in education has centered on experimentation, proof-of-concept projects, and reluctance among professors and teachers. That is starting to change. Broader adoption of AI tools in real classroom settings is now visible, and the most widespread use case is arguably the AI tutor: every student now has access to one.

But that accessibility cuts both ways. Students have always looked for ways to make learning easier, and AI takes that one step further, sometimes too far. The concern, repeatedly raised, is that students are using AI in ways that bypass actual learning. The challenge, as one panelist put it, is harnessing the power of AI to help students without replacing the learning itself.

There’s also a gap between institutions. For regional universities, the reality of implementation is completely different from what flagship schools experience. Faculty often tell students how to use AI without knowing how to incorporate it properly themselves. AI should reduce the pain of learning, not eliminate it entirely; there still needs to be effort: no pain, no gain. The goal is to find the equilibrium point that pushes every student to their maximum potential without losing them. This relates to one of the most nuanced conversations at the summit, around what’s called “productive struggle”: the idea that the effort required to learn something is not a bug, it’s a feature.

Productive struggle is built on several components: retrieval effort (the cognitive work of pulling something from memory), depth of processing (transforming an experience into long-term memory), error-driven learning (figuring out what went wrong), and generativity (the way experience continuously changes the brain). The concern is that AI is stripping out precisely these productive pieces.

When using AI, you sometimes get answers before you deserve them. There are good AI tools and bad AI tools, but the complexity of the learning ecosystem means no single tool can fully measure how a student is learning. The human in the loop is too often forgotten. The question of rigor in an AI-assisted environment is unresolved, but some practical approaches are emerging, for example: having students record and explain their thinking, with AI then providing feedback and a score on that explanation. These shifts are beginning to shape measurable AI-assisted learning outcomes.

2. AI is Transforming How Students Discover and Choose Universities

AI is also changing how prospective students find and evaluate schools. The discovery phase (figuring out what’s right for you) and the search phase (finding where to go) are increasingly separate processes, and most enrollment journeys are now student-led.

That said, there are still critical moments where students need to talk to a real person. The opportunity for institutions is to build AI-connected journeys that guide students toward the right information and identify the right moment to introduce a human in the loop. Students don’t want open-ended experiences; they want something clickable and guided. The human handoff remains the tricky part.

AI search engines operate differently from traditional search algorithms: they draw more heavily on the content institutions produce, which means being discoverable through strong, relevant content is increasingly important. Institutions that plan for this have an advantage in reaching students early in their decision-making process.

3. Immersive Learning: The Case for XR

In this AI-first era, the conversation keeps returning to a core set of human skills: resilience, strong communication, and adaptability. After years of transformation, first the pandemic, now the AI boom, these qualities are more critical than ever, highlighting the growing importance of soft skills development in education.

Looking ahead to 2030–2031, the consensus is that employers will need people who can think critically, communicate clearly, and keep learning. The framing is increasingly soft-skills oriented: learn how to learn, learn how to engage in critical thinking, learn how to communicate. Humanistic studies and practical job preparation are not seen as opposites; they need to coexist. Teaching programs will need to be adaptable and built around structured thinking.

Part of the conversation focused on immersive learning, AR, VR, and mixed reality in the classroom. The data shared was compelling: studies show a 22% higher likelihood of retaining information when it is embedded in a narrative, a 12% improvement in student outcomes from VR integration in the classroom, and 20% more engagement. In controlled comparisons of VR versus desktop learning, more students retained information in VR, with fewer distractions and higher reported enjoyment.

That said, the panelists were careful not to oversell it. Immersion is not limited to AR, VR, or mixed reality; there is no single way to implement it. The failure mode is content that feels game-like but isn’t truly engaging or embedded in storytelling. The principle for educators: technology for a purpose, not technology for technology’s sake. The strongest use cases are tasks that are difficult or dangerous in real life, such as virtual simulations of offshore wind turbine repair, where the immersive environment unlocks experiences that would otherwise be inaccessible.

The burden on educators to curate and choose the right experiences is real, and content developers have a significant opportunity to fill that gap.

4. AI in Research: From Restriction to Enabler

On the research side, AI and cloud computing, including quantum computing, have transformed what’s possible. What was once a constraint (computational needs) has become an enabler, allowing researchers to focus on what they do best. The shift has changed how researchers think about their work entirely.

One model that emerged from this: an AI ecosystem that brings together people who haven’t worked together before, cutting across application and computational layers. The key insight from one university initiative was to remove tech complexity from the equation and create a cross-collaborative environment, enabling faculty in non-tech domains to engage with AI tools in ways that are accessible and scalable.

Partnership is critical here, and so is bridging AI with business functions like HR and finance. The cornerstone is literacy, getting people to think about what technology can do, making them willing to try and explore. AI fluency across the campus, including microcredentials and hands-on project participation, is how institutions are building that capacity.

More Fuel for the AI Fire

Several panelists were candid about pitfalls. Chasing shiny new tools was flagged as a common mistake; agentic AI, for example, is not as easy to implement in practice as its advertising suggests.

Additionally, one of the bigger challenges highlighted was change management: AI can process and surface enormous amounts of information, students are learning differently, and teachers’ roles are changing, too. Simply handing employees or faculty AI tools isn’t enough if there’s no clear incentive or focus on impact and outcomes.

Privacy and data standards also came up. Even when companies have good intentions, a framework is needed for thinking about how data is used, one that accounts for the fact that regulations vary by geography and are not harmonized, and that both students and institutions want to own their data.

The final note of caution: for developing brains, the question of whether AI causes harm is still open. Technology is a tool, and its impact depends on how it is used. The body of research is growing; however, it’s important to keep in mind one consistent theme in using AI: to bring people together and foster connection, because rich interactions are exactly where real learning and teaching happen.
