This post is the fourth in the Beyond ChatGPT series about generative AI’s impact on learning. In the previous posts, I discussed how generative AI has moved beyond text generation and is starting to impact critical skills like reading and note-taking, and to reshape relationships by automating feedback. In this post, I’ll cover how the technology is marketed to students and educators as a tutor. The goal of this series is to explore AI beyond ChatGPT and consider how this emerging technology is transforming not simply writing, but many of the core skills we associate with learning. Educators must shift our discourse away from ChatGPT’s disruption of assessments and begin to grapple with what generative AI means for teaching and learning.
Beyond ChatGPT Series
Note Taking: AI’s Promise to Pay Attention for You
The Automation of Learning
The risks of adopting generative AI and putting it into our classrooms without structure or guidance aren’t just regulatory or based on data privacy. These new models are no longer simply models of language (LLMs) but have developed into Large Multimodal Models (LMMs) that use simulated voices, vision, and algorithms designed to mimic the human expression and understanding of emotion. Tyler Harper’s recent essay for the Atlantic, “The Big AI Risk Not Enough People Are Seeing,” lays bare the stakes of widespread uncritical AI adoption throughout society: “Artificial intelligence could significantly diminish humanity, even if machines never ascend to superintelligence, by sapping the ability of human beings to do human things.”
This week, Microsoft announced a new partnership with Khan Academy to bring its Khanmigo chatbot to educators for free. With OpenAI also committing to release its new GPT-4o model for free, the last financial barriers to adopting LMMs are falling away. One thing was clear from OpenAI’s GPT-4o demo—they want ChatGPT to be even more frictionless for users.
Friction Matters in Learning
The problem with using AI assistants as tutors is how quickly certain people have leapt from using the tool to augment teaching to embracing it as a full-on replacement. This is most apparent in the strange experiment at the Alpha school in Austin to replace teachers with AI and relegate human beings to the role of motivational “guides.”
The founder of Alpha, MacKenzie Price, sees teachers as redundant now that AI can provide instruction for students. She envisions a future of education where students learn for only two hours a day via AI and complete self-directed tasks on their own, with guides helping students realize their potential. Instead of addressing the human elements that make instruction complex, this model embraces AI as a way to bypass the labor involved in learning.
It isn’t too far of a leap to see a system like this being adopted on a much wider scale, now that OpenAI is marketing their new GPT-4o’s multimodal capabilities as being able to empathize with users by watching their facial expressions. Replacing teachers with bots is a dystopian solution to the problems that plague education and shouldn’t be uncritically embraced.
One way we learn is through friction. Contending with experiences that require multiple steps, time in between, and the application of prior knowledge to new knowledge helps ensure students actually learn material, while reflection asks them to pause and take account of what that learning meant to them. We have no idea what the downstream consequences are of inviting AI to supply an answer every time a student has a question. All the marketing from these massive AI firms talks endlessly, and often in circles, about how this marvelous technology will usher in a new era of human flourishing—but what flourishes when we automate learning itself?
It is the Wild West of AI in Schools
Administrators may turn to automation to save money, but students are increasingly being marketed AI assistants as tutors to help them learn, and teachers are singing the praises of AI teaching assistants as time savers. It is the Wild West of teachers adopting generative AI systems in the K-16 landscape, and few seem to care whether these tools comply with current regulations governing AI in education or what the long-term effects on relationships will be.
Teachers are using AI tools like Khanmigo to create lesson plans, assignments, and quizzes, and to help them answer questions that students struggle with. While these use cases can certainly help an educator serve the learning needs of their students, what we aren’t seeing is the careful or cautious integration of these tools across academia. A common theme found in social media pitches from influencers selling AI tutoring apps: “Hey, you’re overworked. Here’s a free app that uses AI that will do part of your job for you.”
Students are likewise being sold AI not only as a time saver; influencers are heavily marketing AI to students as a superior stand-in for instruction from human educators. Yes, AI has now crossed into the dizzying political space of school choice and highly polarized arguments about the quality and value education itself provides.
How You View Education Matters In Our AI Era
A sizable number of people view education as an entirely transactional relationship—learning as a means to an end. For this crowd, using AI to replace traditional teaching, or to augment it, is perfectly acceptable, even warranted. They don’t see learning as intrinsically valuable or think about the human relationships at the heart of teaching. Critical thinking, ethical decision-making, and contending with views different from your own aren’t on the agenda of a transactional view of education. Removing any barrier to getting a degree matters most to these folks, as does ensuring a student-consumer likes the content they’re provided. And that is a much deeper problem.
We know that AI tutors can easily be programmed to match a user’s preferences. Don’t like learning from a female teacher? No problem, your personalized tutor will only respond to you as a male. The same goes for your political flavor of human interaction. Never want to hear a conservative or liberal perspective on something? AI can likewise be programmed to give the user their own bespoke politicized interaction.
When you reduce education to a transactional relationship and start treating learning as a commodity, you risk turning education into a customer-service problem for AI to solve instead of a public good for society. Making my education my way might remove certain friction for some students on the path to academic success, but at an immense cost. We’re already siloed by algorithms on social media; do we really want to create silos in education, too?
What Harper argued in his Atlantic essay is coming true more rapidly in education than in other industries, and many seem unaware of the implications agentic AI will have.
The new AI products coming to market are gate-crashing spheres of activity that were previously the sole province of human beings. Responding to these often disturbing developments requires a principled way of disentangling uses of AI that are legitimately beneficial and prosocial from those that threaten to atrophy our life skills and independence. And that requires us to have a clear idea of what makes human beings human in the first place.
Education should be liberatory. This looks like entrapment.