Last week, Jeffrey Young from EdSurge published a podcast episode, "When the Teaching Assistant Is an 'AI' Twin of the Professor," featuring an interview with me in which I pushed back on the emerging trend of educators uploading their own writing to chatbots to create a "digital twin" for students to interact with.
The episode centers on Paul Jurcys, cofounder of Prifina, and his speak-to-ai service. I've written before about what's at stake when we allow synthetic relationships to alter real ones, and what strikes me is how bizarre it is that we're even having this conversation.
It is 2024, and in all my ideas about the future, a machine modeling language was never one of them. Nor did I imagine that becoming a teacher would one day mean being offered the chance to upload my writing and create a digital twin using generative AI. But that's our reality.
Programmable Chatbots Cometh
Yesterday, Blackboard introduced an AI chat interface as a gradable assignment directly into the LMS.
The tool is called AI Conversations. It's a timed interaction in which a student converses with a bot on a specific topic chosen by the professor, then gets space to reflect on what they learned from the chat. The faculty member receives a full transcript of the session.
I can certainly see some students using it to gain more insight into general topics, but it also feels like I’m watching a supervisor pull chat logs from customer service agents and rating how well the human conversed with a machine. This all feels like an increasingly flipped reality, one where our interactions with technology become the point of education.
Our Reality is Changing
How much has our world changed since you've been alive? I'm in my early 40s and recall a time growing up when I could just hang out. The biggest technological distractions were television and the record store. I can't imagine how many hours I spent idling away with friends.
My maternal grandfather was born in 1931, amid the Great Depression. He grew up on a farm in a small community in central Missouri near Cheese Creek, outside of Cole Camp, a place you could hardly call a hamlet next to one of the tiniest towns you've ever seen. They didn't own a car or have electricity. If you wanted water, you had to walk outside, pump the well, and haul a bucket back inside.
What he witnessed in the first 40 years of his life from 1931 to 1971 must have been astounding. He lived through the Second World War, the arrival of the nuclear age, jet-powered air travel, computers, fast food, color television, and the moon landing. I can’t imagine going from a world without indoor plumbing to living a few miles from a Minuteman III missile silo in just forty years, but he did just that.
Technology Cannot Cure Loneliness
What have we witnessed in the past four decades? We've never been more connected by technology, and never more isolated, lonely, and depressed because of it. Our students are struggling more now than before the pandemic to build relationships in the real world, come to class, and simply be present. Gone is the art of hanging out. Yet certain developers are here to present generative tools as mechanisms to improve how we communicate.
Alongside digital replicas, we're increasingly seeing tools developed to help us communicate in real life. Last week, Microsoft launched Speaker Progress to provide students with feedback in real time. It uses audio and visual AI to watch and listen as a student speaks, then uses generative tools to give them feedback. Speaker Progress can advise users on the tone and pitch of their delivery, the style and substance of their words, and even offer suggestions for improving their posture. Do we really want an algorithm telling young people to sit up straight?
My grandfather, your grandfather, everyone's grandfather for that matter, is likely one of the voices we hear in the backs of our heads each time we slouch, reminding us to improve our posture. It's comical, but that's what makes it so human. Having a machine tell you to sit up straight so other people will take your message more seriously leaves me gobsmacked. Why did anyone think that was a good message for a machine to deliver to a human being?
Sure, using Speaker Progress could help a student learn how to deliver a message. Programming a language model with your writing could likewise give students faster answers to their questions. The rub is: why aren't we building things that encourage human connection? Why are we investing in technology that offloads communication and feedback to machines?
We should want students to ask us questions because doing so does more than simply convey information—it helps cultivate awareness, understanding, empathy, and dozens of other human qualities that an algorithm never could. A machine can give you feedback to help you speak, but it should never be a replacement for talking to someone.
The AI Deployment Cycle is Exhausting
You might find this essay rambling and incoherent. Good. That’s because I’m writing it as a human being, not as a language model. And this human is increasingly tired of seeing technology aimed at improving human behaviors with no sense of the consequences.
Many will quibble and say I'm not leaving room for nuance, that using this technology to mimic those skills in the right way can strengthen those human qualities, making us better communicators, more engaged, and more present. Gone are the days of the one-room schoolhouses of our grandfathers' era. We should embrace the tools of the modern world and find new ways to learn, new ways to think, and new ways to use this technology to make us more than we are.
If generative AI remains simply a tool, then I concede this could be entirely an overreaction on my part. But remember, the people developing the current AI models see generative AI as a stepping stone on the path to true artificial general intelligence. The countless billions being poured into development aren't for better copilots or assistants but for breakthroughs toward full-reasoning synthetic intelligence.
The past 21 months since ChatGPT was released should be a wake-up call that we need to prepare for a future we might struggle to comprehend. Part of that calls for establishing baselines for ways we do and don’t want machine assistance in our lives. That needs to happen now, not a few months or years from now. For education, that’s going to involve advocating for maintaining human relationships and giving students opportunities to learn outside of screens.
It's disheartening to see how, in the name of efficiency, many aspects of our lives are being made programmable to maximize their potential. Conversations and learning experiences are reduced to mere exchanges of information and instructions, with no deeper meaning. Human communication, in all its richness, is seen as the messiest way of delivering information—precisely because of how human it is.
What I find most alarming about the development of these kinds of products is that, even though they offer minimal value, they are still cheaper than paying a teacher or tutor to guide a student. The "service" provided by this technology is poor, yet efficient and cost-effective in the most inhumane way possible, making it scalable and deployable in impoverished or disenfranchised areas. This process ultimately turns human connection into a luxury that only a few can afford. Automation for the many, and human connection for the few.
You make a clear and valid point that I wholeheartedly agree with. I find the suggestion of creating my twin, even if it were "only" an ElevenLabs voice twin, repulsive, much less a twin to teach what I teach. That said, I work in a privileged position as a language teacher at a university, with small courses and motivated students whom I can make comfortable enough to speak and explore the English they are learning. But most lectures, and even the seminars, at German universities are attended by hundreds of students at a time. None of them get the chance to regularly ask their lecturers questions or receive feedback on what they might want to say or write, so I can see the attraction of providing a "twin" that would be available to all students. There are no study fees at public universities in Germany, and private universities are already expensive, but only very few have a really good reputation, so buying a tutor is not in the cards (for now). If the AI tutor is subject-oriented and offers students the opportunity to engage with the subject in an additional way, I do think it might actually do some good.
And let's not think that if we oppose this development there will be more teachers. There won't. Here in Germany, nobody even wants to become a (school) teacher anymore. At the moment, university jobs are still quite coveted, but the stress level is really high, too.
To me it seems that we need to take action to channel this new technology in a sustainable and helpful direction. We therefore need studies that address how, where, and to what end AI tutors are used, and that explore what happens when students engage with them. We should make students and administrators aware that AI tutors may be useful like other tools (e.g., a spell checker), but they will never replace a human teacher, because a great part of learning depends on human and social interaction.