It is worthwhile spending some time on social media to see the relentless bombardment faced by students from influencers peddling AI tools that straddle the line between aiding study and blatantly enabling cheating. These influencers flourish is contradictions and promise the moon: complete your homework in five minutes flat, forget about ever attending another lecture, let AI take up the pen for you. In each pitch, the essence of learning is overshadowed by a pervasive call to save time. Welcome to the dizzying world of AI influencer culture, where the pursuit of profit drives companies to use influencers as direct conduits to push their products onto students.
Most of these ads use attractive, white female social media influencers who might be the age of an undergraduate student to gain the attention of users scrolling quickly on TikTok and Instagram. What fascinates me is that a few of the ads I've seen talk about ChatGPT. Most are third-party apps that use OpenAI's API to build their product, and only a few are directly geared toward writing. Many are using augmented speech recognition to automate lectures, like Turbolearn, retrieval augmented generation to automate reading, like Unriddle.AI, and what looks like a version of GPT-Vision to take pictures of math and science problems with Answers. AI.
Equally unnerving is the discourse that is emerging among students about the use of these features for their studies in the context of academic integrity. Namely, many don't see offloading coursework as cheating but as a new way to learn. Indeed, in some of these ads and in many of the comments, we see students offering familiar complaints—I had to use this app to help me in classes because my teacher was so terrible at teaching that I couldn't learn!
Enable 3rd party cookies or use another browser
I don't know what's more absurd within the ad—the influencer claiming they turned to the AI to help with school work after a concussion or blaming her teacher for not being able to teach. I hope Ashley recovers from her concussion shortly and is able to resume her coursework, but it looks like it hasn't impacted her ability to produce content and promote it online to her audience.
Teachers Do It Too
Let's be clear about something up front, it isn't only students making questionable decisions about how they're engaging AI in their studies. Out of frustration, many educators have turned to novel ways to try and catch students' generative AI use and are eager to share this on social media. Notably, one of these is prompt poisoning, or as described in the video below, "placing a trojan horse in your essay prompt to catch AI cheating."
Please don't do this. Putting a trojan horse in an essay prompt or trying to hide material to later catch a student is a form of deceptive assessment, and it is not a practice I condone, nor will many academic integrity officers. Not only is it unethical, but there are students who use immersive readers and text-to-speech software that this hidden trojan horse won't work on. Doing so will only serve to sow confusion in the classroom and further erode the relationship between teacher and student.
I can tell you from my student days that I often copied the prompt onto my word.doc and used it to guide me. I would change the font when formatting my entire essay. It would horrify me to find my professor clandestinely inserting some gibberish into an essay assignment guidelines to try and catch me cheating with AI. If I were 18 and discovered it, I likely wouldn’t know any better, and go ahead and include the words in my response. To be sure, people are becoming increasingly frustrated, but this isn't the path we want to chart.
Never Listen to a Lecture Again
The lack of responsible messaging around many of these ads is alarming but predictable. Instead of talking about the potential benefits that using an augmented speech recognition system to record a lecture might have for a student with differentiated learning needs, disabilities, or helping users process and put into context often overwhelming amounts of information, we instead get the lazy sales pitch:
If you grew up in the 1980s, you likely saw Val Kilmer in Real Genius do some pretty futuristic things with lasers and popcorn, but you may not recall the absurdity the tape recorder montage scene captured. Even back in 1985, the idea of recording a lecture and checking out wasn't a new idea.
What underscores both Turbolearn's ad and Real Genesis's montage is a feeling I've witnessed in many students about how little care or attention they feel their professors gives them during a lecture. From a student's perspective, their job is to sit passively like sponges and absorb a torrent of information, often rushed and sometimes even incoherently given. Why not offload that to AI and have the system automatically transcribe the lecture, summarize it, and even synthesize it?
It's increasingly evident that when traditional educational practices—like lecturing, reading, writing, or evaluating—fall short in effectively serving students, developers will identify opportunities to apply generative tools. These tools are likely to attract an enthusiastic audience ready for adoption. Here's a montage of social media ads with a sampling of some of these use cases.
Get all Your Home Work Done in Five Minutes
If My Professor is Watching. . . This is a Joke
Let AI Read for You
Enable 3rd party cookies or use another browser
How to Defeat AI Detection
Using Snapchat’s AI to Cheat
Enable 3rd party cookies or use another browser
What Does Being in College Mean?
Many educators decided to move assessments to in-person exams to protect them from generative AI, and it's clear that isn't going to be an effective long-term solution since OpenAI gave developers the opportunity to build third-party apps using multimodal features. Transformers are, by definition, models of language. There are few questions such models cannot effectively synthesize given proper instruction. The idea of AI-proofing learning, let alone assessments, is increasingly becoming untenable.
The pain point here is students don't view the work as core to their learning—simply a boring task to complete before being handed a degree. Assigning assignments in LMSs that appear as a dazzling series of to-lists to mark off before doing something that isn’t sitting in front of a screen. That's why these hooks of "here's a quick tip" or "how to hack reading" resonate so well with students.
We need to do a better job of advocating for learning. Much of the marketing in higher education is geared toward selling college as a lifestyle—multimillion-dollar dorms, gyms that rival professional sports teams training complexes, dining options galore, along with a myriad of social scenes, clubs, and Greek life. Learning is often mentioned in many of these marketing ads as filler or background noise.
There's no mechanism to effectively stop students or professors or anyone from offloading what was, until very recently, human tasks to an AI. It's time we shift from looking at assignments and assessments and start thinking about the core reasons why we teach and the core reasons we hope students arrive on our campuses ready and eager to learn.
"I've not got an hour to watch this video" - he's far too busy making TikToks
Lots to unpack here. From my very specific vantage point as an independent HS teacher, we are not seeing this (yet), at least not in my school or most of our peer schools. I'm going to go out on a limb and say some of this is the (understandable) fault of colleges for not tackling the issue head on - the absolute crickets at most educational institutions on AI use provides a gigantic hole for students to crash through - how are individual teachers supposed to completely upend decades of educational practice overnight in the absence of any guidance by administrators? The Stanford study (https://www.nytimes.com/2023/12/13/technology/chatbot-cheating-schools-students.html) from earlier this year seemed to suggest that cheating has not changed much as a result of AI. Time will tell how accurate that is and whether the trend holds true. But I think the larger issue is the glut of AI products saturating the marketplace making it impossible for the average educator, let alone administrator, to figure out which ones might be worthwhile. Enterprising and resourceful students are going to be in the driver's seat until schools step up and have an honest conversation about all the pedagogical implications of genAI and I'm just not seeing it.