From Panic to Reality: How Generative AI Is Actually Impacting Student Writing
The solution to the Great Disengagement isn't a technological one
I won’t bury the lead—many of my students aren’t using ChatGPT or other generative AI tools in numbers or ways that genuinely threaten academic integrity or warrant the panic we’ve been seeing. This is after allowing them to explore the technology in structured assignments and freely within their own writing. It has been two full semesters since I began actively allowing and encouraging students to use generative AI, and close to 80,000 words of student reflections on their use of these tools have offered me a snapshot of how first-year writing students use and view generative AI tools like ChatGPT and GPT-3-powered writing assistants. Most students see some benefit in using them for specific tasks, but they don’t want to use them for most of their writing, nor do they seem eager to explore their possibilities.
Wild, isn’t it? The doomer scenario of scores of students scaling the walls of academia with AI-generated papers in their mouths like knives, ready to lead a coup against academic integrity, isn’t happening, at least not yet. Nor is the much-hyped scenario of students embracing generative AI to save themselves time on drafting complicated assignments and bridge achievement gaps. It’s early days still, and we’re bound to see more students adopt the technology, but their integration of it into their daily practice of writing isn’t inevitable and certainly isn’t going to match the speed of its deployment.
When I first came across GPT-3 in May 2022, I jokingly thought, “well, at least I’ll never again see an assignment that doesn’t meet the word count.” It was a bit of gallows humor, because like many, I took a doomer approach to generative AI’s impact on learning in those early days. Sure, I expected to see essays filled with hallucinated material, invented facts, and coherent, even convincingly worded bullshit, but at the very least I’d see full essays. That hasn’t happened. I’m still getting students submitting incomplete drafts, even after I’ve shown them how to use the technology and given prompt workshops, templates, and dozens of demonstrations. What’s even wilder is that the number of students not completing full drafts has remained pretty consistent, too.
The Great Disengagement is upon us. It would have arrived regardless of Covid, but the pandemic absolutely accelerated it. The vast majority of students are okay. They’re doing the work, they’re talking, they’re learning, but 15-20% have checked out to an alarming degree. Generative AI isn’t a tool that’s going to bring those students back to the table, at least not by itself. Perhaps teachers using generative AI could reach some of those students, but investing in training faculty to use an innovative technology that is so far unproven to help learning is a roll of the dice for traditionally conservative institutions of higher learning.
Student Engagement with AI for Spring 2023
Analysis Assignment: 15%
AI Reading Assistant Assignment: 25%
Brainstorming Synthesis Assignment: 23%
Synthesis Assignment: 25%
Brainstorming Research Questions Assignment: 30%
These percentages represent the share of students who reflected on using AI for each specific assignment in my courses. The vast majority chose not to use it. Many of those who did use AI see benefits and enjoy using the tech, while many others explained why they didn’t want to use AI at all. It’s too new, too alien, too frightening. Some fear using any type of AI assistance, viewing it as a slippery slope into not thinking or losing skills.
Students are grappling with a lot and making thoughtful decisions about their use of generative AI. Most are cautious, far more than I think we give them credit for amid the panic around generative AI. A substantial number of students reflected on AI without using it, i.e., they know it exists and what it can do, but they have yet to adopt it in any meaningful way. There’s a lot to unpack here. I think we all need to take into account some of the following.
Students have 12+ years of human teaching and feedback before college; they aren’t going to just drop that and embrace an algorithm. Students rely on their parents, their peers, and teachers far more than generative AI.
Students are not eager adopters of technology. The notion that students are digital natives isn’t reality for many of them. Just because they engage one another through frictionless experiences on their smartphones doesn’t mean they’re ready to leap into generating content for assignments.
Using ChatGPT and similar tools well takes a good deal of knowledge and practice with prompting. Students need to have some idea of what they want from a generated output before they start, and that takes a significant amount of rhetorical, content, and context knowledge to produce a viable result.
Once you talk to students about the limitations and benefits of using generative AI, a good number of them tap out and say no thanks. Many students don’t want to deal with the specter of wielding a tool that bullshits content. Several students expressed how anxiety-inducing it is to have to carefully read and fact-check a convincing output generated by a supercomputer. That takes work. That takes time.
Saving time is an abstract concept to a young person, and time saved on course work may not mean much to a student if they don’t already have the combination of confidence and skills needed to complete an assignment without AI. Most students prefer to draft organically and not use generated text at the start of the writing process.
Many students fear being labeled “the AI kid” in class, and this brings a certain amount of pressure I never considered. Several students reflected on the fear of being labeled a cheat or dumb by their peers, and a slacker by their professors, for adopting AI.
Adoption may be low now, but I doubt it will remain that way for long. Nearly all of my students believe they’re going to be asked to use some form of AI in their future careers. Students don’t fully understand how or why generative AI deployment is happening, or grasp all the forces at play, but they have a remarkably clear-eyed view of their future labor and fear what their employers will expect from them, or pay for, given what generative AI can do.
Indeed, students get that having certain AI skills will likely make them more employable, savvy, and competitive, but they largely balk at the urgency with which some have called for them to acquire those skills. This makes sense, as we don’t have any clue what those specific skills will be at this point. Given this, what level of guided assistance should educators provide students in acquiring AI-related skills and literacy?
Educators as Advocates
I’ve left AI assistance optional in my courses. I don’t think it is ethical or reasonable to drag students into adopting a tool because some believe it will change the nature of work. We’re still very much in the realm of experimental scholarship of teaching and learning with generative AI. It will likely change what it means to be digitally literate and what it means to work, but we should take care that we’re not creating a self-fulfilling prophecy by forcing students down a path of adopting technology controlled by major tech companies. Educate students, raise awareness, encourage play and exploration, but let’s not mandate usage.
A Call for Applied AI in Teaching and Learning
If generative AI isn’t going to entice those 15-20% of students to engage in their course work, then we need to explore solutions where technology gives us a clearer picture of where those students have disengaged from their learning. Applied AI can help monitor student engagement in ways humans cannot, but it isn’t a one-stop solution to the disengagement problem. We cannot simply implement a system where an algorithm sends out an email informing a student what they need to work on in order to succeed, especially if that student isn’t actively completing assignments, or even opening their email. We need to close the loop and ensure the timely intervention of a real-life human actually talking with the student. AI can augment student intervention in this manner, but we cannot outsource the humanity needed to reach out and make connections with those students in need.
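To make that loop-closing concrete, here is a minimal sketch in Python of the triage logic I’m imagining. Everything in it is hypothetical: the StudentActivity fields and the thresholds stand in for whatever data a real LMS integration would actually provide. The point of the design is simply that the system’s output is a task for a human advisor, not an automated message to a student who has stopped reading email.

```python
from dataclasses import dataclass

# Hypothetical engagement snapshot; a real version would come from an LMS export.
@dataclass
class StudentActivity:
    name: str
    assignments_missed: int
    days_since_last_login: int
    emails_opened_last_30_days: int

def needs_human_outreach(s: StudentActivity) -> bool:
    """Flag students an automated email nudge is unlikely to reach.

    The thresholds here are illustrative guesses, not validated cutoffs.
    """
    checked_out = s.assignments_missed >= 2 or s.days_since_last_login > 14
    unreachable_by_email = s.emails_opened_last_30_days == 0
    return checked_out and unreachable_by_email

def triage(roster: list) -> list:
    # The output is a call list for a human advisor or instructor,
    # not an auto-generated email to the student.
    return [s.name for s in roster if needs_human_outreach(s)]

roster = [
    StudentActivity("A. Student", 3, 21, 0),   # checked out and not reading email
    StudentActivity("B. Student", 0, 2, 12),   # engaged; no intervention needed
]

for name in triage(roster):
    print(f"Schedule an in-person check-in with {name}")
```

The design choice worth noticing is that the algorithm only decides who a human should talk to; the conversation itself stays human.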