There is a growing divide between those who use AI and those who do not. More subtly, K12 education may be splintering from higher education along these lines. The Walton Foundation, through Gallup, conducted a national survey of over 2,000 K12 teachers about their AI usage, and the results are eye-opening: “Teachers who use AI tools at least weekly save an average of 5.9 hours per week — amounting to six weeks over the course of the school year.” The report refers to this as an AI dividend and frames it as a net positive for overworked teachers: here is a technology that gives time back to people, making their jobs more efficient and their tasks less burdensome.
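That "six weeks" figure is worth a quick sanity check. Here is a back-of-envelope sketch in Python; the 36-week school year and 40-hour workweek are my assumptions for the conversion, not figures taken from the report:

```python
# Back-of-envelope check on the report's "six weeks" framing.
# Assumptions (mine, not the report's): a 36-week school year
# and a 40-hour workweek used to convert saved hours into "weeks".
HOURS_SAVED_PER_WEEK = 5.9
SCHOOL_YEAR_WEEKS = 36
WORKWEEK_HOURS = 40

total_hours_saved = HOURS_SAVED_PER_WEEK * SCHOOL_YEAR_WEEKS
weeks_equivalent = total_hours_saved / WORKWEEK_HOURS

print(f"{total_hours_saved:.1f} hours ≈ {weeks_equivalent:.1f} workweeks")
```

Under these assumptions the savings come to roughly 5.3 workweeks; the figure only rounds up to six if you assume a longer school year or a shorter workweek. The headline number, in other words, depends heavily on what you decide a "week" of teacher time means.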
But I’m not rejoicing. In fact, I’m extremely worried about what this news means for labor throughout education. When you embrace a technology that automates tasks to save time, you allow AI to reshape your working conditions. This opens the door to further automation and the possible devaluation of your skills. Few industries will let an employee gain six weeks of time without responding, either by expanding the tasks that employee is expected to perform or by cutting their wages.
There’s something rhetorically off-putting about the language used throughout this report that feels at odds with the subject of education. It’s transactional in nature. The words “returns,” “savings,” “gains,” and “impact” appear repeatedly throughout the report. Even the title Unlocking Six Weeks a Year With AI sounds like a technology ad promising an amazing return on investment, not an educational report about pedagogy or student growth.
I’m in no way opposed to using AI, but shouldn’t the goal be more than simply saving us time on one task on the assumption that we will use that ‘dividend’ to complete another or do more meaningful work? When you teach, you’re exhausted at the end of each day and try to carve out what little space you have to catch your breath before it all starts over again. The idea that teachers would put more effort into their jobs because they saved some time elsewhere is disconnected from the lived experience of many teachers I know.
When we get into values-based arguments about what should or should not be automated in our jobs, we invite others to chime in with their opinions as well. For public school teachers, those others include state legislative bodies that provide funding. My fear is our politicized environment rife with DOGEified rhetoric about efficiency will take this news and ask “How do we ensure we aren’t paying for those six weeks?”
An AI Dividend Reveals a Deep Divide
What is mind-blowing about the report is the disconnect in how the general public views AI usage among faculty in higher education versus teachers in K12. Kashmir Hill’s recent story The Professors Are Using ChatGPT, and Some Students Aren’t Happy About It ushered in a huge backlash against contingent college faculty members for using AI. The general public clearly thinks college-level faculty shouldn’t be using these tools with their students.
If college faculty claimed their use of AI saved them six weeks’ worth of work a year, the internet would drag them endlessly. K12 teachers get a pass here, at least for the moment. There’s an odd silence in the press about how vast generative AI’s overall impact on K12 education has been, and I’m not sure the general public will view teachers in the same light as professors. For one, there’s an assumption that college faculty are uniformly well paid compared to school teachers. They aren’t. Nearly 70% of faculty who teach in higher education hold contingent appointments. If you’re lucky, you might make $3,000 or $4,000 per class. Don’t expect a full load of courses, either.
The silence around K12 teachers’ AI use won’t last forever. Eventually, the same scrutiny that college faculty face will reach elementary, middle school, and high school classrooms. But by then, the efficiency narrative will be so entrenched that questioning AI use will seem like questioning progress itself. Teachers have a narrow window to shape this conversation while they still can — before the conversation shapes them.
On that note, I think it is worthwhile to look at Anthropic’s recently announced Economic Futures Program and ask whether there is merit in exploring ways to curb AI’s impact on labor, and what it means to work when a machine can automate many of the tasks you do each day. Anthropic is one of the few AI companies talking openly about the effects they expect their technology to have on the economy. Educators should have a seat at these discussions because our sector is on the front lines of all things generative AI.
AI Should Be More Than a Time Saver
If you are a teacher reading this, you might be wondering what’s going on, because you haven’t felt like you’re saving six hours of work each week, even if you’ve dabbled in AI. That may be because of how the Walton Foundation calculated time on tasks. The report places some pretty generous numbers on things like making worksheets or modifying assignments versus more time-intensive tasks like providing detailed feedback on student assignments.
Still, the report clearly identifies frequent AI users as feeling they’ve gained time. Per the report, that group is only around 30% of teachers, while around 60% of the K12 educators who responded indicated they used AI at least monthly to save some time. Meanwhile, roughly 40% of teachers haven’t used AI at all. That’s quite a divide. When three out of ten of your employees are using a tool that saves them nearly a full day’s work each week, the remaining employees are likely to raise questions about fairness, pay equity, and what this might mean for current or future contract negotiations.
What teachers do with the time they save isn’t exactly clear from the report. We get only this bit of further evidence:
Qualitative data from the survey show that teachers use the time they save with AI on things like providing more nuanced student feedback, creating individualized lessons, writing emails to parents, and getting home to their families at a more reasonable time.
I don’t see the same population of super users who reported using AI to create lesson plans then spending the time saved to… create more individualized lessons. Outside of getting home to their families sooner, there appears to be a significant disconnect between how the report imagines teachers reinvesting automated time and how people actually treat automation in their work.
I’m an avid woodworker, but I don’t use my power tools so that I can spend more time on a project with hand tools. I use them to get things done quickly and finish a task. I never think to myself, gee, using that tool freed me up to spend more time on the next project. That’s why I’m deeply skeptical that any educator will use time from an AI dividend to “reinvest in their classroom.” Teaching isn’t day trading and classrooms aren’t an investment portfolio.
Questions to Ask Before Automating a Task
What we need right now is a principled take on where the line might be in using AI—a reality check that tasks we wouldn’t have let someone else do for us before AI perhaps shouldn’t be done by an algorithm just because it is more convenient.
Here’s a simple start. Before using generative AI in your classroom or teaching practice, pause and ask:
Will this AI use change the authenticity of my relationship with students or parents?
Am I using this to save time or because I believe it genuinely improves student outcomes?
Could this create grounds for my district to increase my workload or justify job cuts?
What skills or knowledge might atrophy if I automate this task?
I don’t think any of these questions are easy to answer. I imagine each of us would answer them differently depending on the circumstances, and those answers wouldn’t be set in stone. Good. That means you are actively using your agency to make those choices based on the full spectrum of human judgment. As Weizenbaum said, deciding is a computational activity, something that can ultimately be programmed; choice, however, is the product of judgment, not calculation.
Using AI Should Be Intentional
Something profound is happening right now and so few people are paying attention to it. Generative AI is beginning to reshape our habits. Despite steady derision (people dismissing outputs as AI slop, gleeful posting on social media when an AI model gets something wrong) and despite clear ethical and environmental challenges, people are finding use cases for the technology in their daily lives that they find meaningful to their situation. Labor is one of them.
I’m not judging teachers for using AI to reclaim some part of their daily life from a challenging job. But would the same be true if the story were about law enforcement using AI to automate police reports, doctors using AI to draft patient histories, judges or lawyers using the technology to argue cases, or the person who matches with you on a dating app and uses a chatbot to talk with you instead of their own words? None of these are hypothetical what-ifs. They’re happening right now, and we’re only hearing about them in bits and pieces as stories pop up in the press.
Calling time savings an “AI dividend” reveals everything wrong with how we’re thinking about this technology. Dividends go to shareholders, not workers. Real gains for teachers would look like smaller class sizes, better support, or actual schedule flexibility — not just more efficient ways to complete the same overwhelming tasks that form unrelenting workloads. Until we can tell the difference, we’re optimizing for efficiency in a system that was never designed to serve teachers’ needs.
I think you're a little late to the party here in some respects - of course, many professionals have been using AI for all the things you mention for a while (and what tends to go unremarked by most of us in the humanities is the way in which AI has significantly impacted coding tasks and jobs) and many, many more. Why should teachers be any different? But it's a relatively small number. I don't see it so much as time saved as increased productivity - an adept user of AI can accomplish much more if they are willing. The K-12 space requires much more supervision than on college campuses. We also have generally more contact with our students and, ideally, have a little more of an impact on more impressionable minds. But, Marc, who is going to set the rules? Administrators? Most know less than the teachers and the students. Most districts I am aware of are moving towards AI and not away from it. They feel a need to engage. Not sure how many average folks are aware of the vigorous debate happening in the AI space currently.
I like this post. I like it because we actually should be alarmed if the big-tech media machine is pushing the benefits of automating various aspects of teaching and instruction without scrutiny, so I appreciate the scrutiny. But despite my involvement as a labor activist, I think that there are better arguments for being alarmed than protecting instructional jobs and ensuring humane working conditions for instructors, goals I take very seriously in their own right. We should also be concerned because automation is not good for students and is not going to work in the long term.
The survey responses that concern me the most are about time saved on: grading or giving feedback, modifying materials to meet student needs, and providing one-on-one instruction or tutoring. To the degree that teachers are saving time on grading and feedback… I get it, this is time-consuming, sometimes laborious work, so it’s tempting to offload/streamline that aspect of the job. In fact, many of us cut and paste routine feedback, but the fact that we actively read our students’ work first is profoundly important. Student work often includes important personal details and revelations. For example, imagine a student's essay about struggling with suicidal ideation. To simply correct the grammar and reorganize the sentence structure would be incredibly irresponsible. And, while that is an extreme example, there are less extreme examples that also matter, such as appreciating a student’s humor. Even though an instructor may cut and paste many comments, it’s extremely important that we witness our students’ work and adjust as needed. Without that, with mere automation, the comments lose value and the whole educational project is in jeopardy.