I agree with all of this. There are efforts to promote "AI literacy," but most (all?) of them frame it as "how to get the most out of this cool new thing!" rather than "let's think critically about whether and how to use this tool at all." I'm working on starting an organization devoted to the latter; let me know if you'd be interested in chatting more about this.
The problem, as I see it from the vantage point of a classroom teacher, is that administrators are so overwhelmed with post-pandemic issues that AI literacy is nowhere near the top of their priorities. The majority of decision makers who oversee curriculum and distribute funds are themselves already far behind the curve. Consequently, schools will be at the mercy of edtech companies, the driving force pitching AI "solutions" to problems schools may not even know they have. On the other hand, schools are so notoriously poor at using technology to innovate their teaching practices (as opposed to simply replicating traditional teaching methods with technology - big difference) that hopefully most will think twice before allocating scarce resources to unproven AI products. But I do think that as AI continues to improve and it becomes more difficult for schools to insulate themselves from its implications, educators will need to emphatically and consistently remind technocrats to evaluate what is gained and lost by outsourcing teaching and learning to AI. As with every other disruptive technology that upends the job market, workers must make a persuasive case for their value - to parents, politicians, policy makers, and yes, students. I doubt teachers will be the exception. The temptation to do more with less and cut costs has almost always been the ultimate arbiter of decision-making.
It's certainly a fear I have. We do seem to be on the receiving end of these tools far more than other industries. I agree that we move at a snail's pace, and that's likely going to be the greatest threat to our ability to respond.