Agreed. As an adjunct, I'm not required to attend regular PD sessions...which is baffling in any era. I've pushed myself to attend AI-related PD sessions and to follow folks like you, but I also would like to be fairly compensated for the energy I invest in developing new materials related to AI. At some universities and community colleges, adjuncts occupy a large share of assigned positions and are neither required nor paid to attend meetings or participate in these kinds of discussions. How this can be considered professional is beyond me... Thanks for sharing your thoughts here.
I think this is such an important point. Adjuncts have been on the front lines of navigating the challenges of AI in the classroom from the outset, and it's unbelievable that they get so little compensation for continuing professional development. Relying on (or exploiting) adjuncts' dedication to students seems to be the business model. (Obviously this is an industry-wide issue; I love the post itself.)
One of the concerns I have right now is that even if you magically waved a wand and every teacher and professor out there committed to addressing Critical AI Literacy or AI Fluency in their classroom, two things would happen: [a] there would be overlap, with many students encountering roughly the same thing in different classes, and [b] there would also be contradiction, with different classrooms pushing students in different directions around AI norms—a contradiction made worse, perhaps, by the ad hoc investment teachers are making right now in "what is right" for their classrooms, which is difficult to walk back.
I don't know what the solution is, particularly given the fast-evolving nature of AI, but I'm quite skeptical that everyone charting their own path for their courses is a sustainable, equitable solution for students.
I totally agree! The challenge is that the norm in higher education is to let faculty decide which tools and resources are appropriate, not to have admin or any other organization dictate that. There are a few outliers, like IT security awareness training and standardized security practices, but that's mostly it.
Marc, thank you for these resources! One quick but important note: the SIFT AI tool only works on very specific versions of ChatGPT and Claude. I used it on a paid ChatGPT plan, but not with the o3 model, and the results were full of hallucinations and, well, just really poor in general. I haven't been able to try the tool on one of the recommended genAI versions, since they're not available to me through my institution (as is the case, I suspect, for most instructors). Mike Caulfield mentioned that I was apparently far from the only one to make this mistake, so it's worth emphasizing how much the specific genAI model you're using matters.
I appreciate the clarification, Elina. I've been using Mike's SIFT tool kit in Gemini as a Custom Gem with no issues, but I am on the pro plan so that might be something to consider. https://g.co/gemini/share/3ad50c7a8bc4
Hi Marc. Thank you for this contribution. Your definitions of AI literacy and AI fluency and the clear, concise statements you wrote for each have resonated with me. May I have your permission to adapt the definitions and statements with attribution to you? I'm thinking they may be useful in starting a conversation about a simple way to measure progress towards AI literacy and AI fluency among faculty and staff.
You can adapt it, Michelle.
Thank you, Marc.
Thanks so much for focusing our attention on the time it takes to integrate AI fluency and Critical Literacy strategies, Marc. On top of the class time you mention, there's also the time and energy for preparing new assessments and activities. I hear this question you mention from faculty a lot: "How am I supposed to find the time or resources to simply keep up to date about AI to understand its potential impact on my teaching?" And, on top of that is the fact that faculty are still exhausted from reworking their classes to be online during the pandemic--usually without compensation. I see very few schools compensating faculty for the time it takes to design AI-aware courses. Wouldn't it be great if they got paid for that time??
I've been waiting for a post about lack of time, but a couple of points off the top of my head: one, the notion that teachers will have the kind of time to conduct every writing assignment in blue books and in class seems impossible; two, have we also forgotten about the amount of time teachers spend chasing down AI plagiarism through detection, process tracking, or other burdensome administrative requirements?; and three, I don't see how 15 minutes a week of "discrete" AI literacy/fluency is going to cut it. AI awareness needs to be built into the fabric of any class where serious reading, writing, and knowledge production is done; the recent piece in The Chronicle of Higher Ed summarizes just how much student AI use is happening, and for so many different things. Putting AI to the side reminds me of all the DEI initiatives we've seen over the past 5 years: it sends a message that it's something to be "dealt" with until we get back to the "real" work of teaching, rather than the ongoing crisis it will continue to be. I get that teachers are overwhelmed, but this is the new reality.
What if the alternative to current assessment practices was not "alternative assessments" on any number of tracks, but no assessments in the traditional sense - by which I mean no assessments that lead to a grade somewhere down the line? Radical? Yes, but actually possible. It also ties directly to the time issues you mentioned. In most discussions about "not enough time to add in X, Y or Z," we are talking about teacher time. That's quite different from the time it takes students to reach specified learning goals, because that's never the same as the arbitrary time that teachers set aside for it. If the time a teacher carves out for a concept were the same as the time it takes each individual student to fully grasp the concept, then there'd be no bell curve. It's not about finding more time -- it's about completely rethinking our notions of the time needed for learning for each learner. If we're not ultimately committed to learning for every student, then all the time in the world won't matter anyway.
I'm pretty sure the resisters aren't virtue signaling; they want to continue teaching their subjects as they believe they should be taught, and they expect the institution to make that possible. There have always been tensions over conduct, standards, workload, and pay - but this strikes at the heart of the enterprise. If the college only admitted students who were willing and able to learn the material the way some want to teach it, we'd go out of business. Technology *may* eventually be able to adequately secure the online learning environment against AI influence without destroying the teacher-student relationship, but it can't do that now.
The best institutions can do is to be transparent, collaborative, and supportive. Faculty cannot be rallied, inspired, cajoled, or required to change - just helped. Recruit and reward the willing. Constantly work to raise awareness/literacy. Find parts of the curriculum that can be adjusted with minimal blowback. Work for structural change beyond the individual institution. And be kind - this isn't what (almost) any of us signed up for; it is just where we find ourselves.
Incredibly helpful—thanks!