Appreciate the shout-out and the questions here! The challenge is getting an audience with those making the decisions as well as selling the systems, but these are important practical questions. Regardless of your feelings about AI use in education, I think many institutions lack accountability and audit systems.
As long as "free" and reasonably powerful versions of AI tools exist, getting students to use AI in class through a sanctioned wrapper is, at best, an illusion of control. Ironically, Marc, where I've seen it work best is with younger students, who are more inclined to follow the rules and don't yet really know how to use LLMs. My own 6th-grade daughter worked on a thoughtfully designed exercise through Flint.AI that walked her through writing a short "essay," scaffolded with questions, examples, and analogies. As a teacher and father watching her in real time and observing both the questions and the final output, I saw her produce a short piece of writing that she was really proud of. When I asked if she thought it helped her, she said it did, and she was engaged and productive for about 30 minutes while working on it. I honestly don't know how to feel about all this!
I have a few responses to this interesting piece. First, I appreciate the work you are doing and the video from Norton. This is very important to discuss.
I think vendors on campus are one major issue. When you talk about "buying AI for your campus," you are often really talking about engaging with vendors, and they don't know what they are talking about. Buying "AI" is just hype, and we as academics need to see through the hype. We need to be "tool agnostic" and understand the entire field before just listening to a sales pitch.
But to say our students will just use ChatGPT, based on a study from more than a year ago, because that is all they have tried misses the entire point of education. The surveys assume even knowledge of how to use AI, what it does, what it can do, and a myriad of other points, without asking whether anyone was trained in its use. Three weeks ago I polled my students and our faculty on their actual use, knowledge, and perceptions of AI at our university. This is local data, not national stats, and it is the data I trust. When you or I train people on how to use AI and what it actually is, their understanding changes drastically. That is as true for college students as it is for high school students or their parents.
Back to your core question: whether your institution needs to buy AI gets back to the democratization of knowledge. Students can have Gemini Pro for free, but faculty cannot. The difference between a paid account and a free account is staggering on any platform. Claude by Anthropic is the biggest game changer. Between Claude Code from Anthropic and Codex from OpenAI, these tools really shift what can be done with AI. Our administrators don't get that. It is only through training that people can make wise decisions, not through complaining about reps on campus.
Marc, really enjoyed the piece and completely agree that universities need to ask both themselves AND these AI companies the "why" before making any deals. And if the company says "I didn't know it could do that," that's a great reason to pause everything.
One other note, it's "Gonzaga University," not the "University of Gonzaga," just FYI.
The product I'd like to see us purchase would protect student data from agentic AI. Of course, it would inevitably derive from the models that created the threat; when we start to evaluate alternate products, we would be in danger of what Anthropic euphemistically calls agentic misalignment (robotic extortion), and we'll lose the arms race when AI starts upgrading itself: https://cset.georgetown.edu/publication/when-ai-builds-ai/...
The only upside I see to having human AI vendors on campus: at least we can see them coming (so they'll be less invasive than their products).
I appreciate the correction! I'll update.