Discussion about this post

Marcus Luther

[1] Love this mindset around normalizing disclosure rather than living in the false dichotomy you name. It is reasonable and it is achievable—and it is also a fair standard to hold educators to (which is my primary concern right now, much more than student usage).

[2] I do think students are in an incredibly precarious position at the moment, though, in having to navigate different expectations and consequences around AI from classroom to classroom (or sometimes within a given classroom). I know educators are doing their best, but the consequences are severe and trust-destroying not just in individual classrooms, but more broadly.

[3] I agree that the need is there for these conversations in our classrooms, but I don't have a ton of faith that most of us (raises hand) have the support and knowledge necessary to facilitate them, especially with a landscape on this topic, in education and beyond, that continues to shift. I have very little faith that ad hoc conversations will substantively move the needle in a positive direction; this needs to be institutional and collaborative and normed, and that's beyond any of our individual classrooms, right?

Annette Vee

Once again, Marc is very observant! AI is here, and while we didn't design this tech, we're living in this world. We have a responsibility to our students to help them discern and reflect on AI's role in their lives and writing. This description of approaching AI among students mirrors mine, and I really like the way you put it:

"This fall, I’ve asked my students to adopt open disclosure if they use an AI tool, reflect on what it offers or hinders their learning, and use restorative practices to try and help them understand that misusing generative AI isn’t about rule-breaking; it impacts the ethical framework of trust and accountability we’re trying to establish as a class."
