9 Comments

Thank you! Many faculty members are resisting AI labeling or disclosure, often dismissing it with comments like, “Aww, come on, we’ve all been using Grammarly.” 🤦🏽‍♀️ But you know what will ultimately drive change? Promotion and tenure requirements, along with scholarly communication policies. I conducted an analysis earlier this year and found that most major publishers have already implemented clear AI disclosure requirements. For example, here’s the AI policy from Taylor & Francis: https://taylorandfrancis.com/our-policies/ai-policy/. (Labels are coming!) Additionally, Google announced their AI detection tool just last month (it works better for long-form text than for short social media posts), which means AI use will increasingly be enforceable in journal and book submissions. (That still doesn't mean AI detection is foolproof!)

I also wanted to mention that the Authors Guild is doing fascinating work in this area. All of this gives me hope that, while government regulation of AI may be stalled, the American story of techno-science (the way technology and science are often harnessed by capitalism to drive profit) will still be powered by the human spirit. I remain hopeful about real chances for a more ethical integration of AI, one that balances innovation with accountability and creativity. Thanks again!


Thank you for saying it out loud. My students and I, who are reading "Burning Data" from RESET by Ronald Deibert, talked about this issue "The Day After."

We should be collecting data on AI use, on potential learning loss, and on the things AI is not good for when the user is a learner still acquiring literate discourse, so we can establish a counter-narrative to the commercialized utopianism currently out there.

Christine Ross,

Rochester Institute of Technology


Thanks for this - and generally for your thoughtful work on this issue. Your suggestions make sense - but I'd say only for now and in certain contexts. That is, they might work with students who already have a reasonable level of knowledge and capacity for critical judgement (as one would hope). But look a few years ahead, when students will enter university with an earlier education already shaped by AI, where there may also have been reduced 'friction' and 'desirable difficulty', and where reliance on AI might be quite habitual: what then? As the comment above indicates, the conversation about AI needs to consider the whole span of education and the wider developmental process.


The only option is to engage students. How we do so is the real question. I appreciate the approaches here, and in the K12 space the conversations are similar. Teachers are not only wondering about how and where to allow for AI, but that discourse is leading to bigger conversations about assessment, teaching, and learning.

Though there isn't any uniform agreement (and likely never will be), the fact that AI is leading to broader conversations about education is the real paradigm shift. Part of that shift includes openly talking to students about the tools they are using. At the very least, that moves us toward effective guidelines. At best, though, it builds better relationships.


Well, it also depends on the age of your students. Just noting this. I teach a lot of grade school students in addition to graduate students, and it's not a one-size-fits-all approach. Digital literacy can be taught young, just not in this way. I make this comment because I believe it is with our very young that we need to engender an appreciation of our human capabilities and capacities, and a view of technology not as some shiny, steely, immortal thing but as one of pluses and minuses, biceps and saggy bottoms.


When students begin to study at college, they choose a topic which interests them as well as one they can make later use of. It is not simply a matter of getting qualified by passing examinations. Unfortunately, that is often how things work out: the subject ceases to interest them, and studying something else which does would cost them too much, so they plow on with their lives in a semi-professional way. Only the strong-minded ones make the change and succeed in doing something later in life that they can enjoy as well as use to provide useful ideas or products. If you can begin with a life-long subject of interest and concern, you are luckier than most of us.


Wow. Thank you for this. I kept thinking about the stat from the Microsoft/LinkedIn Work Trend Index showing that the majority of people doing the hiring prefer AI skills over job experience, yet there's a big disconnect when our schools view AI as cheating. Educating students to use AI ethically and to critically evaluate AI's role in their own learning process is the first step toward those 'skills' and toward ensuring a better chance of 'shared prosperity,' as Altman says.


An obviously important issue given due consideration here. (Not sure why it was framed as a government oversight topic, though.) In the classroom, the teacher has direct responsibility, within the restrictions of administrative policies, and I personally see forms of 'cheating' by students through abuse of ChatGPT et alia primarily as a classroom management problem. I fully agree that teachers must first understand this technology well (know thine enemy, if that is how one views it), understand how and why their students use it, and be entirely explicit about the rules concerning its use. A proactive approach is definitely best. We are educators, not police.


Great to read this article. I was talking with my friend, a college professor who teaches journalism and media, and she described the struggles AI has created in education. Discussions about ethics and cheating have been met with rolled eyes; the emphasis is on using the tools and the positive sides of AI.

I mentioned to her that there is a lot of discussion (at least in the tech world) about soft skills, and how those skills matter more in getting opportunities than some of the technical skills you have. I would hope that the students who don't see ethics or cheating as a concern will be enlightened when they get out into the real world and start looking for jobs.
