Many in Silicon Valley are obsessed with sci-fi warnings of Terminator-style AI destroying humankind. X-risk has become the shiny new toy that's distracted attention from current problems. Over the past week, several members of OpenAI's board were seemingly willing to burn down their company because they believed that Sam Altman was moving too fast toward building actual AI, not just a chatbot. It's like forecasting that your toddler could someday join the NBA and pouring your time and money into making that happen instead of teaching them to tie their shoes. Society is facing issues right now stemming from current AI models that are already more capable than any programs in history. But few appear to be bothered with the educational equivalents of shoe-tying and potty training when they've got their sights set on playing god and building a true thinking machine.
Nov 25, 2023 · edited Nov 25, 2023 · Liked by Marc Watkins
This is a very thought-provoking article, Marc. I think the reason this is happening is that it feels empowering. Visualizing a better tomorrow creates an illusion of predictability and control amidst the uncertainty of life. Research shows perceived control reduces stress and anxiety. Humans fantasize to get an artificial feeling of influence over uncontrollable events. And as Morgan Housel once said, "People don't want the truth. They want to reduce the uncertainty that is in their heads."
To solve this pressing issue, in my opinion, we need to start rethinking everything we know about education today, because AI's influence will only continue changing the landscape. We need to rethink assessments, content, delivery, everything. This undertaking requires open-minded discussion that questions assumptions about everything from content divisions to evaluation models in light of AI's influence. Stakeholders from all around the world need to give education a new meaning as we move forward, one that aligns with a world shaped by AI.