Wicked Problems or Wicked Opportunities?
How can we maximize the opportunities AI offers while minimizing the risks?
Wicked Opportunities: Leveraging AI to Transform Education
Happy August! This week my center released what is (I think!) an important new report on AI in education. As I mentioned in my last newsletter, I believe AI will transform education (and the world) whether we like it or not. The question is not if AI will shape education, but how. Will we think of AI tools as presenting a set of wicked problems we have to manage or instead consider the wicked opportunities these tools could give us—especially if AI could help us solve at least some of the challenges that have long plagued public education?
The report was inspired by a forum we hosted a few months ago, where we brought together more than 60 leaders, including state and federal policymakers, edtech innovators, and school and district leaders, to discuss how AI can drive meaningful and positive change in education.
One of my biggest takeaways from the gathering was a point Andrew Maynard made: Generative AI is shifting too quickly and holds too many unknowns for the education sector to plan how to harness it. He urged the U.S. education community not to fixate on the tools and technologies available to us today but instead to develop a clear vision for the future and partner with tech developers to realize that vision.

We focused on the need to think and act proactively to ensure that AI tools and systems promote equity and access, particularly for historically marginalized communities. We saw a critical need to begin involving students, parents, and civil rights activists in discussions about what AI could do to meet their needs and priorities.
We discussed how AI potentially offers wicked opportunities to tackle complex challenges like widening achievement gaps, mental health crises among both educators and students, teacher burnout, and a rapidly changing job market for young people.
We also had a terrific session on policy and politics, where Penny Schwinn urged state education leaders to focus on transparency and accountability for results while being “dogged” about good implementation.
However, this was not a naive group, and there was a clear acknowledgment that achieving such a positive vision will require bold leadership, a coalition of unlikely allies, and a clear understanding of each other’s roles.

The paper that emerged from this forum, therefore, calls for quick and coordinated action along those lines with a pretty cool set of specific ideas. A few of my favorites:
Design a targeted leadership and demonstration strategy, beginning with out-of-system and “edge” cases. This will require connecting advocates and system leaders who represent students with diverse learning needs (including advocates for students with disabilities and multilingual learners, parent advocates from cities with historically underserved populations, rural district leaders, microschool leaders, etc.) with edtech leaders to demonstrate the opportunities AI could create for these students.
Form a collaborative of leading districts, charter schools, technical support organizations, researchers, and foundations. The mission of this collaborative would be to rethink and redesign schools and education systems for a world where generative AI is ubiquitous. Over the next 3-5 years, participants would deeply examine and develop new school models, considering the projected impact of pervasive AI on education and learning environments.
Design an “AI big bets” philanthropic fund that advances a clear vision for AI’s potential to reimagine teaching, learning, and students’ futures, especially for students who urgently need learning acceleration and mental health support. Bring in diverse voices and perspectives to fine-tune the fund’s priorities. Use this fund to motivate and align philanthropic giving in a cohesive direction for the next five years, to document use cases from early adopters, and to create toolkits for next-stage adopters.
There is a LOT of good material in this report, but we don’t see it as the final word on the subject. We hope it is the start of more conversations, ideas, and action agendas. We have entered the vast unknown when it comes to generative AI. No one should pretend to have the answers. Please read the report and send us your thoughts.
Ultimately, our group considered the need for “antifragile” education systems—those that not only withstand disruptions and unknowns but thrive on them. The only way we’ll get there is by dreaming big and working strategically.
In other news:
More Calls for R&D…
Check out this terrific op-ed from Penny Schwinn, former education commissioner of Tennessee (also quoted above), and Carey Wright, state superintendent of schools in Maryland and former Mississippi state superintendent of education. They write:
Students lost significant learning due to COVID, creating an academic gap that may take years to close. Solving this problem requires innovative programs, new platforms and evidence-based approaches. The status quo isn’t sufficient. Education leaders and policymakers need to move with urgency.
Second, America is on the cusp of a new age of technological opportunity. With AI-powered tools like ChatGPT and advances in learning analytics, researchers and developers are just beginning to tap the vast potential these technologies hold for implementing personalized learning, reducing teachers’ administrative responsibilities and improving feedback on student writing. They can even help teachers make sense of education research. Without adequate R&D, however, these technologies may fall short of their potential to help students or – worse – could interfere with learning by perpetuating bias or giving students incorrect information.
But in order to tap this vast potential, the R&D process must be structured around the pressing needs facing schools. Educators, researchers and developers must collaborate to solve real-world classroom problems.
Among other things, Schwinn and Wright call for a National Center for Advanced Development in Education at the Institute of Education Sciences, the research arm of the U.S. Department of Education.
…and an Answer from Congress!
The New Essential Education Discoveries (NEED) Act would “help schools and colleges make advancements in teaching and learning. The legislation would create a national center that advances high-risk, high-reward education research projects, similar to the model employed by the Defense Advanced Research Projects Agency (DARPA).”
ARPA-Ed has been under discussion for years, and my organization has been thinking and writing about federal and philanthropic moves to support innovation for some time now.
Tech Developments
The WSJ reported that OpenAI has had the technology to detect cheating with AI for some time.
And now you can talk to (with?) ChatGPT.
Instructional Essentialism?
A friend sent me this interesting Medium post by Peter Shea about the resistance to AI in education (Shea writes about higher ed, but I think the arguments should be considered at the K-12 level as well). Lots of provocative ideas in the piece. Among them, Shea argues against “instructional essentialism,” the belief that all learners require a skilled and caring human intermediary between themselves and an academic subject, a condition that, in reality, rarely holds in most of today’s classrooms. I’d recommend a read. He makes good points about how AI could actually advance the known science of teaching and learning beyond what we are able to achieve in typical classrooms. I liked this quote at the end:
“AI won’t eliminate the need for teachers—but it can bring about the end of a model of teaching which is antiquated and unsupported by what we have learned over the past 50 years from research into the science of learning. AI does not represent a dark age for teaching—but potentially a golden age of learning.” - Peter Shea, Medium
Final thought
Does anyone NOT find this creepy and worrisome?
Til next time,
-Robin