From Point Solutions to Transformation: A Roadmap for Funders and Policymakers
Getting beyond the lightbulb stage: A new CRPE brief
I just got back from the SCALE conference in Palo Alto and the NewSchools conference in San Francisco. At both, there was a lot of discussion about how the promise of AI is still unrealized. That conversation kept circling back to the same question: Why? A new brief that my colleague Shira Haderlein and I released today helps explain it.
We set out to document what we thought was a supply problem. We asked: Which AI education tools are missing? What do developers not understand about how schools work? What’s leaving educators without what they need?
We talked to more than 50 people who know this space well: funders, developers, parents, researchers, district leaders, policy experts, and intermediaries. These are all smart, thoughtful people who have been working on this for years. But what they told us implied that we were asking the wrong question to begin with.
Yes, there are some missing tools. But that is not the central problem. The problem is that the field is drowning in tools and starving for vision. One respondent put it bluntly: “We are likely over-tooled and over-solutioned at the moment.” Another said funders are “WAY too focused on tools.” The picture that emerged across interview after interview was of a field running very fast in the wrong direction.
The field is overwhelmed by point solutions.
School districts are swamped with pitches for AI tools that promise to summarize lesson plans, generate quizzes, or flag struggling students. The more sophisticated tools offer customized tutors. While some of these are genuinely useful for solving small, specific problems, they do not add up to effective, broader change.
What’s missing is integration and coherence. We need tools that connect to each other, that pair with a schoolwide professional development plan and instructional strategy, that support a real vision of what learning could look like, and that rethink the fundamental assumptions about how time and talent are deployed within a school.
The tools are not addressing fundamental problems.
Right now, most AI tools aim to make the existing system run slightly better, but the existing system has deep structural problems that AI tools are not generally designed to address. One comment stuck with me: “The problem is that kids don’t like school.” Another: “It’s abundantly clear that young people need more social connectedness, social support, PERIOD.”
Today’s high schools are designed for a world that no longer exists. Teachers are burnt out, underprepared, and unable to develop strong connections with 30-plus students. These are the urgent problems, and AI is barely touching them.
And all the while, the future is barreling toward us. As I wrote earlier this year, agentic AI is already reshaping white-collar work, and the pace is accelerating. Whatever career preparation meant five years ago, it means something different now. Our AI tools are not keeping up with that reality.
Evidence and policy are afterthoughts, not foundations.
The way AI tools currently enter schools should worry everyone. A developer builds a product, finds some early adopters, and scales. At some point, someone asks whether it works. At some point after that, a district or state tries to figure out what the rules should be.
That sequence is backward. The evidence base for AI in education is embarrassingly thin. Developers have very few incentives to collect that evidence, and funders have been too permissive about letting it slide. As a result, states and districts are making billion-dollar bets on tools they cannot yet evaluate.
Meanwhile, teachers are paralyzed. Without clear guidance from districts and states on what they can and cannot do with AI, many are either avoiding it or improvising in ways that may not serve students well. Policy is not keeping up with practice, and practice is not waiting for policy.
The result is a field making it up as it goes, at speed, and students are paying the price.
The people who should be driving AI adoption are not at the table.
This one hit me hardest.
When we asked who developers are consulting when they design their products, the answer was pretty consistent: district administrators, instructional coaches, and school leaders, all of whom operate within a heavily constrained policy environment and a correspondingly narrow sense of what is possible.
They weren’t asking students, parents, teachers, or community members. In short, they weren’t consulting anyone who could tell them what families actually want or need, or who might have a genuine vision for what learning could look like if constraints were lifted.
Developers are optimizing for the existing system, as described by the people most embedded in it. That is how you get tools that automate the current broken model instead of reimagining it. If we had asked people in 1995 what they wanted from the internet, many would have asked for a better encyclopedia. The vision problem in AI education is similar.
The big productivity gains from electricity didn’t come from better lightbulbs. They came when factory owners stopped arranging their equipment around a central steam shaft and started rethinking the entire production floor. The lightbulb was useful, but the reconception of the factory was transformative. Today, AI in education is still in the lightbulb stage. Teachers are being handed flashlights and told to find their way through a dark building. What the field actually needs is someone willing to challenge all of the assumptions about why we’re wandering through the dark in the first place.
So what do we actually need?
Our brief offers specific recommendations.
Funders must attend to the demand side. They should ask not just “What tools should we fund?” but “Who has a genuinely transformative vision for what learning could look like, and how do we help them send different market signals to developers?” Right now, the people with the most constrained visions are setting the agenda.
Districts, states, and funders should better define the problems they are actually trying to solve. Right now, as one respondent put it, “We ask for dreamy, wishy things.” Getting specific about problems is how you get specific solutions. That means smarter procurement and coordination across major urban markets, so developers get clearer signals about what to build toward.
There are high-leverage pain points where AI is underdeployed, and the gains could be significant. These are the places where AI could do something genuinely new, not just faster. Matching teachers in teams based on complementary expertise. Remaking high school schedules from the ground up rather than optimizing the existing ones. Helping families navigate school choice, communicate with schools, and advocate for their kids. Improving student mental health and peer connection. “We could revolutionize special ed with AI,” one respondent told us. Investment should go to these focus areas.
We need codesign, not consultation. Students, parents, and teachers with a real vision for the future need seats at the table when tools are being built, not just a survey afterward.
We need investment in capacity and whole school designs, not just products. Most schools do not have the change management infrastructure or vision for integrated delivery models to use AI well. The limiting factor is not the tool, but the conditions for using it.
And we need evidence and policy to lead, not lag. Require developers to demonstrate effectiveness. Fund research and evidence gathering. Develop methods and measures that make sense. Set the guardrails before the field calcifies further.
A personal note
I have spent most of my career studying what makes education reform work and why it so often doesn’t. The problems we are seeing today are not new. When a promising new tool arrives, the field gets excited, but adoption spreads faster than understanding, and the results disappoint. This doesn’t happen because the new tool is bad. It happens because no one does the hard work of building the conditions for it to succeed.
I am worried we are watching that pattern repeat in real time with AI. Students and adults in our schools are not getting what they need to thrive, and the stakes are rising fast. AI is not the only solution, but it cannot achieve its own potential to reinvent schooling if we keep layering it onto a fundamentally outdated operating system. The field is starting to calcify around products and approaches that are good enough to persist but not good enough to transform. The window is closing.
AI presents a genuine opportunity to address some of education’s most stubborn systemic problems. But that opportunity is quickly passing us by. The market is misaligned, the vision is absent, and the infrastructure to use AI well does not yet exist in most schools. Closing these gaps will require funders willing to invest in capacity rather than just tools, and policymakers willing to lead rather than react.
I hope this brief helps push the conversation somewhere more useful. Read it. Share it. Disagree with it. Tell me what we missed.
Our AI research team will also be hosting a webinar in a few weeks on findings from this brief, as well as CRPE’s other key streams of AI research. Details below.
Notable New Research and Commentary (relating to the points above!)
Too Many AI Tools Without Evidence of Impact
The Case for a Smaller Tech Toolbox | Too Many Tools, Not Enough Impact
Two pieces argued that many districts are managing too many edtech tools with too little evidence of impact and called for a more selective, quality-over-quantity approach to classroom technology.
EdTech Marketing Bombardment of District Leaders
A Chalkbeat investigation found that superintendents are inundated with AI-powered edtech marketing, including unsolicited calendar invites and email messages, raising concerns about the pressure on leaders to adopt new AI tools that do not address their districts’ needs.
Tech Cannot Solve Education’s Structural Problems
Google’s Head of Learning Says AI Can’t Solve Education’s Real Problem: how to spark someone’s motivation to learn—a uniquely human endeavor led by teachers who are burning out and leaving the field.
Thoughts from Michael Horn (via LinkedIn): The last two decades taught us that simply layering technology onto existing classroom models rarely transforms learning. Rather, it amplifies features of a broken system characterized by incoherence, distraction, weak pedagogy, and a lack of focus.
According to an EdWeek commentary, “Will AI save teachers time?” is the wrong question to ask. The question is whether AI can help teachers adapt, differentiate, and engage students and make their jobs feel more manageable overall.
Policy is Not Keeping Up with Practice (Globally)
A World Economic Forum opinion piece argues that global education systems are integrating AI faster than the development of policy frameworks needed to guide them. AI adoption is only meaningful when policy helps to align curriculum, teacher preparation, accountability, safety, and long-term education goals.
Something Silly (But With Serious Implications): Goblins in the Chat
OpenAI discovered its models had developed an unsanctioned obsession with goblins and gremlins and had to add “never talk about goblins” to the system prompt to make them stop. When training the “Nerdy” personality customization, developers had unintentionally given the model “particularly high rewards for metaphors with creatures.” Over time, the goblins multiplied, and the problem got worse. It just goes to show how easily well-intentioned LLM training can produce unintended consequences.
Final Words
“The work in AI space feels piecemeal-y and point-in-time, but if you squint, you can see pieces contributing to a reimagined future.”
from CRPE’s new brief, Getting Beyond the Lightbulb Stage: Why AI Is Not Yet Transforming Education
-RL
Thanks to June Han and Dan Silver for contributions. I sometimes use Claude to smooth and edit my writing. However, all errors and em-dashes are mine alone.