From Government to the Classroom: What the DenAI Summit Taught Me About AI in Education
- Sarah Levy

- Oct 16
- 3 min read

As someone who works with Jewish day schools and educational organizations to help them design intentional, human-centered systems for teaching and learning, I spend a lot of time thinking about how innovation meets reality.
So when I attended the DenAI Summit -- a gathering largely focused on how governments and the social sector can use artificial intelligence to improve lives -- I went with one question in mind:
What can schools learn from the way other sectors are approaching AI?
The conversations may have centered on public policy, procurement, and governance, but again and again, I found myself thinking: this is exactly the conversation education needs to be having.
Disruption, Opportunity, and the Human Element
AI is the biggest disruptor of our time. That’s not hyperbole; it’s a recognition that we suddenly have tools capable of transforming how we tackle long-standing challenges. But disruption always comes with discomfort.
One speaker said, “Disruption can be painful, and it can create opportunity.” In education, that line hit hard. We know this from every curriculum shift, every new assessment model, every “initiative of the year.” The question isn’t whether disruption will happen; it’s how we choose to navigate it.
We can either cling to old processes and miss transformative opportunities, or we can see AI as a way to turbo-charge the progress we’ve been too busy (or too constrained) to pursue.
The key, though, is remembering what the summit speakers emphasized again and again: AI must augment, not replace, the human.
Teachers, leaders, and students are not cogs in a machine. The magic of learning happens in the messy, relational, creative space that no algorithm can replicate. The goal is not to automate empathy or curiosity, but to create more space for it.
Ethics, Empathy, and Agency
One panelist noted that a lack of empathy is a public health issue. That sentence has stayed with me.
When we think about AI literacy in schools, it’s easy to focus on the technical -- how to prompt, detect bias, or cite AI-generated work. But just as critical is teaching students how to use AI with empathy and agency.
We need to help students recognize bias, both in AI and in themselves. We need to teach them to question, to reframe, and to design with compassion. Because ethics isn’t the thing that slows innovation down; it’s what optimizes it.
If we do this right, AI literacy becomes not just a tech skill but a character skill. It becomes a new form of preparedness, one that combines resilience, reflection, and responsibility.
Lessons from the Public Sector
I was struck by how much of the government conversation mirrored what I hear in schools:
“Start with the problem, not the tool.”
“Build your people’s capacity, not just your tech stack.”
“Align internally before you implement externally.”
Sound familiar?
Schools, like governments, are complex systems. When we rush to adopt new tools before clearly defining the problem we’re solving -- or before aligning our team’s mindset -- we set ourselves up for friction, confusion, and fatigue.
The public sector is learning to approach AI through use cases that meet real needs. Education should do the same. The goal isn’t to “use AI” for the sake of using AI; it’s to make teaching more joyful, learning more personalized, and schools more effective.
Hope, Humility, and the Leapfrog Moment
Governor Jared Polis closed the summit by saying, “There’s no playbook yet for what successful AI transformation looks like.” That uncertainty can feel daunting, but it’s also liberating.
We have a chance to leapfrog years of substandard systems, to make our schools more effective, efficient, and human-centered.
The challenge is to approach this leap with both hope and humility -- hope that AI can help us meet the needs of students we haven’t yet been able to reach, and humility to recognize that we won’t always get it right on the first try.
If disruption is inevitable, let’s make it purposeful. Let’s make it human. And let’s make it about learning.
If you could redesign one small process in your school using AI -- not to replace the human touch, but to strengthen it -- what would it be?



