This is a thought experiment, not a prediction—but it’s a clarifying one.
What happens to AI and money if a solar flare knocks out the grid and electricity doesn’t come back?
Not temporarily. Not a few weeks of disruption.
Just… done.
If you follow that question honestly, a lot of modern certainty quietly unravels.
AI Doesn’t Adapt. It Disappears.
The first thing to say—because it matters—is that AI doesn’t struggle in that world. It doesn’t adapt. It doesn’t limp along.
It vanishes.
AI is not stored intelligence. It’s not wisdom sitting patiently on a shelf. It’s a live process—a continuous performance that only exists while electricity flows, servers hum, cooling systems run, and networks synchronize.
Take away the current, and what's left isn't a dormant mind; it's inert matter. Silicon fossils.
Which is uncomfortable, because we talk about AI as if it’s durable. As if it’s something we’re “creating” rather than something we’re constantly propping up.
The lights go out, and AI disappears faster than most social institutions. Faster than markets. Faster than governments. Probably faster than people expect.
That alone should tell us something.
When the Ledger Stops Updating, Money Becomes a Story.
Money doesn’t disappear as quickly—but it fractures.
Digital money is the first to go. Bank balances, payment apps, crypto wallets, market prices—all of it depends on continuous verification. Ledgers that don’t update stop being ledgers. Numbers that can’t be checked stop being numbers and start being stories.
And stories without shared enforcement don’t function as money.
What replaces it isn’t chaos right away. That’s a myth we tell ourselves. Historically, what comes first is reversion.
People fall back to what works without infrastructure.
Food. Water. Medicine. Fuel. Shelter. Skills.
If you can fix things, grow things, heal people, organize labor, or keep others safe, you are suddenly very wealthy—regardless of what your bank account used to say.
Value becomes embodied again.
After that comes local agreement. Barter, yes—but also trust. Handwritten ledgers. Favors remembered. Debts enforced socially instead of digitally. Money shrinks to human scale because that’s the scale at which trust can operate without abstraction.
This isn’t a romantic return to some golden past. It’s a constraint-driven correction.
Money only works when belief is shared. Electricity let us outsource that belief to machines. Without it, belief has to live somewhere else, usually in relationships.
Optimization Collapses Without Context.
This is where the AI conversation gets more interesting.
Because the collapse doesn’t just reveal that AI is fragile. It reveals why.
AI, as we’re building it now, is optimization without embodiment. Pattern extraction without lived context. Intelligence that assumes abundant energy, global coordination, and frictionless abstraction.
It’s very good at navigating inside the box we built for it.
But it doesn’t know how to orient itself when the box disappears.
Humans do.
Not because we’re smarter in some narrow, benchmarked way—but because human intelligence is shaped by scarcity, constraint, and meaning. We evolved to make sense of the world when information is incomplete, energy is limited, and survival depends on cooperation.
We don’t optimize first.
We orient first.
When the grid collapses, optimization loses its substrate. Orientation doesn’t.
Resilience Isn’t the Same as Sophistication.
There’s a deeper irony here.
We often justify AI and abstract financial systems by saying they make society more resilient. More efficient. More advanced.
But resilience doesn’t come from complexity alone. It comes from redundancy, locality, and the ability to fail gracefully.
AI fails catastrophically.
So does digital money.
They don’t degrade into simpler versions of themselves. They don’t shrink to fit new conditions. They just… stop.
What survives instead are systems that were never fully abstracted in the first place:
– Human memory
– Skilled hands
– Cultural knowledge
– Moral norms
– Stories that carry lessons across generations
None of these require servers. All of them require care.
This Isn’t About Solar Flares.
This thought experiment isn’t really about solar flares.
It’s about what we mistake for intelligence.
It’s about what we mistake for value.
When energy is abundant, we confuse scale with wisdom and speed with understanding. We start believing that the map is the territory, that the metric is the meaning, that optimization is progress.
A world without electricity strips those illusions bare.
It reminds us that money is a coordination tool, not wealth itself.
That AI is a mirror and an amplifier, not an autonomous mind.
That abstraction is powerful—but only when it’s anchored to lived reality.
If AI Returns, It Will Be Smaller—and That Matters.
If AI ever returned in such a world, it wouldn't look like what we're building now.
It wouldn’t be centralized. It wouldn’t be sovereign. It wouldn’t pretend to replace human judgment.
It would be small. Purpose-bound. Embedded in tools rather than platforms. Closer to a compass than a commander.
Which raises an uncomfortable question for the present moment:
If we know that our most advanced systems can't survive without massive energy and coordination, why are we designing them as if permanence were guaranteed?
What does it say about our values that we optimize for capability instead of coherence?
Foundations vs. Scaffolding.
I don’t think the lesson here is to reject AI or modern money.
I think the lesson is to stop mistaking them for foundations.
Foundations are what remain when the lights go out.
Orientation. Meaning. Trust. Care. Human judgment.
Everything else is scaffolding.
Useful. Powerful.
And far more fragile than we like to admit.
If we want humane, resilient systems—AI included—we should design as if the lights might someday go out.
Because whatever survives that test is what we were actually building for all along.
—
Inspired by the H11 project.