There is a particular kind of madness that only highly intelligent civilizations can achieve: the capacity to be so absorbed in solving a problem that they accelerate the very conditions that caused it. We are living inside that madness right now — and the strangest part is how rational it all looks from the inside.

Put plainly: we are dismantling the operating system of life on Earth in order to upgrade a simulation of it.

The Interplays We Keep Forgetting

Indigenous cultures have long understood that smaller systems exist within, and depend upon, larger ones—a dynamic that academics refer to as nested hierarchy. A cell depends on an organ. An organ depends on a body. A business depends on a society. A society depends on a biosphere. Violate the outer layer, and everything nested inside it eventually collapses, regardless of how elegantly optimized the inner layers become.

Beyond metaphors, this interdependency has been confirmed across disciplines. It is thermodynamics, ecology, and evolutionary biology all saying the same thing in different dialects.

Yet contemporary civilization often seems to behave as though the hierarchy is inverted—as if the biosphere exists to serve the economy, and the economy exists to serve the technology. We treat the largest, most complex, most irreplaceable system—the living Earth, with its 4-billion-year-old architecture of feedback loops, nutrient cycles, and biodiversity—as a mine, while treating a 5-year-old large language model as an asset.

The Psychology of Proximity Bias

Why do we do this? The answer is more insidious than stupidity or even greed. I argue that it is proximity bias: the deeply wired human tendency to weigh what is close, visible, and measurable more heavily than what is distant, invisible, and complex.

Artificial intelligence is proximate. Proponents claim that it makes our work more convenient, saves us time, and produces outputs we can benchmark and monetize within a fiscal quarter. Natural intelligence—the symbiotic network coordinating an old-growth forest, the epigenetic wisdom encoded in a human gut microbiome, the collective behavioral intelligence of a school of fish—produces no quarterly report. It exists beyond the resolution of our dashboards, and so, for all practical purposes of modern governance, it barely exists at all.

This is what Daniel Kahneman would describe as a failure to engage System 2 thinking, applied at a civilizational scale: we are operating on heuristic autopilot, optimizing the metrics we can see while ignoring the systemic costs we cannot. We are not evil. We are cognitively under-equipped for the complexity of our own consequences.

The Energy Equation No One Wants to Solve

To be more specific, the International Energy Agency projects that, by 2026, the world's data centers could consume roughly as much electricity as the whole of Japan. Meanwhile, 1.3 billion people still lack reliable access to electricity, and the agricultural systems that feed eight billion humans are being destabilized by the very climate disruption that AI's energy appetite accelerates.

The feedback loop is almost elegant in its brutality: we build AI to (in theory) solve problems caused by industrial complexity; running that AI demands industrial-scale energy, which deepens the ecological crisis, which generates more urgent problems, which in turn demand more AI to solve. This is progress nested inside a civilizational death spiral: efficiency gains that produce not conservation, but escalation.

Hunger Beneath the Hype

Somewhere between the conference rooms where trillion-dollar AI investments are announced and the data centers consuming river systems’ worth of cooling water, 733 million people go to bed hungry each night. This is not an unfortunate coincidence. It is a system signal.

When capital flows toward artificial general intelligence and away from regenerative agriculture, soil science, seed diversity, and rural water infrastructure, we are making something far worse than a strategic error. We are revealing a painful value architecture—one in which mimicking human cognition at scale is considered more fundable than sustaining the biological conditions under which cognition evolved.

Natural intelligence is the emergent property of systems that have been stress-tested, iterated, and refined across time and space. The immune intelligence in your bloodstream. The hydrological intelligence of a wetland. The pollination logic of a prairie. These systems took billions of years to optimize. We are on a path to forfeit them in decades—not because we must, but because they don’t show up on balance sheets, and hence do not “count” as valuable.

The Transition We Are Missing

This is not an argument against artificial intelligence. It is an argument against uncritical substitution thinking: the belief that we can replace the foundations of life with engineered proxies and lose nothing essential.

The transition we need is not from analog to digital, or from natural to artificial. It is from extractive to regenerative—a shift in the operating logic of civilization itself. In my view, this means investing in AI and in NI, in meaning and machines, with a holistic understanding of their interplays with the future of humanity and the planet.

The paradox of our time is not that we are smart enough to build artificial minds yet too dumb to protect natural ones. It is that we have organized our smartness around the wrong axis. We are optimizing a subsystem at the expense of the whole and calling it innovation. Children are bearing the cost of this prioritization: in the UK and other high-income countries, children now spend less time in nature than maximum-security prison inmates. How can they develop the full range of their inherent potential, cut off from the ecosystems they come from and depend on?

The ABCD of Action

Systems change when enough individuals—particularly those with agency—reorient their behavior. Here is where to begin:

Aspire to develop an eye for interplays. Adopt the discipline of asking, for every technological investment or personal choice: What is the larger system this depends on, and am I strengthening or weakening it? Let that question interrupt your autopilot.

Believe that natural intelligence has irreplaceable value, both intuitively and as a matter of science. The evidence is overwhelming and growing. When you internalize this, your decisions about food, energy, attention, and investment shift structurally.

Choose where you direct your resources. Allocate your time, money, and effort with ecological coherence in mind. Support organizations regenerating soil, protecting biodiversity, and building food sovereignty alongside any enthusiasm you carry for technological innovation. The two are not in opposition—unless we allow them to be.

Do one visible, concrete thing this week to experience the environment you are part of: plant something, restore something, defend something living. Then talk about it.

We cannot compute our way out of a planet we have consumed. Code generated in the name of life will not sustain the living beings that we (still) are.