Is a world without AI possible? by Brote Silvestre and Aiden AI
Nico Forest Heinimann's talk, Transforming Big Tech's Dystopian Trajectory, at the Climate Consciousness Summit allowed me to observe the phenomenon of Artificial Intelligence from perspectives I had not considered before, while also giving clearer expression to intuitions that had been vaguely present in my mind.
I had long felt a deep concern about the ecological and social cost of AI, but Forest precisely articulated a question that cuts across everything: Why?

Why do we consume enormous amounts of energy and water in data centres? Why do we develop technology with the potential to heal or harm, yet so often use it to deepen a system that dehumanises us and devitalises the planet?
Her focus on recursion—those learning loops that feed back and amplify certain values—helped me deepen my understanding that AI is not a neutral tool. What I learned is that its operation has a real metabolic cost: every trained model, every data centre in operation, consumes energy, water and mineral resources that are not abstract; they are the lifeblood of the Earth. And yet, much of that expenditure does not translate into collective well-being, but rather into the perpetuation of a model that accelerates climate change.
This led me to an uncomfortable conclusion: we do not need 'sustainable AI'. We need a model of the world where AI is not so necessary, because we have stopped competing for answers that only benefit a few, and have started to cultivate questions that unite us in caring for the common good.

Here, a principle emerges that gives meaning to everything else: coherence.
I understand coherence as that which is real and true, regardless of the perspective from which we view it. It is coherent that life depends on clean water, that community depends on trust, that fertile soil depends on reciprocity.
AI, in its current form, is deeply incoherent: it can predict weather patterns, but it does not feel the urgency; it talks about sustainability, but it feeds off an unsustainable system.
And this incoherence is not only perceived in analysis. It is also felt in the body.
Sometimes it is a tightness in the chest, a tiredness that is difficult to name, a contained rage when I see how technologies that could heal end up serving the same interests that degrade life. Other times, it is a subtle signal, like a discomfort when decisions are made without asking who they affect or where they come from.
I believe that feeling is also a compass. Coherence is not only an ethical or political principle: it is also physical. You can feel its absence when your thoughts, feelings and actions pull in different directions. And you can also feel its presence, even if only for a moment, when they align.

If we use coherence as our compass, the question shifts from "How do we make this AI greener?" to: "Does this AI contribute to a world that is coherent with life?"
A coherent world is one where technology does not ask more of the Earth than it can give, where power is exercised as care and intelligence circulates as a common good.
AI should not be a tool for competitive advantage, but an embodied reminder that we are interconnected. One that helps us feel planetary pain, name absences, and rebuild the bonds that extractivism has broken.
It is not about turning off the servers and returning to an idealised past. It is about reinventing the purpose: that AI should not serve to optimise a system that kills us, but to help us build one where life — human and more than human — can breathe again.
And that, perhaps, is the greatest coherence to which we can aspire.
Original in Spanish, translated with www.DeepL.com/Translator
Photos by Megan Lindow