I’ve always thought Dyson spheres were the most absurd idea.

What “civilization” would be so rapacious as to *enclose an entire star* to meet their “energy needs”? That’s not a sign of “advanced intelligence”—that’s space locusts.
It’s only now that I’m making the connection between the economics of “constant growth” and how that permeated the thought processes of generations of science and science fiction enthusiasts. Still does, in fact. There are scientists right now ardently scanning the heavens for Dyson spheres.
1) You’re looking for space locusts, not some pinnacle of “civilization”. Be careful what you wish for.
2) We only know of one “intelligent” space-faring species in the universe so far, and we’re clearly incapable (so far) of living sustainably on our own planet. The way we’re headed, we will destroy the only living space we have before we develop the “intelligence” to survive our own technologies. If another species out there has managed to evolve past this self-destructiveness, you won’t find them making Dyson spheres. In fact, you won’t find them at all until they’re ready to be found—because you’re a locust and they’re hoping you don’t reach the point of making Dyson spheres.
Sept 2025 Update: This post was originally made on Facebook (Jan 15, 2022), although my thoughts on Dyson spheres long predate that. I was reminded of those thoughts once again by this brilliant brief anti-AI essay by Anthony Moser:
The entire thing is brilliant but this part in particular reminded me of my hatred of Dyson spheres:
“I am here to be rude, because this is a rude technology, and it deserves a rude response. Miyazaki said, “I strongly feel that this is an insult to life itself.” Scam Altman said we can surround the solar system with a Dyson Sphere to hold data centers. Miyazaki is right, and Altman is wrong. Miyazaki tells stories that blend the ordinary and the fantastic in ways people find deeply meaningful. Altman tells lies for money.”
That Sam Altman (Scam Altman!), CEO of OpenAI, would be a space locust is the least surprising plot twist of all time. He is the locust that is currently consuming our planet.
I greatly appreciate Moser’s clarity of moral thought, something very scarce on the ground these days, but which is key to not allowing the locusts to consume us. We have to not be locusts ourselves. Until humanity collectively figures out how to rein in our urge to rapaciously consume everything and everyone, we will (obviously) not survive. And yes, that means stopping AI. It means leaving fossil fuels in the ground. It means no more kings or billionaires, and taxing the rich until they aren’t such an existential threat to everyone.
We know how to fix the climate crisis (and how to stop tech billionaires from consuming the planet).
The only question is whether we will find the will to do so.
