The key point: many activists smuggle infinite costs into their reasoning, which lets them conclude that any extreme action is justified. And they hoodwink you by omitting all the other issues of similar scale that we already muddle through.
(I wanted a term for this and assumed one already existed, but I couldn’t find it, so I’m writing it down here. I recall it coming out of a good back-and-forth with James, but I can’t find that exchange either.)
A smuggled infinity is a huge assumption slipped into an argument without ever being stated outright: some cost or payoff is treated as so massive that it throws every other consideration out of whack. Once it's in, this kind of thinking can lead us down some pretty extreme paths if we're not careful.
The idea of "smuggled infinity" is not new. The classical logical fallacy of Pascal’s Wager, which predates me by a few hundred years, involves it. Pascal basically said that believing in God is a smart bet because the payoff could be endless (infinite) happiness in the afterlife. The catch? He's assuming the afterlife is this infinite, all-important thing, making everything else seem tiny in comparison. But that's a massive leap, and not everyone's buying it. Furthermore, who’s to say that specific religion is the right one? What about the other permutations that send you to infinite hell for doing certain things?
We shouldn’t be so arrogant as to think that because we live in a less religious age, we no longer commit this fallacy. The accidental infinities now get smuggled into places that are supposed to sit on the secular side of the separation between church and state. Consider today's big issues like climate change, AI risks, and pandemics:
Climate Change: Sure, the dangers of messing up our planet are real and serious. But if we start treating the cost of environmental damage as infinite, we might make some rash decisions. Like shutting down factories overnight, or replacing nuclear plants with coal, which kills far more people per unit of energy generated.
AI Risks: With AI, there's a fear that it could literally be the end of us. But if we treat that risk as infinite, we might end up slamming the brakes on AI research altogether. One prominent AI-risk activist follows this line of thinking all the way to recommending air strikes on data centers worldwide.
Pandemics: With virus outbreaks, the fear of losing countless lives can lead us to go overboard with health measures. We're talking lockdowns, curfews, the works. While being safe is key, we've also got to think about the other costs – like how these measures affect people's daily lives, mental health, and the economy.
In each case, the risks are real, but when someone smuggles an infinite cost into their argument, it can be used to justify abhorrent actions far more extreme than the situation warrants.
If you want to read more on this general topic, Doom by Niall Ferguson is good.