Fallacies Of Reversed Moderation

By Scott Alexander

A recent discussion: somebody asked why people in Silicon Valley thought that only high-tech solutions to climate change (like carbon capture or geoengineering) mattered, and why they dismissed more typical solutions like international cooperation and political activism.

Another person cited statements from the relevant Silicon Valley people, who mostly say that they think political solutions and environmental activism are central to the fight against climate change, but that we should look into high-tech solutions too.

This is a pattern I see again and again.

Popular consensus believes 100% X, and absolutely 0% Y.

A few iconoclasts say that X is definitely right and important, but maybe we should also think about Y sometimes.

The popular consensus reacts “How can you think that it’s 100% Y, and that X is completely irrelevant? That’s so extremist!”

Some common forms of this:

Reversed moderation of planning, like in the geoengineering example. One group wants to solve the problem 100% through political solutions, another group wants 90% political and 10% technological, and the first group thinks the second only cares about technological solutions.

Reversed moderation of importance. For example, a lot of psychologists talk as if all human behavior is learned. Then when geneticists point to experiments showing behavior is about 50% genetic, they get accused of saying that “only genes matter” and lectured on how the world is more complex and subtle than that.

Reversed moderation of interest. For example, if a vegetarian shows any concern about animal rights, they might get told they’re “obsessed with animals” or they “care about animals more than humans”.

Reversed moderation of certainty. See for example my previous article Two Kinds Of Caution. Some researcher raises the possibility that superintelligent AI might be dangerous, and suggests looking into it. Then people say it doesn’t matter and we don’t have to worry about it, and criticize the researcher for believing he can “predict the future” or thinking “we can see decades ahead”. But “here is a possibility we need to investigate” is a much less certain claim than “no, that possibility definitely will not happen”.

I can see why this pattern is tempting. If somebody said the US should allocate 50% of its defense budget to the usual global threats, and 50% to the threat of reptilian space invaders, then even though the plan contains the number “50-50” it would not be a “moderate” proposal. You would think of it as “that crazy plan about fighting space reptiles”, and you would be right to do so. But in this case the proper counterargument is to say “there is no reason to spend any money fighting space reptiles”, not “it’s so immoderate to spend literally 100% of our budget breeding space mongooses”. “Moderate” is not the same as “50-50” is not the same as “good”. Just say “Even though this program leaves some money for normal defense purposes, it’s stupid”. You don’t have to deny that it leaves anything at all.

Or if someone says there’s a 10% chance space reptiles will invade, just say “No, the number is basically zero”. Don’t say “I can’t believe you’re certain there will be an alien invasion, don’t you know there’s never any certainty in this world?”

But I can also see why this happens. Imagine the US currently devotes 100% of its defense budget to countering Russia. Some analyst determines that although Russia deserves 90% of the resources, the Pentagon should also use 10% to counter China. Since no one person can shift very much of the defense budget, this analyst might spend all her time arguing that we need to counter China more, trying to convince everyone that China is really very dangerous; if she succeeds, maybe the budget will shift to 99-to-1 and she’ll have done the best she can. But if she really does spend all her time talking about China, she might look to other people like an extremist – that crazy single-issue China person – “Why are you spending all your time talking about China? Don’t you realize Russia is important too?” Still, she’s pursuing the right strategy, and it’s hard to see what she could do better.

I am nervous titling this “reversed moderation fallacy”, because any time someone brings up fallacies, people accuse them of thinking all discussion consists of identifying and jumping on Officially Designated Fallacies in someone else’s work. But I’ve gone years without talking about fallacies at all, so when this accusation inevitably arrives, I’ll point back here as Exhibit A.