
Moore’s Paradox

Moore’s Paradox is the standard term for saying “It’s raining outside but I don’t believe that it is.” Hat tip to painquale on MetaFilter.

I think I understand Moore’s Paradox a bit better now, after reading some of the comments on Less Wrong. Jimrandomh suggests:

Many people cannot distinguish between levels of indirection. To them, “I believe X” and “X” are the same thing, and therefore, reasons why it is beneficial to believe X are also reasons why X is true.

I don’t think this is correct—relatively young children can understand the concept of having a false belief, which requires separate mental buckets for the map and the territory. But it points in the direction of a similar idea:

Many people may not consciously distinguish between believing something and endorsing it.

After all—“I believe in democracy” means, colloquially, that you endorse the concept of democracy, not that you believe democracy exists. The word “belief,” then, has more than one meaning. We could be looking at a confused word that causes confused thinking (or maybe it just reflects pre-existing confusion).

So: in the original example, “I believe people are nicer than they are,” she came up with some reasons why it would be good to believe people are nice (health benefits and such), and since she now had some warm affect on “believing people are nice,” she introspected on this warm affect and concluded, “I believe people are nice.” That is, she mistook the positive affect attached to the quoted belief as signaling her belief in the proposition itself. At the same time, the world still seemed to her like a place where people weren’t so nice. So she said, “I believe people are nicer than they are.”

And that verges on being an honest mistake, sort of, since people are not taught explicitly how to know when they believe something. As in the parable of the dragon in the garage: the one who says “There is a dragon in my garage, but it’s invisible” does not recognize their anticipation of seeing no dragon as indicating that they possess an (accurate) model with no dragon in it.

It’s not as if people are trained to recognize when they believe something. It’s not like they’re ever taught in high school: “What it feels like to actually believe something—to have that statement in your belief pool—is that it just seems like the way the world is. You should recognize this feeling, which is actual (unquoted) belief, and distinguish it from having good feelings about a belief that you recognize as a belief (which means that it’s in quote marks).”

This goes a long way toward making this real-life case of Moore’s Paradox seem less alien, and providing another mechanism whereby people can be simultaneously right and wrong.

Likewise with Kurige, who wrote:

I believe that there is a God—and that He has instilled a sense of right and wrong in us by which we are able to evaluate the world around us. I also believe a sense of morality has been evolutionarily programmed into us—a sense of morality that is most likely a result of the formation of meta-political coalitions in Bonobo communities a very, very long time ago. These two beliefs are not contradictory, but the complexity lies in reconciling the two.

I suspect, Kurige, you have decided that you have reasons to endorse the quoted belief that God has instilled a sense of right and wrong in us. And also that you have reasons to endorse the verdict of science. They both seem like good communities to join, right? There are benefits to both sets of beliefs? You introspect and find that you feel good about both beliefs?

But you did not say:

“God instilled a sense of right and wrong in us, and also a sense of morality has been evolutionarily programmed into us. The two states of reality are not inconsistent, but the complexity lies in reconciling the two.”

If you’re reading this, Kurige, you should very quickly say the above out loud, so you can notice that it seems at least slightly harder to swallow—notice the subjective difference—before you go to the trouble of rerationalizing.

This is the subjective difference between having reasons to endorse two different beliefs, and your mental model of a single world, a single way-things-are.
