A Sense That More Is Possible

To teach people about a topic you’ve labeled “rationality,” it helps for them to be interested in “rationality.” (There are less direct ways to teach people how to attain the map that reflects the territory, or optimize reality according to their values; but the explicit method is the course I tend to take.)

And when people explain why they’re not interested in rationality, one of the most commonly proffered reasons tends to be something like: “Oh, I’ve known a couple of rational people and they didn’t seem any happier.”

Who are they thinking of? Probably an Objectivist or some such. Maybe someone they know who’s an ordinary scientist. Or an ordinary atheist.

That’s really not a whole lot of rationality, as I have previously said.

Even if you limit yourself to people who can derive Bayes’s Theorem—which is going to eliminate, what, 98% of the above personnel?—that’s still not a whole lot of rationality. I mean, it’s a pretty basic theorem.

Since the beginning I’ve had a sense that there ought to be some discipline of cognition, some art of thinking, the studying of which would make its students visibly more competent, more formidable: the equivalent of Taking a Level in Awesome.

But when I look around me in the real world, I don’t see that. Sometimes I see a hint, an echo, of what I think should be possible, when I read the writings of folks like Robyn Dawes, Daniel Gilbert, John Tooby, and Leda Cosmides. A few very rare and very senior researchers in psychological sciences, who visibly care a lot about rationality—to the point, I suspect, of making their colleagues feel uncomfortable, because it’s not cool to care that much. I can see that they’ve found a rhythm, a unity that begins to pervade their arguments—

Yet even that… isn’t really a whole lot of rationality either.

Even among those few who impress me with a hint of dawning formidability—I don’t think that their mastery of rationality could compare to, say, John Conway’s mastery of math. The base knowledge that we drew upon to build our understanding—if you extracted only the parts we used, and not everything we had to study to find it—it’s probably not comparable to what a professional nuclear engineer knows about nuclear engineering. It may not even be comparable to what a construction engineer knows about bridges. We practice our skills, we do, in the ad-hoc ways we taught ourselves; but that practice probably doesn’t compare to the training regimen an Olympic runner goes through, or maybe even an ordinary professional tennis player.

And the root of this problem, I do suspect, is that we haven’t really gotten together and systematized our skills. We’ve had to create all of this for ourselves, ad-hoc, and there’s a limit to how much one mind can do, even if it can manage to draw upon work done in outside fields.

The chief obstacle to doing this the way it really should be done is the difficulty of testing the results of rationality training programs, so you can have evidence-based training methods. I will write more about this, because I think that recognizing successful training and distinguishing it from failure is the essential, blocking obstacle.

There are experiments done now and again on debiasing interventions for particular biases, but it tends to be something like, “Make the students practice this for an hour, then test them two weeks later.” Not, “Run half the signups through version A of the three-month summer training program, and half through version B, and survey them five years later.” You can see, here, the implied amount of effort that I think would go into a training program for people who were Really Serious about rationality, as opposed to the attitude of taking Casual Potshots That Require Like An Hour Of Effort Or Something.

Daniel Burfoot brilliantly suggests that this is why intelligence seems to be such a big factor in rationality—that when you’re improvising everything ad-hoc with very little training or systematic practice, intelligence ends up being the most important factor in what’s left.

Why aren’t “rationalists” surrounded by a visible aura of formidability? Why aren’t they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most “rationalists” just seem like ordinary people, perhaps of moderately above-average intelligence, with one more hobbyhorse to ride?

Of this there are several answers; but one of them, surely, is that they have received less systematic training of rationality in a less systematic context than a first-dan black belt gets in hitting people.

I do not except myself from this criticism. I am no beisutsukai, because there are limits to how much Art you can create on your own, and how well you can guess without evidence-based statistics on the results. I know about a single use of rationality, which might be termed “reduction of confusing cognitions.” This I asked of my brain; this it has given me. There are other arts, I think, that a mature rationality training program would not neglect to teach, which would make me stronger and happier and more effective—if I could just go through a standardized training program using the cream of teaching methods experimentally demonstrated to be effective. But the kind of tremendous, focused effort that I put into creating my single sub-art of rationality from scratch—my life doesn’t have room for more than one of those.

I consider myself something more than a first-dan black belt, and less. I can punch through brick and I’m working on steel on my way to adamantine, but I have a mere casual street-fighter’s grasp of how to kick or throw or block.

Why are there schools of martial arts, but not rationality dojos? (This was the first question I asked in my first blog post.) Is it more important to hit people than to think?

No, but it’s easier to verify when you have hit someone. That’s part of it, a highly central part.

But maybe even more importantly—there are people out there who want to hit, and who have the idea that there ought to be a systematic art of hitting that makes you into a visibly more formidable fighter, with a speed and grace and strength beyond the struggles of the unpracticed. So they go to a school that promises to teach that. And that school exists because, long ago, some people had the sense that more was possible. And they got together and shared their techniques and practiced and formalized and practiced and developed the Systematic Art of Hitting. They pushed themselves that far because they thought they should be awesome and they were willing to put their backs into it.

Now—they got somewhere with that aspiration, unlike a thousand other aspirations of awesomeness that failed, because they could tell when they had hit someone; and the schools competed against each other regularly in realistic contests with clearly-defined winners.

But before even that—there was first the aspiration, the wish to become stronger, a sense that more was possible. A vision of a speed and grace and strength that they did not already possess, but could possess, if they were willing to put in a lot of work, that drove them to systematize and train and test.

Why don’t we have an Art of Rationality?

Third, because current “rationalists” have trouble working in groups: of this I shall speak more.

Second, because it is hard to verify success in training, or which of two schools is the stronger.

But first, because people lack the sense that rationality is something that should be systematized and trained and tested like a martial art, that should have as much knowledge behind it as nuclear engineering, whose superstars should practice as hard as chess grandmasters, whose successful practitioners should be surrounded by an evident aura of awesome.

And conversely they don’t look at the lack of visibly greater formidability, and say, “We must be doing something wrong.”

“Rationality” just seems like one more hobby or hobbyhorse, that people talk about at parties; an adopted mode of conversational attire with few or no real consequences; and it doesn’t seem like there’s anything wrong about that, either.
