The book that changed how I think about thinking

March 23, 2022 | By JohnValbyNation

I’ve learned more about how to think and reason well from Julia Galef than from almost anyone.

Galef, a writer, researcher, and podcaster, is obsessive about improving her own reasoning processes and helping other people improve theirs. For years, she led a group offering seminars and workshops for people to improve their reasoning skills. But lately, her approach has changed.

In her new book, The Scout Mindset, she argues it’s not enough to teach people which cognitive biases we all suffer from and how to avoid them. If someone wants to think more clearly, they have to cultivate an attitude of curiosity and openness to evidence.

This week, I had Galef on the Vox Conversations podcast to talk about how to develop the scout mindset. Below is a transcript that’s been condensed for brevity and clarity.

Dylan Matthews

Walk me through what you mean by “scout mindset.” What does it mean to have it? How do you know if you have it?

Julia Galef

It’s my term for the motivation to see things as they are and not as you wish they were: being, or trying to be, intellectually honest, objective, and fair-minded, and curious about what’s actually true.

By default, a lot of the time we humans are in what I call “soldier mindset,” in which our motivation is to defend our beliefs against any evidence or arguments that might threaten them. Rationalization, motivated reasoning, wishful thinking: these are all facets of what I’m calling a soldier mindset.

I adopted this term because the way that we talk about reasoning in the English language is through militaristic metaphor. We try to “shore up” our beliefs, “support them” and “buttress them” as if they’re fortresses. We try to “shoot down” opposing arguments and we try to “poke holes” in the other side.

I call this “soldier mindset,” and “scout mindset” is an alternative to that. It’s a different way of thinking about what to believe or thinking about what’s true.

Dylan Matthews

You have a lot of examples of “scout mindset” in the book, and one of my favorites was the French Colonel Georges Picquart in the late 19th and early 20th centuries. He’s a kind of loathsome guy in some ways, but admirable in others.

Julia Galef

So in the late 19th century in France there was what’s called the Dreyfus Affair. A memo was found in a wastebasket written by someone in the French army, addressed to the Germans, divulging a bunch of top-secret military plans.

The French army realized they had a spy in their ranks and launched an investigation. They quickly converged on this high-ranking officer named Alfred Dreyfus, who was the only Jewish member of the top ranks of the French army.

The officers who prosecuted Dreyfus genuinely believed that he was the spy. But their investigation, if you look at it from the outside, was incredibly slanted. They ignored testimony from experts who said that Dreyfus’s handwriting didn’t match the memo, and they only trusted the experts who said the handwriting did match the memo. So they convicted Dreyfus in this “soldier mindset”-filled investigation.

Dreyfus gets imprisoned on Devil’s Island. But then another officer gets promoted to the head of this investigative department. His name is Colonel Picquart, and he is anti-Semitic, just like his fellow officers. That was just kind of the norm in France at the time.

He didn’t like Dreyfus and he had all of the same biases that his fellow officers did. But he also had a much stronger drive to recognize and pursue the truth than his fellow officers did.

He started looking into the investigation that had been conducted into Dreyfus, went through all this evidence, and realized: wait, this is actually a really flimsy case. We just don’t have a strong case against this guy. I think we might have just convicted an innocent man.

His fellow officers just kept kind of dismissing his concerns and rationalizing away the inconsistencies he’d found. And this just made him really angry. So he kept pursuing it and pursuing it. It took many years, and the army actually tried to shut him up by putting him in jail as well. But eventually, Colonel Picquart managed to get Dreyfus exonerated, and Dreyfus was reinstated in the army.

Colonel Picquart is a hero to me. He was an anti-Semite, which, as you say, makes him kind of a loathsome figure, but in a way I think that makes his scout mindset even more admirable: his love for the truth was so strong that it was able to outweigh his personal biases against Dreyfus and his personal biases toward preserving his job and his reputation and so on.

Dylan Matthews

When I first met you, you were doing seminars and workshops that were trying to help people notice their cognitive biases, think more rationally, and use better reasoning methods in their own lives.

In the book, you seem a little disillusioned with that project. You write that just telling people they have these biases isn’t enough to get them to change. They have to cultivate a whole different attitude toward the world.

What was the evolution of your thinking on this?

Julia Galef

Back in 2012, I co-founded this educational nonprofit called the Center for Applied Rationality, and I helped run it for several years. Part of what we did was run these workshops where we tried to take concepts from cognitive science, but also from basic economic theory and even philosophy, and use those concepts to help people improve their reasoning and decision-making in their own lives, their careers, their relationships, etc.

Originally, I envisioned this project as being about giving people knowledge. Like, “Here is the five-step process you should go through to figure out whether your action is net positive,” or, “Here is a list of the top 10 most common cognitive biases that impact our decision-making.” My assumption was that having that knowledge would equip people to make better decisions and so on.

It’s so often the case that when you try to describe a thing you were wrong about in the past, it seems kind of obvious. But think about the people you’ve seen online who know a lot about cognitive biases and logical fallacies, and ask yourself, “Do these people tend to be really self-reflective?” I don’t think they usually are, for the most part.

The people I see who talk a lot about cognitive biases and fallacies prefer to point out those biases and fallacies in other people. That’s how they wield that knowledge.

Even when you’re motivated to try to improve your own reasoning and decision-making, just having the knowledge itself isn’t all that effective. The bottleneck is more like wanting to notice the things that you’re wrong about, wanting to see the ways in which your decisions have been imperfect in the past, and wanting to change that. And so I just started shifting my focus more to the motivation to see things clearly, instead of the knowledge and education part of it.

Dylan Matthews

You talk about how our identities — thinking of ourselves as part of the liberal or conservative team, or the Christian team, or the feminist team — can make it harder to be a good scout. It makes everything feel more soldier-like, more adversarial, and can cause you to dismiss evidence because it’s inconvenient or embrace evidence that’s not very good because it helps your position.

Then at the end of the book you encourage people to think of being a “scout” as their identity. Isn’t there a danger that this will cause some of the same problems? Mightn’t self-proclaimed scouts be less likely to critique each other, or more smug in the face of critiques from outsiders, than if they didn’t see themselves as a unified community?

Julia Galef

The trick is to choose the things you pride yourself on strategically, so that the things you reward yourself for, with pride or satisfaction, and the things the people around you reward you for, are things that actually help you see the world clearly.

If you’re priding yourself on, say, “always having the right answer,” that’s an unhelpful kind of identity to have. That sort of thing is going to just incentivize you away from noticing when you’re not right. But if you define your identity carefully, and pride yourself instead on your ability to admit when you’re wrong, and on your ability to distinguish between different levels of certainty in your beliefs, then the incentives line up.

That kind of identity makes these kinds of habits and tools much easier to pick up and sustain because you’re actually feeling good when you use them instead of feeling bad that you proved yourself wrong.

Dylan Matthews

I’m a little skeptical that’s enough. Part of why I ask is we’re both part of the effective altruism community, which prides itself on using reason and evidence carefully in trying to figure out ways to do charity better, or improve government policy. Those are attributes to be proud of, but because I think of myself as an effective altruist I sometimes catch myself getting defensive or irrationally annoyed when other EAs are criticized.

Just to give a concrete example, there was a news cycle a few months ago about Scott Alexander, an effective altruist and writer you and I both like. I saw some critical coverage of him, and had some problems with the coverage on the substance, but I also felt this instinctive response of, “No, Scott’s a good guy, Scott’s one of us, we gotta protect Scott.”

That’s a bad impulse, and it honestly really scares me. How do you make sure that you’re not engaging in that kind of confirmation bias when you’re building out your scout identity?

Julia Galef

Let me ask you, how do you feel in those instances, when you do actually admit to yourself, “This point in defense of EA or whatever tribe I’m in actually doesn’t hold up,” or, “This critique of EA makes some good points”? In those moments, how do you feel?

Dylan Matthews

I feel sort of stressed briefly. But then … I think it over. And if I’m comfortable with the conclusion I come to, I can come to a kind of peace with it. Say I read something saying that effective altruists don’t take the idea of fighting for structural change to government seriously enough. I think, “Okay, they have a point. But structural change is really hard.” So I can integrate that critique into my old way of thinking. That brings me a kind of peace, when I can reconcile them.

Julia Galef

That’s pretty similar to my experience, too. I have a chapter in the book on how to make yourself more receptive to unpleasant or inconvenient truths, or things that might be true.

[My] advice is, before you try to think about whether the thing is true, first imagine that it is true, and then ask yourself: How bad would that be? What would I do? This certainly applies to real-world decision-making in tough situations like that of Steven Callahan [a sailor who was stuck on a raft in the Atlantic Ocean for 76 straight days]. He had to make plans for what he would do if the worst-case scenario happened. Just making the plans can be kind of comforting. It won’t necessarily make the bad possibility palatable, but it can at least make it tolerable enough that you’re willing to think clearly about it, and consider whether it’s true.

I think the same principle applies in the slightly less dramatic example of reading criticism of your tribe on the internet. Sometimes if I’m reading criticism and I’m feeling stressed out and defensive, and I notice that I’m reaching for rebuttals of it, I will just stop and imagine, “What if I found out that this critique was actually solid? How bad would that be?”

What I usually realize in that moment is, “I guess that would be okay. It’s happened before, and it wasn’t the end of the world. Here’s what I would say on Twitter in response to the article; here’s how I would acknowledge that they made a good point.” This all happens in just a few seconds in my head.

But just going through that exercise of asking, “How bad would it be?” and picturing the outcome often makes me realize, “Okay, it’d be fine. It’d be fine if this turned out not to support my side.” And once I reach that state of acceptance, I’m able to actually think about whether or not it’s justified.