Auren Hoffman 0:01
Welcome to World of DaaS, a show for data enthusiasts. I'm your host Auren Hoffman, CEO of SafeGraph. For more conversations, videos, and transcripts, visit safegraph.com/podcasts. Hello, fellow data nerds. My guest today is Julia Galef. Julia is co-founder of the Center for Applied Rationality. She's the host of the podcast Rationally Speaking and the author of The Scout Mindset, which is a book that I loved. Julia, welcome to World of DaaS.
Julia Galef 0:32
Hey, Auren. So great to be here. Thank you.
Auren Hoffman 0:34
Awesome. Okay, now I want to dive in first to your book, The Scout Mindset. I love the book. In the book, you present this idea that humans operate with either a soldier mindset, which is when we want to defend our beliefs, or this alternative scout mindset, where we're motivated to learn the truth. Am I setting that up right?
Julia Galef 0:56
Yeah. No, that's a great, concise summary. I don't have anything to correct in it, but I will add a little bit to it, if you don't mind. So the scout and the soldier are my metaphors for what cognitive scientists might call, more technically and less colorfully, directionally motivated cognition and accuracy motivated cognition. So directionally motivated cognition is just reasoning with, kind of implicitly or unconsciously, a predetermined conclusion that you want to get to, that you want to defend, basically. And so, you know, we're often not really aware that we're doing this, but on some level, we're applying this kind of asymmetric standard to the ideas we're considering depending on whether we want to accept them or not. My favorite sort of short summary of directionally motivated cognition, or soldier mindset, comes from a psychologist named Tom Gilovich, who said that when we're engaging in directionally motivated cognition, when we're evaluating an idea we want to believe, we look at it through the lens of “can I accept this?”, sort of looking for any justification to believe it. Whereas if we're evaluating something we don't want to believe, we look at it through the lens of “must I believe this?”, and we're looking for any excuse to reject or dismiss it. So I call that soldier mindset, just because the way that we talk about beliefs and argumentation in English is very militaristic. Like, we attack and defend ideas. We build up positions, you know, like military positions. We look for weak points in someone's logic, etc. So that's where the soldier mindset came from.
Auren Hoffman 2:29
Do you think we all have a bit of soldier and a bit of scout, and some people are a little bit more scout, a little less soldier? Your contention is that maybe we should nudge a little bit more toward the scout. We're never going to be 100% scout, but is that the idea?
Julia Galef 2:45
Yeah, that's absolutely the idea. And I'm glad you highlighted that, because that's a common misunderstanding of what I'm saying, that some people are perfect scouts and other people are pure soldiers. So you get some people patting themselves on the back for being perfect and shaking their heads at those idiot soldiers out there who aren't them. And that was really not what I was going for in the book. What I think is much closer to the truth is that, yes, as you put it, we all have some soldier and some scout in us. And so it's a spectrum. And we end up fluctuating on that spectrum depending on the day or depending on the topic we're thinking about. Some people might be better at being in scout mindset about their jobs. You know, better at thinking like a scout when evaluating the stock market and more of a soldier when thinking about politics or their relationships or something like that. Or, you know, maybe you're talking to someone who really just raises your hackles, and that puts you more in soldier mindset than you would be if you were talking to someone who's much more kind of open and charitable, who you're better able to be a scout with. So there's a lot of variation, and individuals can fluctuate a lot. But nevertheless, I think some people are, on average, better at being scouts about difficult topics than other people are. And so part of the goal of the book was to ask, you know, what are they doing right? What can we learn from them?
Auren Hoffman 3:59
If you want to be a little bit more scout-like, are there some things you should do before you start thinking about information? How do you nudge yourself to be a little bit more scout-like?
Julia Galef 4:09
Yeah, I mean, this is the big question. I could talk for two hours uninterrupted on that one question. It's the main thing my book is about. So, you know, hopefully we can talk more about this from different angles throughout the conversation. But to start with, one key building block of being a better scout, I think, is self-awareness. Like cultivating more of an awareness of the fact that you're often in soldier mindset, even though you don't feel like you are, and starting to notice the kinds of mental moves that you make when you're trying to defend an idea.
Auren Hoffman 4:39
What's a good example of something where like you could see yourself moving a little bit more into soldier, and you need to like take a step back or something?
Julia Galef 4:47
Yeah. So a category of technique I often recommend is the thought experiment. So a simple example of what a thought experiment could look like, which I'm sure will be familiar to people, is: suppose a politician on your side, from the party you support, does something and is getting criticized for it by the public or the media. You think to yourself, “Oh, come on, that wasn't that bad. Why is everyone jumping on him? This isn't a big deal.” A thought experiment you could do in that situation is to ask yourself, “Okay, suppose a politician from the other side, the side that I hate, did the same thing and was getting criticized for it. What would my reaction be then?” And you might notice, as I have often noticed in the past when I've done this thought experiment, that in that case my reaction would be to call for his head. This offense would seem like a big deal to me. I would consider it not just a resignation-worthy offense, but also an indictment of the whole party. Like, this just goes to show how corrupt or incompetent the whole party is, etc. So, to generalize, the idea of the thought experiment is just to notice how you apply different kinds of standards, or how you're willing to be more or less charitable, depending on your motivations or depending on the specific content of the issue. So that's one example, but it can be generalized in lots of ways.
Auren Hoffman 6:14
It's hard when you have a team though, right? So if my team is the red team or the blue team or something, whether it's a sports team or a political team. In a sports scenario, like whenever the referee calls something, I'm going to be a little bit more biased toward my team, right? Is there any way to guard against that? Because I am rooting for my team to win.
Julia Galef 6:34
So in sports, like literal sports, I wouldn't worry too much about it. But in other, more metaphorical teams, yeah, I think this is one of the main causes of bias, of soldier mindset. So to some extent, I think just noticing, just becoming more self-aware, and starting to notice, oh, I am applying a different standard depending on whether it's my team or the other team, that on its own goes a long way. But there's another category of techniques that are aimed less at self-awareness and more at making you more open to the possibility that your side might have done something wrong, or that you might have screwed up. And to give a short example of that type of technique, it consists in imagining the possibility that you might have been wrong or your team might have done something bad. Then, before you try to consider whether or not that's actually true, ask yourself: how bad would that be if it were true? So, for example, if I'm in an argument on Twitter and it starts to slightly occur to me that maybe they have a point, or maybe I was too hasty in my original tweet or something, the temptation, of course, is to push that thought out of my mind and just look for ways to defend my original tweet. But sometimes I manage to stop and do this little intervention and ask myself, “Okay, suppose I was wrong. How bad would that be, and what would I do about it?” Usually the result of this thought experiment is, “Okay, I guess it wouldn't be that bad. I've been wrong before.” And when I have admitted I was wrong, usually nothing bad has happened as a result, and I feel fine about it. I think most of the time, the fear that we have of turning out to have been wrong is out of proportion to the actual bad consequences that happen as a result of admitting you were wrong. And so this kind of step is designed to help you notice that and thereby relax and become more open to the possibility that you might be wrong. So I do this a lot as well.
Does that answer your original question?
Auren Hoffman 8:51
Yeah, absolutely. It does seem like most people are reluctant to be wrong. But there's a class of people I've met who sometimes blame themselves too much about things. Is there also a way of thinking about it for those types of people?
Julia Galef 9:07
Yeah, it's so interesting. So as part of the process of writing the book, I was trying to think about why some people are so much more willing to be wrong than others. I think one of the differences is in what you think it means to be wrong. I think a lot of people, maybe most people, kind of implicitly assume that being wrong means you screwed up somehow. If you'd been doing everything right, if you'd been reasoning carefully, if you'd been a smart and reasonable and competent person, you would not have made this error. And so, of course, understandably, people are reluctant to ever admit that they were wrong, because that means admitting fault in some way. And there's this different way of thinking about being wrong that I think is just more accurate and also more useful, which is that even if you're doing everything perfectly, you're still gonna be wrong about tons of things. We have limited information. The world is messy. We have limited time. So even a super genius spending a lot of time thinking about everything is still going to be wrong about a ton of things. This is actually part of the scout metaphor, why I chose that metaphor to represent accuracy motivated cognition: thinking aimed at trying to figure out what's actually going on. Because the scout, you know, is drawing a map of the terrain or of a situation, but the understanding is that this is just a provisional map. It's based on the current intelligence that I have. It's based on the investigations I've done so far. The expectation is that you're going to be revising that map as you learn more. As you go observe the terrain from a different vantage point, you might notice that what had seemed like a river is actually a dried-up riverbed, but you couldn't see that from your original vantage point, etc. You're drawing the map in pencil, not pen. So as you revise the map, it doesn't mean that you did anything wrong with your first draft of the map.
Sometimes being wrong means that you did something wrong. Like you were willfully in denial or super careless or something. But I think more often you were doing everything right, and that still means you're going to get a lot of things wrong. That's not your fault. So I think that's a really important mindset shift that makes it a lot easier to be a scout than people might think. It was a really astute point you made that there are these two different failure modes. One is people who never admit they were wrong, and they have super confident opinions that they will never change. And then the other one is people who are too afraid to ever form an opinion about anything for fear of being wrong. And those seem like very different modes, but they actually both, I think, stem from the same misconception about what being wrong means: that it means you screwed up somehow. And so some people will never admit it, and other people are afraid to even try, and both are bad. The better and more justified approach is to understand that you're inevitably going to be wrong a lot, and that's not a sign of failure.
Auren Hoffman 11:55
To me it seems like one of the ways to become more of a scout is to encourage other people to point out things where you were wrong, or where they at least think you might be a little bit wrong. But there's this trope of people that have strong opinions loosely held. And I can imagine that many people are going to be reluctant to challenge people who have “strong opinions”. So how does one put those opinions forward in a way where you're inviting some of these challenges?
Julia Galef 12:27
So I've used this phrase before, that it's good to have strong opinions loosely held. And I've since stopped using it, because I think there's something really good and useful there, but it's so easily misunderstood or misused that I've sort of stopped using the phrase. But I'll tell you what I think that phrase should mean, and what I meant by it when I advocated it. So rather than the failure mode that I mentioned a minute ago, of being afraid to ever form an opinion for fear of being wrong. Picture someone who always says, “Well, I'm not an expert on politics or COVID or whatever, so I really don't know. I have limited experience in business strategy, so I have no idea what we should do. No opinion, etc.” This is someone who is afraid of being wrong, and that's understandable. But what I would advocate instead is to form the best opinion you can with your limited information and time, and to be able to say things like, “Well, it seems to me, based on what I've read or heard so far, that masks are important, or masks aren't important, or COVID is going to be a big deal, or COVID isn't going to be a big deal. It seems to me that we should scale fast, or we shouldn't scale fast, or whatever.” Form an opinion, and then hold it lightly, in the sense of assuming that it will probably change as you learn new information. And that process, I think, is just better for your long-term epistemic health: to form opinions and then revise them instead of never forming them at all. In addition to being more useful for actually making decisions and taking action in the world instead of being paralyzed by uncertainty. So when you're communicating to other people, and this is, I think, what your question was getting at, I know some people who do this really well, who do the strong opinions loosely held thing.
They'll say, “My current impression is [strong opinion].” Then they'll emphasize, “But you know, that's based on limited data, and that could well change.”
Auren Hoffman 14:33
Or they put a percentage on it. Like I'm 62% confident in this thing or something.
Julia Galef 14:39
Yeah, yeah. I mean, I do think this is really good, both so that other people feel free to push back on you, as they should, and also because I think it's good for the epistemic health of the community, whatever that community is. Your scientific community or your social group or your team at your company. If you overstate the strength of your confidence in your opinions for whatever reason, like if you want to sound confident or whatnot, I think you're kind of inflating the plausibility of that hypothesis in other people's minds in a way that's not actually justified. And I think that's bad for group reasoning. So there are a lot of reasons to do that.
Auren Hoffman 15:20
Something I personally have a hard time with is just being wrong in the moment. I'm maybe a little bit better with thinking my past self was completely wrong. Is that a common thing that people have? Or how do people think about when am I going to disagree with myself, and how far in the past am I willing to go to disagree with myself?
Julia Galef 15:43
Yeah, that's such an interesting phenomenon. I agree with you. There are a bunch of different ways that I try to get people to be more open to the idea that they might be wrong in the moment. One of the ways is asking them whether they've been wrong in the past. Often that gets people to realize, “Oh, yeah, I have been wrong in the past. That should make it seem more likely that I am wrong about this in the present.” But it's an imperfect strategy. It still kind of intuitively feels like, “Yes, of course, I've been wrong in the past. But this current issue, I'm 100% positive I'm not wrong about.” I don't have a perfect solution to that. That intuitive feeling can be very strong. But if you acknowledge, at least intellectually, that there's some chance you're wrong, even though it really doesn't feel that way, I think that can at least motivate you to seek out potentially disconfirming information or talk to people who might have a different perspective, in a way that you wouldn't if you really were positive you were right, both emotionally and intellectually. So I sometimes talk about that in terms of separating the inside view and the outside view. My inside view is that I'm positive I'm right. My outside view is that, well, I might be wrong. And the outside view can motivate me to get more information instead of cementing my opinion as it currently is.
Auren Hoffman 17:06
There's a sense that some things you could have been right about before. Let's say how to recruit software engineers into your company. There was a specific good way of doing that in the past, and maybe in the present it no longer works as effectively as it did. Like, how do you get better at understanding, okay, this was true in the past, I felt very confident about it in the past, but now it's no longer true?
Julia Galef 17:31
Yeah, I think that's a common situation. In fact, I think that makes it easier to notice that you're wrong, because it's one of the many examples in which being wrong is not your fault. Because you were right before. It's just that the world's changing and you need to update your view. To me that feels much less like an ego threat than the version of wrong where you were wrong all along. I don't know how it is for you. Is that the case for you?
Auren Hoffman 18:00
Definitely, definitely. If I was wrong all along, I'd probably feel worse about it or something. For me, being wrong about a political belief is not that big a part of my identity. I don't usually write about political beliefs. So if I end up changing my mind about minimum wage or something, it doesn't really affect my identity. But then there are other beliefs I might write about or something where my identity might be a little bit more wrapped up in them. How do you figure out how to change those beliefs, or how to update those beliefs?
Julia Galef 18:34
Yeah, I appreciate that you made that distinction. Because I think even though a lot of people are aware of the idea of beliefs getting wrapped up in their identity, they're often only thinking of it in terms of political beliefs or religious beliefs. Those are indeed common examples of issues that become parts of our identity, in the sense that when someone disagrees with us, we feel personally insulted or outraged. Like the honor of our tribe has been called into question, or someone stomped on our flag or something like that. But politics and religion are just two of the most particularly prominent examples. Literally anything can become part of your identity in the same way. Like which programming language you think is the best. People can get into very identity-laden arguments about that. I think it's great that you have this awareness of which kinds of topics are sort of part of your identity. One of the things I was trying to do in the book was give some suggestions for clues that a particular topic might be more part of your identity than it is for other people. Like, do you notice yourself feeling compelled to jump in and defend some particular ideology when it's being criticized online, even though they're not talking directly to you? A sneaky one, I think, is negative identities: there are often groups or ideologies that we really dislike, so we're going to be especially motivated to believe anything that kind of undermines those groups or ideologies or makes them look bad. So if you really hate hippies, for example, then anything that seems to show that hippies are wrong about politics, or hippies are stupid or whatever, that's going to be a very appealing belief for you, right? But it's not like you yourself are part of some anti-hippie political party.
So there's no concrete, positive identity that you have that is going to be apparent to you, like, oh yeah, I'm a liberal, so I'd better watch out for how being a liberal might affect my beliefs. The effect of identity is operating through your dislike of a different identity, but it can still be just as warping. A different example might be if you really hate the tech world. This probably wouldn't apply to a lot of your audience. But if you really hate the tech world, you're going to be especially motivated to believe any news story about some tech company being corrupt, or about tech making the world worse, or things like that. Which is not to say that those things can't be true, but you're just going to have a special motivation to believe them whether or not they are true.
Auren Hoffman 21:19
Okay, so let's say your book got assigned to everyone in high school, and we're able to move the entire country or the entire world to be 30% more scout-like in the future. Are there any negative consequences you think could come from that?
Julia Galef 21:35
Yeah. Yeah, I can think of several potential downsides to that. One potential downside of becoming more scout-like is if you do it kind of incompletely or imperfectly, you could end up worse off. So, for example, suppose you're deluding yourself in two ways. First, you're deluding yourself into thinking that the company you're starting is guaranteed to succeed. Second, you're also deluding yourself into thinking that if you failed, that would be absolutely terrible and devastating. Now, suppose you read my book and you're like, “I need to be a better scout. I need to be more truth seeking and confront the things I'm wrong about.” And you do that with just the first belief. So you now recognize that, “Okay, actually I'm not guaranteed to succeed. That was a delusion.” But you still retain the second false belief that failure would be devastating. So now you're much less motivated to take the risk of starting your company because you think like, “Oh, I might actually fail, and that would be absolutely terrible.” So you don't even try.
Auren Hoffman 22:38
So that 30% might be unevenly distributed, which it almost certainly would be. So in some cases, you become way more scout-minded. Then that would lead to more inaction in society or something.
Julia Galef 22:50
Yeah. So I predict that if everyone was actually being a really good scout, that this wouldn't lead to more inaction in society. Because yes, people would recognize that things are risky, but they would also recognize what I think is true, which is that we tend to be more scared of risk than we need to be. Not in all cases, but in a lot of cases. That like the actual bad consequences of a lot of risks aren't as bad as they seem to us. I'm just saying it's totally possible that people might… If you're not doing it perfectly or you're doing it incompletely or whatever, you could end up worse off if you still have some false beliefs, but you've gotten rid of others.
Auren Hoffman 23:29
There's some meme that like founders are risk takers, but most founders I know hate risk.
Julia Galef 23:35
Oh, yeah? Like more than the average person you think?
Auren Hoffman 23:38
I think so. I think they feel like they take calculated (again, this might be my bias) asymmetric bets. How do you see this founder mindset playing into the scout mindset?
Julia Galef 23:51
I've talked to a lot of founders, partly just from having lived in the Bay Area for years, but also as part of the process of researching the book. I interviewed a lot of people about what it was like starting companies and how scout and soldier mindset played roles for them and so on. It does seem to me like there are two different approaches to the hard and risky endeavor of starting a company. One is the kind of archetype of the founder that the public might picture, which is someone who tries to banish the thought of risk from their mind and just cultivate a supreme certainty that they are going to succeed and their company is going to be the next Google or whatever. I mean, denial, essentially. Denial of risk is their coping strategy. Then the other type is maybe the one you're describing, where they're very aware of risk, but they use it carefully and strategically to take calculated risks. So they might come up with a plan for how they're going to reduce risk as they go on. Or they might reason, “Well, this particular strategy might only have a 30% chance of success, but I think we have the runway for five attempts. So five attempts, each with a 30% chance of success, is whatever—” I can't do that in my head, but you know what I mean. So they're acknowledging the existence of risk and just being strategic in how much they're taking on, and making sure that the risks are good risks, where the expected benefit of the risk outweighs the potential downside, and things like that. Personally, I think the latter is a healthier and more conducive-to-success way of thinking about risk than the former, where you're just kind of denying it and trying to act on sheer enthusiasm and will alone.
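[Editor's aside: the calculation Julia waves off, the chance that at least one of five independent attempts succeeds when each has a 30% chance, can be sketched in a few lines. The 30% and five-attempt figures are her hypothetical numbers, and this assumes the attempts are truly independent.]

```python
# Chance that at least one of n independent attempts succeeds,
# given each attempt has probability p of success.
# P(all fail) = (1 - p)^n, so P(at least one success) = 1 - (1 - p)^n.
p = 0.30   # per-attempt success probability (Julia's hypothetical)
n = 5      # number of attempts the runway allows

p_at_least_one_success = 1 - (1 - p) ** n
print(f"{p_at_least_one_success:.1%}")  # about 83.2%
```

So under those assumptions, five 30% shots give roughly an 83% overall chance that at least one succeeds.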
Auren Hoffman 25:53
You mentioned earlier that there are some people who may be scouts in one area of their life, let's say they're an investor or something, and not scouts in other areas of life. Is it more likely, if you're a scout in one area, that you're going to translate that to another area? Or do people compartmentalize the different parts of their life, and just because they use scout mindset as an investor or poker player or something, they don't necessarily use it in their relationships or politics?
Julia Galef 26:24
Both are relevant. On the one hand, there's this very domain-specific degree of scout mindset. And I think that it's a function of a number of things. Like, how much of your identity do you have invested in that particular domain? So maybe you have a lot of identity invested in politics and not as much in which stocks are going to go up and down. And also, I think it can be determined by things like the incentives that you face. I often find that people in quantitative finance are very scout-like, at least in that domain, because there are real stakes. You have a real incentive to try to see what's actually going on, and to change your mind when the situation changes, or to test your hypotheses about the market and update them if it turns out they were wrong. Because you get a lot of money if you're right, and you lose a lot of money if you're wrong.
Auren Hoffman 27:16
I assume it's the same for poker players as well.
Julia Galef 27:19
In fact, yeah, I think people in quantitative finance and poker players are overrepresented among the best scouts that I know. Which is not to say they're perfect or that everyone in those fields is really good, but on average, yes. So I think there are these domain-specific factors that can affect how likely you are to be a scout or a soldier in a particular domain. Then I think there's also this kind of general factor of scout mindset that's determined more by traits of the person than by the particular domain they're thinking in. The factors that play into that are things like intellectual curiosity. How much do you enjoy the process of figuring out what's actually true? Also a general kind of equanimity: are you easily thrown into anxiety and anger, or not? Then also just your values. The people I know who are really good at scout mindset across a wide variety of domains tend to really value it and tend to pride themselves on their intellectual honesty or objectivity or things like that. So even though they still face the temptations of soldier mindset, like the temptation to defend themselves publicly against criticism or to avoid thinking about inconvenient truths, they have this countervailing factor that helps them say no to those temptations, which is that they can feel proud of themselves when they resist those temptations and are able to change their mind or acknowledge a good point by their opponent or things like that. So those are some of the things that I think determine the general factor of scout-likeness.
Auren Hoffman 29:03
Really interesting. Now, I want to ask you a couple of questions on podcasting, since you're on a podcast. It gets a little meta here, but I love your podcast, Rationally Speaking.
Auren Hoffman 29:13
It's one of my favorite podcasts. I'm constantly recommending it to other people. One of the things I really like about your podcast is that you often present your guests with a steel man argument and then engage with them through it. Is that a specific tactic that you use? It seems like it would be a lot of work to do.
Julia Galef 29:40
I do prepare a lot, some might say overprepare, for my podcast conversations. A lot of that is just that I find it takes a lot of work to understand a field or an issue well enough to know what the good questions are to ask. Or to be able to recognize when something the guest says about the field is not actually an accurate representation of the field, so that I can push back on them, things like that. That actually takes a lot of preparation. It's not even enough to just read their book. You have to read different perspectives, or read reactions in the field to the book, or things like that. So, yeah, it takes a long time. I've been trying to find ways to reduce the amount of preparation time needed for a given episode. But no, it still takes me many hours.
Auren Hoffman 30:33
For this particular podcast you're on, I had already read your book before I invited you. I've been listening to your podcast for years. I've read a bunch of things you've written over the years. So the prep for me was not that hard, in some ways, because I've already done the 50 hours of prep over the last few years just by default. So then it was easy; I just invited you on. But I can imagine going the other way, where you invite someone you think might be interesting first, and then they say, “Oh, sure,” and then you have to do the 50 hours of work. Which way do you go?
Julia Galef 31:08
That is definitely a good solution: invite people to talk about things you're already pretty familiar with. I mean, this is one reason that on my podcast I'm more likely to invite people who do social science than, say, physics. It's not that I think social science is more important than physics, but I am already familiar with at least the methodology of social science. I'm familiar with how social scientists go about trying to answer questions. So if a social scientist claims something like, “Well, we ruled out these hypotheses, and so we can be confident that XYZ,” I'm going to be able to ask, “Well, how did you rule them out? Because I know it's generally quite hard to rule out alternate causal stories. So I want to push back and understand why you think they can be ruled out.” The methodology in physics is so much more outside of my wheelhouse that I wouldn't really know how to challenge my guests on their claims.
Auren Hoffman 32:03
So instead of 50 hours of prep, it could be 500 hours or something.
Julia Galef 32:07
Yeah. I mean, the two factors that make an episode especially time intensive for me, and make me more reluctant to take it on, are: A, it's in a field where I'm not super familiar with the methodology; and B, it's on a controversial subject where I know that if the guest says something that's wrong or oversimplified or exaggerated and I let it slide, then my audience will be mad at me. They'll be like, “How could you not challenge that?” And also, of course, I hope that my podcast increases the average level of truth in the world rather than decreasing it. So I don't want guests to be able to spread views that are wrong without me at least registering that I disagree. So yeah, those two factors won't necessarily make me turn down a potential guest, but they at least make it a bigger endeavor to do a podcast on that topic.
Auren Hoffman 33:03
If you were gonna give advice to a new podcaster, like myself at World of DaaS, is there some common piece of advice that you could only have learned after doing it for a few years?
Julia Galef 33:13
I guess one piece of advice that took me a little while to notice is that there's often a temptation to just invite guests who are already well known. You know, public intellectuals or authors or pundits. That has advantages and disadvantages. An advantage is that they're very practiced at giving articulate and interesting or entertaining answers. Another advantage is that it can increase the size of your platform, because they'll share your podcast with their many followers. So that's an advantage. A disadvantage that I hadn't really noticed at first is that they often aren't very good at something I highly value in a podcast guest, which is a willingness to actually think about the questions on the spot and respond to the actual question that was asked. Instead what they do, kind of understandably if you're giving a lot of interviews, is answer with the talking points they're already used to giving. Which may be very interesting and intelligent talking points, but what I want in a podcast is to have an actual conversation where they're listening to me and responding to my questions, and I'm listening to their responses and responding to them, and so on. So I often find that's harder with famous people.
Auren Hoffman 34:22
Also, from the audience's perspective, if they've heard that famous person a few times, you're right that if they're very practiced, you might not learn anything new, because they're probably saying things they've already said before.
Julia Galef 34:33
Yeah. Or they're optimizing for something else besides having an interesting conversation. Like they're optimizing for adding to their public image or brand, or maybe for being entertaining. I don't mean to criticize all famous public intellectuals. There are a lot of great ones out there, but it's just a pattern that I've noticed.
Auren Hoffman 34:52
Okay, so I want to get doubly meta now. If I'm trying to have a scout mindset about being the host of this particular podcast, what advice could you give me to become a better podcast host?
Julia Galef 35:08
Oh, gosh. Well, I mean, I've enjoyed the conversation a lot. Nothing jumped out at me like, “Oh, he shouldn't have done that.” I guess a thing that I wish podcast hosts would do in general is disagree more with their guests, in an amiable and charitable way of course. But as a listener, I often find that the most interesting parts of the conversation are where the host is talking about either a disagreement that he or she has with the guest and trying to understand why they disagree, or a disagreement the guest has with someone else. In my most recent podcast, I had the behavioral geneticist Kathryn Paige Harden. Towards the end of the episode, we talked about a disagreement between her and her PhD mentor, Eric Turkheimer, about whether or not it's meaningless to ask whether there might be genetic causes of differences between racial groups. So anyway, I feel like the most interesting issues are in these areas of non-overlap, where different experienced and smart intellectuals who have studied a topic come to different conclusions on it. Those are the areas that I want to dig into the most. So I've always wished podcast hosts would talk more about disagreement.
Auren Hoffman 36:30
One of the things I like about your podcast is that you're willing to go into these areas that might seem “controversial.” You might not even have an opinion about it, per se, but you want to learn a little bit more about it. You seem to navigate these fields really well, at least from my standpoint. In many cases, it seems like you're willing to give the benefit of the doubt to the person and walk through it. Personally, I don't know that I would want to delve into these more controversial topics. How do you see that as part of your duty as a podcast host?
Julia Galef 37:06
So I'm glad you think I do a good job of it. I've taken tentative steps into talking about controversial things because I know it's hard. I guess one thing that I do that I think helps is to talk about controversial issues with a couple layers of remove added. So instead of trying to share my own opinions about, I don't know, IQ or something, I'll instead ask a guest about the causes of her disagreement with another person about IQ. So there are a few layers of remove added there. I still think it's an important and interesting angle on the subject, more interesting than my own personal opinions would be, actually. People are much less likely to get angry if you're analyzing something at a meta level, as opposed to the object level. That's one thing. It also helps to just use a lot of nerdy language, like “meta” and “priors” and stuff like that. That makes people's eyes glaze over, and so then they don't get mad at you.
Auren Hoffman 38:15
Got it. I guess you could do it like three fourths of the way into the podcast. So anyone still listening there is gonna be a real true person.
Julia Galef 38:22
Right? I mean, probably even more important than the two things I mentioned is to actually have a track record of, you know, not being a soldier. There are some people who say that they just want to ask questions and get to the truth. But in practice, if you look at the guests they choose and the conversations they have, they have a couple of hobbyhorses that they just really want to harp on. They'll invite guests to challenge the consensus on those hobbyhorses, but they're not that interested in other topics or other sides. And so I think people are more likely to trust that you are genuinely asking questions and trying to figure things out if you have a track record of doing that.
Auren Hoffman 39:04
Alright, a couple personal questions before we let you go. So you’ve been very involved in the “rationality movement”. How do you see that evolving over time?
Julia Galef 39:14
Yeah, so just a bit of context. The rationalist… I guess “community” is a little more accurate than “movement.” Maybe I've said movement in the past, I don't know. It kind of started online in maybe the late 2000s, like 2006 or 2007.
Auren Hoffman 39:33
LessWrong blog and some of these other things.
Julia Galef 39:35
Yeah. So it grew up around two blogs, basically: Overcoming Bias, and then LessWrong, which spun off of Overcoming Bias. And it was basically just a group of people, not officially affiliated or anything, no official name, but a group of people who liked having discussions on these blogs about rationality, which has a specific meaning in this context that's different from the colloquial meaning, so it's worth highlighting. The specific meaning is the term as it's used by academics. So epistemic rationality is the art of forming beliefs that are as accurate as possible, essentially what I'm writing about in The Scout Mindset. And then instrumental rationality is about making decisions that more effectively achieve your goals, whatever those goals are. So the rationalist community was just a bunch of people who were really interested in talking about epistemic and instrumental rationality, both on an abstract philosophical level, like how do you define truth or accurate beliefs, and also on a practical level, like talking about particular scientific issues and the best way to think about them, or troubleshooting common biases that make it hard to achieve their goals, things like that. So it doesn't mean a bunch of people who think they're rational and other people aren't. But I can see why people would assume that from the name, so I forever find myself having to try to correct that misunderstanding.
Auren Hoffman 41:03
If I thought of, like, the crypto community and asked, okay, who's your deity or your god? Maybe they would point to Satoshi. In the rationalist community, is it Bayes? Is that the person everyone turns to?
Julia Galef 41:16
People will sometimes joke about the Reverend Thomas Bayes. I mean, yeah, he was a reverend hundreds of years ago who formulated this very simple theorem in probability, Bayes' Theorem or Bayes' Rule. It's a very basic probabilistic theorem. If you're trying to be Bayesian, then you try to keep that rule in mind as a guideline for how you should be updating your beliefs as you learn new information. That is definitely one of the core principles underlying the discussions, especially of epistemic rationality, like how to form accurate beliefs about things. Of course, there's plenty of room for disagreement about specifically how a Bayesian should update in light of this new information or that new information. So it's not a perfect template for what you should believe, but it is a guideline for the structure of how an ideal reasoner would think. Then you can use that to try to evaluate your own thinking.
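For readers who haven't seen it, Bayes' Rule can be sketched in a few lines of Python. The numbers below are made up purely for illustration; they aren't from the conversation.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|not H)P(not H)]."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

# A hypothesis you gave 1% credence to, and evidence that's nine times
# more likely if the hypothesis is true than if it's false:
posterior = bayes_update(prior=0.01, p_e_given_h=0.90, p_e_given_not_h=0.10)
# The belief moves from 1% to roughly 8.3%, not to certainty.
```

The point of the exercise is the structure: evidence shifts your credence in proportion to how much better one hypothesis predicts it than the alternatives, which is the "guideline for how an ideal reasoner would think" Julia describes.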
Auren Hoffman 42:20
Okay, this is really good. The last question we ask all of our guests is what is the conventional wisdom or advice that maybe is generally bad advice?
Julia Galef 42:29
I mean, one is something that we kind of talked about earlier in the conversation, which is advice often given to founders: that in order to motivate yourself to do this hard thing, you should banish all thought of failure from your mind and just try to believe 100% in your success. I think it can motivate people, it just also comes with this big downside that you're crippling your ability to think clearly about the pros and cons of different options, weigh risks against benefits and costs, and just choose the path with the best expected value. It's really hard to do that if you're not letting yourself think about risk at all. And something that I think people often don't realize is that some of the most successful founders in history completely flouted that common wisdom. For example, one person I talk about in my book is Jeff Bezos, who early on, when he was just deciding to leave his job on Wall Street to start the company that would become Amazon, explicitly thought about the risk to himself and tried to estimate the chance that his new company would fail. His best guess was that there was about a 70% chance his company would fail. But he was fine with that risk, because he was like, “Well, I'd much rather take the risk and fail, and then be proud of myself that I tried this hard thing, than never take the risk. So I feel good about that.” But he was explicitly recognizing that what he was doing was risky. He's kind of an existence proof that you can acknowledge to yourself that what you're doing will most likely fail and still be really motivated and hardworking, trying to make it succeed in spite of the odds. And there are a bunch of slightly less high-profile examples of successful founders who were honest with themselves about the risks they were facing and didn't let that get in the way of their determination to try to succeed as best they could.
Auren Hoffman 44:30
Right. And even in Jeff Bezos' case, the risk was pretty muted, right? So let's say he failed after a few years. He would lose that really nice salary he had at D.E. Shaw for those few years, but he would learn something really interesting, and he almost certainly could have gotten his job back. So his risk was actually pretty small. The risk of not doing it was in some ways larger, because he would have always been asking himself, what if I had done this, or I had this dream, etc.?
Julia Galef 45:01
That's right. Yeah, I think people often think about risks just in terms of these very specific, measurable things. Like, I'll lose this amount of money that I invested, or I'll gain this amount of money if my company succeeds. But there are all these messier costs and benefits that can be just as important as the measurable monetary ones. Like the experience that you'll gain from doing the thing even if it fails, or the connections that you'll make from doing the thing even if it fails. Or the prestige, or the stigma, of being a founder, depending on how you look at it. Or just the consumption-good angle on it all: would you actually enjoy it, or would it be miserable compared to working a regular job? All of these factor in, and admittedly that makes it very hard to calculate the expected value of starting a startup versus working on Wall Street. There's no real objective way to do that. But you can do some rough, back-of-the-envelope, off-the-cuff estimates of whether the expected value is more positive for starting the startup or staying in your current job. And I think trying to make those rough best guesses is much better than not trying to make them at all.
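As a sketch of the kind of back-of-the-envelope comparison Julia describes, here's how one might tally it up in Python. Every probability and payoff below is invented for illustration, including the rough dollar equivalents assigned to fuzzy things like experience gained; nothing here comes from Bezos's actual numbers beyond his own stated 70% failure estimate.

```python
def expected_value(outcomes):
    """Expected value over (probability, payoff) pairs; probabilities must sum to 1."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * payoff for p, payoff in outcomes)

# Invented numbers. Failure still carries some value (experience, connections),
# folded into the payoff as a rough dollar equivalent.
startup_ev = expected_value([
    (0.70, -150_000),   # fails: forgone salary, partly offset by what you learn
    (0.30, 5_000_000),  # succeeds: equity payoff
])
stay_ev = expected_value([(1.00, 400_000)])  # keep the Wall Street job
# Under these made-up assumptions, startup_ev comes out around 1.4M vs. 400k.
```

The numbers are guesses, but writing them down makes the comparison explicit: you can see exactly which assumption (the success probability, the payoff, the value of failing) would have to change to flip the answer, instead of leaving it all to gut feel.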
Auren Hoffman 46:19
It's one thing maybe to do that when starting a business or investing in a particular thing. How do you think people should apply this in their personal life? Like, should I get married? Should I have kids? These are the kinds of questions that maybe your gut is pretty good for. How do you use one versus the other?
Julia Galef 46:37
There's no right formula for making these decisions. But the kinds of thought experiments I was alluding to earlier… Well, I guess the one I gave was a political thought experiment, but thought experiments also work for personal decisions. If I'm telling myself this person is going to change, and even though we have these unworkable problems now, I'm going to take the plunge because I'm sure I can change him, a thought experiment you can do is to imagine that a friend of yours was in the same situation, with the same boyfriend, with the same problems, and this was her plan. How would you feel about that plan? Would you feel optimistic or not? Often, by removing yourself from the situation and trying to look at it as if it were someone else, you can end up having a very different reaction than you had when it was you in the situation.
Auren Hoffman 47:32
Someone once told me they try to think of themselves as, like, their cousin. Someone they care a lot about, but who isn't even their brother or sister.
Julia Galef 47:42
That's nice, actually. I was just musing about this on Twitter recently. A common version of this advice I hear is: imagine your friend came to you with this problem, what would you tell them? I think that's close to the advice I would give, but maybe not ideal. Because, yes, we often have a bias in favor of putting on rose-colored glasses in our own life, or sometimes in favor of beating ourselves up over things we wouldn't beat other people up over. But if you're talking to a friend, you often have a different bias that can also be distorting, where you don't want to tell your friend anything harsh or negative and you just want to be encouraging. So I don't necessarily think that thought experiment is great at giving you your best honest, objective picture of the situation. That's why I like the cousin example: it's someone you're close enough to that you actually care about what happens to them, but maybe far enough removed that you're able to think more objectively about their situation and their prospects. So yeah, there may be some kind of happy-medium person that you should be imagining in that thought experiment.
Auren Hoffman 48:53
This has been awesome, Julia. Thank you so much. You mentioned Twitter. I love following you on Twitter. Can you tell folks where to follow you on Twitter or anywhere else?
Julia Galef 49:00
Oh, yeah. I'm just Julia Galef on Twitter. It's Julia. My last name is G-A-L-E-F. And yeah, please come join my musings on Twitter about this and other subjects. My book is The Scout Mindset. My podcast is Rationally Speaking. That's at rationallyspeakingpodcast.org. Then my personal website is just juliagalef.com.
Auren Hoffman 49:22
Perfect. All right. Well, thank you very much. This has been awesome.
Julia Galef 49:24
Oh, my pleasure. So great talking with you, Auren.
Auren Hoffman 49:28
Thanks for listening. If you enjoyed the show, consider rating this podcast and leaving a review. For more World of DaaS, you can subscribe on Spotify or Apple Podcasts or anywhere you get your podcasts, and also check out YouTube for videos. You can find me on Twitter @auren, and we'd love to hear from you. World of DaaS is brought to you by SafeGraph.
Auren and Julia explore building a scout mindset as defined in Julia’s new book, why embracing being wrong is important and tactical approaches to shifting your mindset. They also cover how entrepreneurs approach risk and how the scout mindset manifests in unique ways across different professions.