Hacking Heuristics

Blair leads a discussion on how clients tend to take mental shortcuts in making business decisions, and how we can nudge clients without manipulating them to make a decision that is in their best interest.



Rory Sutherland

Influence: The Psychology of Persuasion and Pre-Suasion: A Revolutionary Way to Influence and Persuade by Robert Cialdini

Pricing Creativity: A Guide to Profit Beyond the Billable Hour by Blair Enns

Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard Thaler and Cass Sunstein

"The Dark Arts of Leveraging Cognitive Biases" by Blair Enns

Thinking, Fast and Slow by Daniel Kahneman

"Pricing Creativity" 2Bobs episode

Predictably Irrational by Dan Ariely

Richard Feynman

Dunning-Kruger effect



DAVID C. BAKER: Blair, are you ready to talk about all of your biases? 

BLAIR ENNS: Oh, yeah I actually love talking about my biases. I don't think we put our biases ... I'm going to say it the proper way, biases. I don't think we put our biases on the table often enough. Yeah, so I'm really excited about this. 

DAVID: You're a very biased person I've noticed and so we're not going to get to my biases today. 

BLAIR: Yeah, okay, I'll lie on the couch today. 

DAVID: Okay. Yeah, this is really an intervention. A lot of the people close to you in your life have asked me to do a podcast where I interview you on your biases, so just be prepared. 

BLAIR: That explains the crowd outside the door.

DAVID: Right. 

BLAIR: So you thought about this topic and, let me get inside your head, what was it that made you think this would be interesting to ...

DAVID: What a loaded way to ask that question. 

BLAIR: What possibly possessed you ... No, what made you think that this was really interesting because as I was starting to read more about it, because admittedly you know so much more about this than I do, I was really interested in it too. But talk to me about how our listeners are going to be interested in this and why.

DAVID: The subject is really cognitive biases and other forms of biases. We're going to put them all under that banner of cognitive biases. Some of them are actually social biases. Behavioral economics is the new pop psychology. I don't mean to kind of demean the ... I won't call it a science, although it's a social science. Let's call it a social science. I don't mean to demean it by referencing it as pop psychology. There are just so many books written on it. To me advertising as a profession, and I spent a lot of years working in it, just became so uninteresting for so long until people started applying some principles of behavioral economics to advertising. That's when it became interesting again. 

DAVID: We've got people like Rory Sutherland, who is the vice chair of Ogilvy UK. He's probably at the forefront of bringing behavioral economics to the world of advertising. Now you see it everywhere. Robert Cialdini, the author of Influence: The Psychology of Persuasion and a few other books. Pre-Suasion is his latest, I haven't read that one yet. So there are all kinds of people out there bringing behavioral economics to advertising and the other creative professions. You just see it everywhere in business. 

DAVID: In your last book, Pricing Creativity: A Guide to Profit Beyond the Billable Hour, you had a chapter on essentially leveraging cognitive biases. Let's put the ten-dollar word on the table, shall we? 

BLAIR: Heuristics?

DAVID: Yes. 

BLAIR: Is it heuristic or is it "yeuristic?"

DAVID: I don't know.

BLAIR: I usually split the difference on the H and I try to do a half of an H. Heuristic. 

DAVID: Pretend to cough while you're saying it, in case somebody thinks you're not saying it right. Right, so heuristics, okay. 

BLAIR: A heuristic is essentially a mental shortcut. It's really a practical approach to solving a problem that employs mental shortcuts. A rule of thumb, an educated guess, common sense: when people reference those things, they're really talking about heuristics. Heuristics are things that essentially allow us to reduce the cognitive load in decision making. We don't have to employ all of our brain power to answer some questions, because some questions are easily answered through these kinds of mental shortcuts. 

BLAIR: The idea of leveraging cognitive biases in business decision making is often known as choice architecture: how we architect the choices that people end up making. It sounds a little nefarious, doesn't it? The idea of choice architecture. 

DAVID: It does a little bit. Sounds a little manipulative, but we're going to go through a whole bunch of these, and as you talk about how you might recognize the shortcuts that prospects are making, you're going to suggest some ways that you can gently nudge while stopping short of manipulation. The other interesting thing as we talk through these is to start to recognize how we are taking mental shortcuts ourselves, sometimes inappropriately. It's going to be a little bit of self-examination as well as we go through these. 

BLAIR: Yeah, and you used a really important word, a key word: nudge. I think it was Richard Thaler who coined the term. I'm holding his book right here, Nudge: Improving Decisions About Health, Wealth, and Happiness, by Richard Thaler and Cass Sunstein. Cass Sunstein is a law professor at Harvard, or was up until recently, and Thaler is known as one of the fathers of behavioral economics. So the ideas of choice architecture and nudging go hand in hand. If you look at it magnanimously, it's how do you nudge somebody to make a decision that's in their best interest, and if you look at it nefariously, it's how do you nudge them to make a decision that's in your best interest. 

BLAIR: When I first wrote about this a couple of years ago I wrote an article, you can find it on winwithoutpitching.com. It's called "The Dark Arts of Leveraging Cognitive Biases." I really like that title, The Dark Arts, but I got a bunch of blowback, hate mail and accusations of condoning the manipulation of others. When we get into some of these specific biases and how to leverage them, you'll see that we're never really without bias. We're already subject to our own biases. We're subject to the biases of others, and we intuitively sometimes tap into those biases when dealing with others. 

BLAIR: I want to give you an example of a heuristic approach to problem solving. I'm going to give you a puzzle and I want you to give me the answer. A baseball bat and a ball cost a total of a dollar and ten cents, and the bat costs a dollar more than the ball. How much does the ball cost? 

DAVID: Well immediately I'm saying 10 cents, right, but I think you probably just tried to embarrass me in front of tens of thousands of people but yes I'll play along. Yes, 10 cents. 

BLAIR: Do you want to think about it more? 


BLAIR: That's the first answer. We just make this approximation. Daniel Kahneman's book Thinking, Fast and Slow talks about these two systems of thinking, system one and system two. System one is where we use these mental shortcuts, rules of thumb, et cetera to come to these decisions. The answer is the ball costs five cents and the bat costs a dollar five, but almost nobody answers that. Their first reaction is, well, okay, we're talking about a dollar difference here, a dollar ten total, so ten cents and a dollar: the ball must cost ten cents. But then you think about it and realize that's a 90 cent difference, not a one dollar difference. I want the math that gives me the bat costing a dollar more than the ball. 

BLAIR: It's actually incredible how much time it took me to do the math to get the right answer. 

DAVID: I was doing the math as you were talking thinking okay I know 10 cents ...

BLAIR: I know, I could smell it. 

DAVID: ... and I knew that wasn't the answer but I'm trying to think, "Wait wait wait, what is it," and I couldn't come up with it. It's sad isn't it? 

BLAIR: Well, in everyday life that answer is good enough, and I think that's one of the reasons why heuristics are important. They're not to be dismissed. They're basically shortcuts. Now, the key in leveraging cognitive biases is to think of it as almost a neural hack. You're hacking. You know others are taking shortcuts, so how do you hack those shortcuts? And at the same time we're thinking about ourselves: you know that you're prone to these mental shortcuts and sometimes you're going to make mistakes based on them, so how do we rethink our own decision making? 

BLAIR: We're gonna talk about how these work ... We've got a list here. You're going to love a bunch of these. I'll talk to them as best I can and give business examples, though in some cases the more interesting examples are the social ones. We'll try to bring it back to business, but we'll end up talking about some larger social issues as well, primarily because I really have the urge to. 

DAVID: What you just said, I thought, brought this home to me. We know that the people we're talking to in a new business situation are taking mental shortcuts, so how do we nudge them without manipulating them? That makes perfect sense. Are we ready to dive in here? 

BLAIR: Yeah, let's do it. 

DAVID: So we have about ten of them. We'll see how many we get to. The first is anchoring. Talk to me about the anchoring bias. 

BLAIR: Anchoring effect, anchoring bias, also known as anchoring and adjusting. I think we've talked about this before at least once, in the podcast we did on the book itself. Anchor high is one of the rules, one of the six rules in Pricing Creativity. Anchoring is essentially the tendency for the first piece of information on a topic to skew the final outcome or the final decision that somebody makes.

BLAIR: You and I did a seminar together, a two day seminar. I think it was two days. In Atlanta, Georgia. It was about five years ago. I think it was somewhere around 2015. 

DAVID: Yeah. Oh I remember this like it was yesterday. This was your idea, "Lets do this." And I'm thinking, "What? What?" It was so fascinating. Remind people what we did here. 

BLAIR: We had 70 people attend from fairly far afield geographically. We had people coming from around the world, primarily North America. In advance of the seminar I sent everyone a question via email. The question was: thinking of all of the firms who will be in the room ... Nobody knew who else was going to be in the room. Thinking of all the other firms who will be in the room, what's your estimate of the average blended hourly rate? The control group answer ... which implies there were two groups. Everybody got this control question. The control group answer was 171 dollars. So on average the people who just got that question said, "Oh, I think the average blended hourly rate of the firms in the room will be 171 dollars an hour."

BLAIR: Half of those people also got an anchor question beforehand. The anchor question was this: "Thinking of the firms that will be in the room, do you think the average blended hourly rate will be higher or lower than 500 dollars an hour?" You get this question first, and the anchor group's average answer was 237 dollars instead of 171. It was 38 percent higher. Why is the answer of the anchor group, the group that has been anchored with that question about higher or lower than 500 dollars, higher on average? 

BLAIR: It's because when I asked the question they started to think, that's a really high number. Everybody answered that it's going to be lower, but by asking the question they start to think of the more expensive, more sophisticated firms that they know. I'm planting the seed, essentially, and their answer skews higher. 
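The "38 percent higher" figure follows directly from the two averages; a quick sanity check (our illustration, not from the episode):

```python
# Average blended-rate estimates from the seminar experiment
control = 171   # dollars/hour, control group
anchored = 237  # dollars/hour, group that saw the $500 anchor question first

lift = (anchored - control) / control
print(f"anchored answers were {lift:.1%} higher")  # about 38.6%
```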

DAVID: Right.

BLAIR: So it's known as anchoring and adjusting. We jump to a conclusion using what Kahneman calls system one, this cognitively efficient way of thinking, and then we employ the more cognitively expensive, rational system two, the one that takes more brain power, to reason away or adjust away from that anchor. And what all the science shows is that there's a range of acceptable or rational responses, but we stop adjusting once we get to the outer edge of what we conceive to be a rational response. 

DAVID: Right. You and I have a fair bit of overlap in our client base, so I'm constantly running into people who know you and have worked with you or want to work with you. And invariably, what they say after I mention your name is, "Oh yeah, I have learned so much by using this anchoring effect." And then they go on and talk about how you apply it specifically to marketing firms, firms under the marketing umbrella, entrepreneurial creatives, and the impact on pricing. It has had a staggering effect out there. So it's really interesting to me how the anchoring is basically anchoring your work out there. And then we shared this result with people just so they knew that this really happened. It really happened to them; it doesn't just happen to other people. 

BLAIR: People keep doing studies on this, but nobody has identified a reliable way to undo the anchoring effect. So even when you know somebody's anchoring high against you, you can't really undo its effect on you. 

BLAIR: And creative people anchor high all the time. If you go into your client's office to present creative concepts or work of any kind and you've got the one idea that you really want to sell but you're worried it's a little bit too out there for the client, what you do is you create this even more radical idea and you put that one forward first. Because the job of the anchor is to make the other options look less ... 

DAVID: Unreasonable. 

BLAIR: Yeah, yeah, look more reasonable or more desirable. So creative people anchor all the time. In pricing, when you're anchoring high, the first number people hear should be a high one, and then they make decisions from there. There's all kinds of other science behind this. I'm adapting something from Dan Ariely's book, Predictably Irrational. He says the price becomes an anchor when the client contemplates buying a service at that particular price. So you just have to get them trying on paying multiples of what their budget was first, and then you can lead them to another option that's maybe only one or two multiples of it. 

BLAIR: And I'll tell you ... I know we're going on and on about anchoring. We could do a whole show on anchoring. But a friend of mine, and somebody you know as well, Cal Harrison (you published his book, The Consultant With Pink Hair), wrote an article about a consulting engagement he sold at 300,000 dollars. The client's budget was 30,000 dollars. So he 10x'ed it. He presented three options and got them to choose one, to move from 30,000 to 300,000. And the way he did that ... you might think the 300,000 option was the anchor. No, he led with the anchor option, and the anchor option was priced at 30 million dollars. 

DAVID: Right. 

BLAIR: So the client says, "I have a budget of 30,000 dollars." And you go in and say, "I have three different ways we can work with you. The first one I'm going to present to you is 30 million dollars." Then the client falls out of their chair and they can't even hear what you say next. All they can hear is their heart in their ears. 

BLAIR: But then they actually contemplate it. They try it on, it becomes anchored, it becomes coherent in their mind, and then you move to the cheap one: here's what we can do for 30,000. It's not nearly as much, it's not nearly as elaborate. Now, if you're interested in the one in the middle, it's only 300,000. 

DAVID: This is a little bit dangerous, okay; where I'm headed now is thin ice. You can hear the crackling underneath all of this. But the opposite of anchoring is when a firm goes in with a real minimum price and then socks all kinds of change orders into the project. Rather than aiming high and then pleasing the client at the end by coming in at or slightly below, and I know that's really an abuse of the anchoring concept, I feel like the opposite of anchoring in some sense is trying to get in there low and then charge more, as opposed to handling all of this a little more psychologically and transparently at the beginning. 

BLAIR: Yeah, I would say you're right, it is slightly thin ice, but I understand what you're saying and I think the principle applies. You're saying, "Don't go in low and change-order people to death." Start really high. The job of the anchor price, if you're presenting options, is not to sell that anchor option; it's to make the other ones seem more desirable. Which is a natural segue into the next bias on our list. 

DAVID: Yeah, the decoy effect. And before we get into the second one, some of you are listening to this while you're driving and you're madly trying to write down all the names of the books and the authors and so on. We'll put all those in the show notes so that you can just enjoy and not worry too much about pulling over or running into somebody. And I think this may end up being a two part podcast where we get through a bunch of them and we pick up the rest later if we need to. 

DAVID: So the second one is the decoy effect. Let's talk about the decoy effect. 

BLAIR: The decoy effect. Some people use it interchangeably with anchoring. It's not the same idea, but there are some similarities. It's essentially a change in preference between two options when a third, almost irrelevant decoy option is presented. The technical name for what's going on is the asymmetric dominance effect: the decoy is dominated by one of the other options but not the other. The most often cited story of the decoy effect also comes from Dan Ariely, and it's quoted in many, many different places. 

BLAIR: So Ariely is reading something and he sees an ad for subscriptions to The Economist. And he thinks, "This is really interesting. There are three options here. I can buy the online subscription for 59 dollars, I can buy the print subscription for 125 dollars, or I can buy the online and print combined for 125 dollars." 

BLAIR: The decoy in there is the print option: why would anybody choose print only, when for the same price, option three, you get the print and online versions together? So the print option is there only as a decoy. Nobody chooses it. 

BLAIR: So he decided to do some studies around this in his MIT MBA class and what he found is when presented with the decoy, those three options, an online subscription for 59 dollars, print for 125, or print and online 125, as you would guess, nobody chose print only, 84 percent chose the print and web version and 16 percent chose the online version. 

BLAIR: Now as soon as you take the decoy out, the results change radically. 68 percent chose the online version. So where it used to be 16 percent choosing the cheap price, now over two thirds choose the cheap price: it goes from 16 percent to 68 percent. And the print and web version goes from 84 percent down to 32 percent. So by removing that decoy that nobody chose, total sales drop by 44 percent. 
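Ariely's percentages translate into revenue. A sketch, assuming 100 hypothetical buyers per condition (the per-100 framing and the revenue comparison are our illustration, not from the episode): with the decoy, revenue works out roughly 43 percent higher than without it, which is approximately the lift Blair describes.

```python
prices = {"online": 59, "print": 125, "print_and_online": 125}

# Choice shares per 100 hypothetical buyers, using Ariely's percentages
with_decoy = {"online": 16, "print": 0, "print_and_online": 84}
without_decoy = {"online": 68, "print_and_online": 32}

def revenue(shares):
    """Total revenue for a set of choice counts."""
    return sum(prices[option] * count for option, count in shares.items())

r_with = revenue(with_decoy)        # 16*59 + 84*125 = 11444
r_without = revenue(without_decoy)  # 68*59 + 32*125 = 8012

print(r_with, r_without)
print(f"decoy lifts revenue by {r_with / r_without - 1:.0%}")  # about 43%
```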

DAVID: Wouldn't you think it was just a typo if you saw it though?

BLAIR: It's a pretty elaborate typo. There's three different options laid out in the ad. Now this has got a lot of traction. It's been quoted many times and there's been other studies done in support of it. So the idea is if I give you two options and let's say you are inclined to choose option A, I can sometimes get you to choose option B by introducing option C knowing that there's no way you're going to take option C. 

DAVID: Yeah, not that the option is a typo, but the price would seem like a typo. 

BLAIR: Oh, oh, gotcha. Yeah, yeah.

DAVID: Yeah, I mean because it just seems so illogical. So how would this ... In an application to a firm that's listening to this, how would this differ from the anchoring and three options and so on?

BLAIR: Yeah, and we'll tie it to anchoring. We'll tie it into the next one, this idea of extremeness aversion or extremeness bias. Let me just speak to that and then we'll come back to this. Extremeness aversion is the idea that people tend to avoid the extremes in decision making and retreat to the safe middle. 

BLAIR: So when you're putting forward three options to the client, know that their tendency is to move to the middle option. So you want to architect your choices with the assumption that people will choose the middle one. You want the middle one to be a high margin option to you and you treat the anchor essentially as a decoy. 

BLAIR: So when you have three prices, the highest price is always a decoy, but it's only an anchor when you lead with it, because anchoring is about the first piece of information. So you see how these two concepts go together. If you put forward three options and you start with the least expensive, then go to the middle, then to the highest one, you are employing the decoy effect and taking advantage of extremeness aversion, that likelihood that somebody will go to the middle. 

BLAIR: But when you start with the high option, the anchor, now you're also employing the anchoring effect and you're increasing the likelihood that people will choose the middle option. And of course, from time to time, they'll choose the expensive anchor option and you celebrate like crazy. 

DAVID: So if you are in a conversation with a prospect, are there some clues that might indicate which of these three or combination of three you might employ to move the discussion, to nudge the discussion in the right direction? Is there something that you're listening for that would say, "Oh, we need to do maybe the decoy effect. Or no, no, we need to start with the really high one." 

BLAIR: No, I think you're employing the decoy effect and the anchoring effect simultaneously, and also leveraging extremeness aversion. So you are anchoring high. You're beginning with a high number and you're putting forward a high number. It's not a full-on decoy, in that it wouldn't be an irrational decision if somebody chose it. 

DAVID: Right. 

BLAIR: Right? So a pure decoy is just like I'm gonna offer something that I know they're not going to take, but it's going to skew the decision ultimately that they make. But I just know that they're not going to take that. So in some ways, it's a decoy, but you want to think of it primarily as an anchor. But you're effectively using the two principles simultaneously. 

DAVID: And I think part of what keeps us on the right track ethically is to design choices that actually somebody, somewhere, some day might take. And if they do, they're going to get tremendous value from it. It means that they want a very, very thorough approach and money is not much of an object and they really want to do this. Very few people will choose it. So that seems like there's a little bit of an ethical chalk line underneath that. 

BLAIR: You're trying to limit the hate mail. And I think when we talk about these principles in isolation from other aspects of value-based pricing, I can see how people can get their hackles up. But when you package it up with things we've talked about previously on this podcast, the idea that you should be intently focused on creating value for the customer and that the options you're putting forward are three legitimate ways you can create value, then the subterfuge loses what I think is the gloss of the dark arts, right? But yeah, again, in isolation, when we're talking about these things, it does sound like pure manipulation. 

DAVID: Right, okay. 

BLAIR: And I enjoy that. 

DAVID: Can we move on from these first three or is there anything that you wanna say to tie them together? Are we good?

BLAIR: No, we are absolutely not gonna cover all of these in this one so let's keep going and you decide where we draw the line. 

DAVID: Okay. So let's take a quick break and then we'll come back and we're going to talk about survivorship bias. 


DAVID: Let's talk about the fourth one, survivorship bias.

BLAIR: Yeah, survivorship bias is really kind of related to sampling bias or the law of small numbers. It's essentially where we exclude failures from the sample when we're making a decision. 

BLAIR: The best example I can think of in the business world of our audience is when you're advising a small design firm, or a designer, to specialize, and they push back and say, "Well, I was at a talk recently and I saw X, this rockstar designer, this famous generalist, say, 'Do not specialize, you need to build your practice so that you are free to design whatever it is you want to design.'" That person on the stage, the rockstar designer, is subject to survivorship bias. He's essentially saying, "Look at me, I am proof that you can do this." And I say no, he's the exception that proves the rule. 

BLAIR: Survivorship bias means he's essentially neglecting every other designer on the planet who has attempted to do what he does and has failed, the fact that only a small number of people are able to do this. So when someone says, "Look at me, I did this," or "Look at Apple," well, Apple is the exception, right? If you just look at Apple and you don't look at every other company that tried to do what Apple does, or everybody else who attempted to do what your example did, if you do not include the failures, then it's not statistically relevant. 

DAVID: Right, so would this also be an example? The person's in the audience and somebody says, "You should specialize," and they raise their hand and say, "Yeah, but I know this firm that did, and then the bottom fell out of the market, so that's why I don't want to do it." Is that another example on the other end of the extreme? 

BLAIR: Yeah, that's a great example. That's the law of small numbers. It's just extrapolating these big kind of decisions or what you would see as facts or certainty from a very small sample size. And there's also the recency bias, which is the most recent thing that happens to you tends to kind of skew your thinking around what the averages are and what the probabilities are. 

DAVID: So how would this apply to a sales setting? How would survivorship bias come up?

BLAIR: Again, not so much a sales setting, but a positioning setting. 

DAVID: Oh, gotcha. 

BLAIR: Yeah, where you're looking and thinking well so and so is doing it or so and so is saying, "Well, I did it." Well, yeah, you're a sample of one. 

DAVID: Yeah, right. Or there might be an objection from the prospect that throws something back in your face, and you might have to just chuckle, say something somewhat interesting, and explain: well, that's sort of an exception; here's really the rule.

DAVID: Yeah, okay. So that's survivorship bias. What about social influence? That's an interesting title.

BLAIR: Social influence? 

DAVID: I need to learn about social influence, yeah. 

BLAIR: Why is it interesting?

DAVID: 'Cause I don't have any.

BLAIR: Because you're antisocial? That's known as the David Baker effect, which is the opposite of the bandwagon effect. The bandwagon effect is an example of social influence; it's like, well, if Billy jumped off a bridge, would you do it too? 

BLAIR: Yeah, my mom used to say stuff like that to me all the time when I was a kid, because I was so easily influenced by my peers, so she would always accuse me of it. And that's essentially what social influence is: the effect of others on our emotions, our beliefs, and our opinions. 

BLAIR: So the bandwagon effect is an example. Think of conformity, socialization, peer pressure, obedience, leadership, persuasion. Those are all functions of social influence, how others have an effect on you. And the way you can leverage this in the sale is, when you're putting forward options, you can point to the middle one and say most of our clients hire us at this level. Or: almost all of our clients engage us to do the execution as well as the strategy and ideation. 

BLAIR: So that's an example of bringing others into the room to try to generate some FOMO, this idea that, well, these other people must know something. And you see it in SaaS pricing all the time. SaaS companies are sophisticated pricers, so you'll see three or four options, and the one they want you to buy usually has a little call-out that says most popular. 

DAVID: Most popular ... Yeah, so you might even say ... And I found myself saying this when I'm working with clients where they're objecting to a particular direction that I am suggesting. And I'll say, "Listen, we don't have to do that for sure. What I just suggested is not essential at all. We can keep going and you can be the same kind of firm you have been," and then to not be disingenuous I'll say, "And here are the results. You've made a good living, you've got a good culture, but you're hiring me to help you do better. We're not trying to keep you busy as a firm, we're trying to generate a price premium." 

DAVID: And I've heard you say the same thing too. So just explaining to them that, "Listen, we're trying to be a little bit different than the norm." The social influence, this whole FOMO thing: we're trying to get you above that. 

BLAIR: Yeah, social influence might cause that to backfire on you. If you could frame the argument a little bit differently. If you could say, "Listen, you're down here at this level, but all of the firms that I work with get to this level or aspire to this level." Be a part of the herd, don't get left behind, don't miss out on what everybody else is experiencing. That's the way you would want to leverage social influence. 

DAVID: Alright, so Blair, we've covered the more difficult ones, the first five. We have five more that are a little simpler to cover and the next one is the confirmation bias. This one I hear a lot so just summarize the intent behind this one. 

BLAIR: Man, we could do a whole episode on this, but I think this is the one that's more socially interesting and less practical in terms of business. Confirmation bias is simply the tendency for new information to confirm our existing beliefs instead of challenging them. It speaks to how we search for, interpret, invite, and favor information in a way that supports our beliefs. It's essentially the effect of our desire on our beliefs: we want things to be true, so we look for information to support that they're true. 

BLAIR: In the business context, I think of the iceberg of ignorance. The idea that somebody who is at the top of an organization is surrounded by sycophants typically. And those people bring that person information that aligns or supports their point of view and doesn't challenge it. 

BLAIR: And I think we may have talked about this before, I find it happens to me all the time. People that I know well, or sometimes people that I barely know send me information that supports my beliefs on how new business should be done. The idea that you can and should win without pitching. 

BLAIR: And every time I get one of these articles or stories, I think, "Okay, where are the five that challenge it? How come people aren't challenging my beliefs?" 

BLAIR: And the more you express your beliefs, the more inclined people are to bring you information that supports your beliefs. 

DAVID: Ah, I'm thinking of basically, circles of influence on Facebook and ... 

BLAIR: Oh, yeah! 

DAVID: Their algorithms even encourage this sort of thing. And we look at how our political environment has been shaped recently and how that is so related to confirmation bias. 

BLAIR: Yeah, and all these next topics really could go down that rabbit hole of what's going on socially and politically today. And how the AI bots of social media are confirmation bias bots. The more information of a certain type you look for, the more of it you're presented, and the more your belief gets cemented. 

BLAIR: Which leads us to the next bias, which is certainty bias. Richard Feynman, who was arguably the greatest physicist of the last generation, had this line that in physics, nothing is certain. There are only probabilities. But your argument is more convincing when you put forward something as a certainty or when you communicate your certainty. 

BLAIR: Like nobody bangs their fist on the table and says, "There's a 66 percent chance that I'm right." 

DAVID: Yeah, right. You can't stand up in front of church and say, "I really believe there is a God." It just doesn't have the same ring to it, right?

BLAIR: Yeah, I'm 82 percent certain that there is a God. But in physics, if you get into QED, quantum electrodynamics, when you bounce a photon off a mirror at a 45 degree angle, you can't say with certainty that that photon is going to reflect at a 45 degree angle. It's all probabilities. And the math that they use to calculate the probabilities actually includes the possibility that that photon takes every possible path through the entire universe before it gets to that mirror. 

BLAIR: That's probably going too far down the rabbit hole, but there's an entire branch of science that says there is no certainty here, only probability. So most responsible scientists, or at least physicists, put forward things as probabilities. 

BLAIR: Now I listened to a recording of a cognitive scientist whom I won't name talk about eight certainties in life. And the first one he identified was the universe was created 13.79 billion years ago. I'm thinking no physicist ... He's not a physicist, he's a cognitive scientist. No physicist would ever say that. What they would say if they were rational is they would say, "Well, the consensus opinion is that the universe was created 13.79 billion years ago in a big bang, but there's a lot of holes in that opinion. There's a lot of stuff that we don't know. There's a lot of assumptions, there's a lot of errors et cetera." That's kind of the responsible answer. 

BLAIR: A teacher doesn't stand up in front of a classroom and say, "Well, these are the odds." You project certainty. And we're all guilty of banging our fist on the table and saying, "I'm certain, it's a slam dunk." 

BLAIR: Some studies have been done that show that when somebody says, "I am 99 percent certain," then there's only a 40 percent likelihood that they're right. 

DAVID: I think as you were talking about that I was getting this itch to say this ... The more of an expert you are, the more comfortable you should be able to live with uncertainty. 

BLAIR: So that's a related bias known as the Dunning-Kruger effect, which says amateurs or idiots tend to overstate their certainty and knowledgeable people or experts tend to understate it. In the beginning, when you know nothing, you say, "I don't know anything about it. I'm completely ignorant." Then you learn a little bit, but not enough to know what you don't know. So you learn a little, and you think, "Oh, I've got this. I've nailed this." Then you put forward this very forthright, sometimes emotionally charged opinion, vastly overstating your certainty based on a little bit of information. Then as you learn more, you start to learn what you don't know. So that's why experts understate their opinion or their certainty: they are aware of the complexity of the situation. 

BLAIR: And there's a corollary here, it's called the illusion of explanatory depth. It's basically, if you've got somebody pounding their fist on the table saying, "This is true," and they're emotionally charged, first of all, you should infer that they don't actually know much about the topic. And if you want to moderate their emotional reaction and you want to moderate their beliefs, simply ask them to go deeper into the subject matter and explain it to you. 

BLAIR: Don't challenge them, just say, "Oh, that's interesting. Can you tell me how this works?" And as they're forced to give you more detail, they come face to face with the realization that they actually know very little about this and then they moderate their stance and they moderate their emotions. 

DAVID: I think understanding that makes for such better relationships with your clients, because as you were saying, it's somewhere in the middle. Like, I do know a lot, but I don't know everything. And I have been wrong multiple times and this could be one of those times when I'm wrong as well, I'm open to that. But I also want to give credit to the fact that I have thought a lot about this and done research. And so my initial response to you is this ... Recognizing that I want to listen very carefully because I might be wrong here. 

DAVID: So there's just a way that an expert has a conversation with a client that makes it more of a ... Not collaborative, that's too far, but makes it a little bit more human and maybe open to learn on both sides. But you still have to have an opinion that's informed, but you have to be willing to listen as well. 

BLAIR: I think that's a great way to summarize that, and I think when you and I are doing these podcasts, we are at our most self-aware. I mean, sometimes we sound so balanced and smart. 

DAVID: When was that? When are you ... Which particular episode are you thinking of? I need to go listen to it again. 

BLAIR: But we're hotheads too. We react in the moment and we're subject to biases all the time. I think if there's an overarching piece of advice to wrap all of this up, it's this: when you're thinking about the biases that you are prone to in your life, the simple advice is to stop and to increase the space between the stimulus and the response. Engage System 2, let the emotions dissipate a little bit, and think more deeply. Because if you have an emotional response to something, you probably don't understand it very well. 

BLAIR: So instead of reacting and pushing your response out on Twitter or Facebook, just think more deeply about the situation. Engage System 2. 

BLAIR: So we started out talking about leveraging these biases in others, and I guess we ended, once again, with a little self-help podcast on how you can become a better person. 

DAVID: Yeah, that's right. Well, have we beat this to death?

BLAIR: Yeah, I think we've run out of time. 

DAVID: Yeah, but the other ones were framing effect, sunk cost bias, gambler's bias, if people want to look these up. But what else do you want to say about any of these?

BLAIR: I think I wanted to talk about gambler's bias and what I call the hot hand fallacy fallacy. Do we have time? Do you want to do this?

DAVID: Sure, go for it. 

BLAIR: Gambler's bias is ... Well, let me ask you. 

BLAIR: If I ask you to flip a coin, what are the odds it'll come up heads? 

DAVID: 50 percent. 

BLAIR: Okay. If you flip a coin three times in a row and it comes up heads every time, and I ask you to flip it a fourth time, what are the odds that it'll come up heads the fourth time? 

DAVID: Well, statistically, I know it's 50 percent. 

BLAIR: Yeah, statistically you know it's 50 percent. 

DAVID: Because I know it starts over. Math does not have a memory. 

BLAIR: Yeah, right, yeah. But the gambler's bias is thinking, well, the last three were heads, so the next one is likely to be tails. But as you say, every flip starts over. 
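[Editor's note: a quick simulation makes the independence point concrete. This is a minimal sketch, not from the episode; the function name and parameters are illustrative. It estimates the probability of heads immediately after a run of three heads and, as Blair says, the streak has no effect.]

```python
import random

def heads_after_streak(trials=200_000, streak=3, seed=1):
    """Estimate P(heads | the previous `streak` flips were all heads)
    by simulating one long sequence of fair coin flips."""
    rng = random.Random(seed)
    run = 0              # current run of consecutive heads
    hits = total = 0
    for _ in range(trials):
        flip = rng.random() < 0.5    # True = heads
        if run >= streak:            # the preceding `streak` flips were all heads
            total += 1
            hits += flip
        run = run + 1 if flip else 0
    return hits / total

# The estimate hovers around 0.5 regardless of the streak length.
print(heads_after_streak())
```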

BLAIR: So it's essentially not fully understanding probability. And I think there are three levels of understanding probability. There's most of us, who don't understand it at all, and we see things like streaks, or the new-manager bounce in sports when a team fires its manager and hires a new one, and we think those are actually real things. 

BLAIR: And what we don't understand is that it's basically everything regressing to the mean, or some other basic principle of probability. 

BLAIR: And then the second level is when you understand these things and you rationalize, just the way you did right now: well, I know intellectually that it's 50 percent, because you understand the basics of probability. 

BLAIR: And then I think there's the third level. Somebody like Nassim Nicholas Taleb is a very smart man, but if you look at his Twitter profile, he thinks everything in the world is probability. So I think the third level is where you ascribe way too much of how the universe works to probability. 

DAVID: Oh, man you just messed up my world. I thought I understood it and then you throw that at me. 

BLAIR: No, I think you're at the right level of understanding probability. My point is, I know some scientists who I think go too far on probability. They say things like, "Yeah, well, people don't understand that improbable things happen all of the time." And that's actually very true, improbable things happen all of the time. But then they use that to explain away otherwise unexplainable phenomena. 

DAVID: Okay. So if people want to do some more research on this, let me just list the 10 again so that they can look these up. So anchoring, decoy effect, extremeness, survivorship, social influence, confirmation, certainty, framing effect, sunk cost, and gambler's bias. 

DAVID: And this has been fun. I feel like I've learned more than average. You must have been reading and you're dropping names like they mean nothing to you. This is very impressive. 

DAVID: Next time we talk, and you're interviewing me, I'm gonna have to come up with some much better stuff than I have in the past. 

BLAIR: Well, I did have to prepare for this one and I still resent you for that. 

DAVID: Okay. Thank you Blair. 

BLAIR: Thanks David.

