
  • The British philosopher Bertrand Russell once said, "The whole problem with the world is that fools and fanatics are so sure of themselves, while wiser people are so full of doubt."

  • In psychology, what Russell described is more popularly known as the Dunning-Kruger effect.

  • This effect describes how people who are bad at something tend to believe that they're actually good at it, while people who are good at something tend to believe that they're bad at it.

  • Elderly people who believe they're better drivers than most are actually four times more likely to make unsafe driving errors.

  • Gun owners who think they're highly knowledgeable about gun safety score the lowest on tests about gun safety.

  • Medical lab workers who rate themselves as highly competent in their jobs are actually the worst at their jobs.

  • The lowest performing college students dramatically overestimate their performance on exams.

  • The lowest performers in a debate competition wildly overestimate how well they do.

  • People with the unhealthiest lifestyle habits rate themselves as far healthier than they actually are.

  • People who score poorly on cognitive reasoning and analytical thinking tests severely overestimate their cognitive and analytical abilities.

  • But why does this happen?

  • To understand, let's break knowledge down into four quadrants.

  • So there are known-knowns, things that we know that we know, like I know I know how to ride a bike.

  • There are known-unknowns, things that we know that we don't know.

  • For example, I have no fucking clue how quantum physics works.

  • And then there are unknown-knowns, things that you forgot you knew or don't realize that you know, like you still remember how to drive to the supermarket from your childhood home. You just forgot that you knew that.

  • And then there are unknown-unknowns, stuff that you don't know that you don't know.

  • When we are an amateur at something, we are very aware of the things that we know we know, and we're completely oblivious to the things that we don't know we don't know.

  • Let's use basketball as an example.

  • If you know nothing about basketball, it seems simple enough.

  • You throw a ball into the net; you know what you know, and you don't know what you don't know.

  • But as you start to learn more about basketball, you discover that there are a lot of nuances;

  • how you shoot the ball, the mechanics of your elbow, wrist and forearm, how you position the ball in your hand,

  • understanding the different shots: a fadeaway, a jumper, a layup, a finger roll, an alley-oop.

  • You're beginning to become aware of all the things you don't know, and there's a lot that you don't know.

  • Let's say you spend another year working on basketball: you've mastered a bunch of different shots and learned to shoot with good form.

  • Now you're getting into the weeds of defensive schemes, hand checking, picks and rolls, setting various kinds of screens.

  • At this point, you're no longer even thinking about your shooting form or how to hit a free throw.

  • You've forgotten you know this stuff, it's unconscious, it's automatic. It's the stuff you know, but you forgot you know, and there's a ton of it.

  • As you can see, the difference between an amateur and a professional is that an amateur's knowledge is known to them; therefore, they get to feel smart about it.

  • But for an expert, so much of their knowledge is either unconscious and automatic, or it's knowledge of what they still need to learn.

  • Another way to visualize this shift is to think of knowledge as a circle.

  • The area within the circle is what you know about a topic, and the border is the horizon of your knowledge or everything that you're aware of that you don't know yet.

  • This border is what determines our uncertainty or doubt.

  • Interestingly, as the size of your circle grows larger, the horizon of your knowledge also grows larger; the more you know, the more you know that you don't know.

  • But something else happens as well as you gain knowledge. As you implement information and it becomes automatic, you forget that you know it.

  • So there's a second border inside the first, this smaller circle is everything that you've forgotten you know.

  • So not only is the expert's horizon of doubt much larger, but most of their knowledge is also unconscious.

  • They forgot that they know it because it strikes them as so obvious and immediate: why even think about it?

  • The idiot thinks he knows everything because he literally doesn't have enough knowledge to know better.

  • Meanwhile, the expert thinks he knows nothing because he is so aware of all the ways in which he may be wrong.

  • Now I know what you're probably doing right now.

  • It's probably the same thing I did, and most people do, when they learn about the Dunning-Kruger effect: you think to yourself, "What a bunch of fucking idiots."

  • Good thing I know about this Dunning-Kruger effect thing, because, you know, I'm super aware of all my flaws, and that makes me like an expert at fucking everything.

  • So this is the tricky thing about learning about cognitive biases.

  • We would like to think that because we're aware of all the ways our mind fucks up, we are somehow immune to those fuck-ups.

  • But once again, we are so wrong because again and again, research has shown that educating people about their cognitive biases doesn't really make them any less susceptible to cognitive biases.

  • And that is the most frustrating thing about the Dunning-Kruger effect.

  • It is so hard to overcome, both in others and in ourselves, because here's the thing: they're called blind spots for a reason.

  • You can't fucking see them.

  • How do you fix something that you can't see in yourself?

  • This is the paradox of trying to overcome our own ignorance.

  • The very thing that would help us see our mistakes is exactly what would prevent us from making them in the first place.

  • Part of the problem is that there is a comfort in the feeling of knowing. People don't like uncertainty; settling on a belief, whether it's true or not, is a way to resolve anxiety within ourselves.

  • So our minds often default to believing things even if we don't have a whole lot of evidence for them.

  • And unfortunately, ripping on people for being fucking stupid doesn't really help the situation either.

  • Anybody who's gotten into a dumbass argument in comment threads can tell you this from experience.

  • Again and again, psychology has shown that when people's beliefs are challenged, they don't change their minds, they actually get more rigid and defensive.

  • So what are we supposed to do?

  • Well, starting with ourselves, I think it's an important practice to perhaps hold fewer opinions or at least hold them less strongly.

  • This means being less emotionally attached to our beliefs.

  • In other words, I think there's a lot to say for humility.

  • When you see something online that is upsetting or angering or frustrating, instead of jumping to conclusions about that person or that cause, maybe sit back and say, "I don't know."

  • What the fuck am I saying? You guys aren't gonna do that.

  • Let's be honest.

  • You know, last year I created an online course that helped people challenge their own beliefs.

  • It helped people figure out how to hold opinions a little bit more softly. But funny, funny thing:

  • Nobody fucking took the course.

  • You know, for some reason, there wasn't a ton of demand for that, and, not to mention, it's fucking hard to market.

  • Like, how do you market a thing to people that's gonna make them feel wrong about everything they believe in their life?

  • That's not exactly like the most enticing sales pitch.

  • But if for some reason you want to take that course, you can find the link in the description.

  • Good fucking luck.

  • So when it comes to other people, I think one of the hard truths that I've had to swallow over the years is that you can't really change the mind of somebody who's not willing to have their mind changed.

  • You can throw as much data and statistics and logical arguments at them as you want.

  • But they're just gonna, like, ninjutsu that shit, you know, pull a Neo in The Matrix, and all the bullets are just gonna go right past them.

  • I think this is because most people's beliefs are not based on logic or reason.

  • Most people's beliefs are based on identity and group affiliation.

  • And so when you show them contradicting data, their thought process isn't, "Oh, I need to update my prior assumptions about the world."

  • Their thought process is like, "I'm being attacked. My tribe is being attacked."

  • You know, many years ago, I used to coach people and one of the reasons I fucking stopped coaching people is because it was often very frustrating.

  • Somebody would hire me for a week, or we'd do like a monthly call or something.

  • And it just felt like I was like beating my head against the wall.

  • Like I was telling them the same thing over and over and over again.

  • Usually like deep, profound truths don't sink in for people the first time.

  • It's almost like you plant the seed in their head, and then they need to go live for another year or two for that idea to sprout.

  • It's almost like we have to be in the right environment or context or be going through the right phase of our journey for those seeds to sprout.

  • So one thing that has helped me a lot in my own relationships and just fucking tolerating all the nonsense that goes on in the world is understanding that I'm not here to change minds necessarily, I'm here to plant seeds.

  • I'm here to drop an idea or an argument.

  • So that one day if that person becomes fertile ground, that seed can sprout.

  • The most impactful things usually don't sink in right away. They usually need like a few weeks or months or even a couple of years to like incubate in a person's head.

  • And when I look at my own life, this also feels true: there were things that people told me in my teens that I didn't fully appreciate until my twenties or thirties or, hell, even almost 40.

  • Ultimately, I think humility is one of the most underrated values in our world right now.

  • On the internet, people are rewarded for false confidence, people are rewarded for being bold.

  • People are rewarded for being zealots and fanatics about things.

  • But while the algorithms may reward bluster and bullshit, the real world doesn't.

  • Life is really fucking difficult and complicated and most of us don't really know what we're doing most of the time.

  • So any sense of false certainty is really just gonna cause more pain than necessary.

  • I think what the Dunning-Kruger effect really teaches us is that humility is actually very practical.

  • By intentionally underestimating our own understanding of things, not only do we open ourselves up to learn and grow more, but we also prevent ourselves from just being a fucking narcissistic ass face on the internet.

  • That is, of course, until we decide that I'm the most humble person you ever met. Man, I'm so fucking humble, you wouldn't believe it.

  • Everybody thinks they're humble, but I'm really humble like I've got this humility shit down.

  • They should bottle it up and put that shit on eBay because I'm gonna make a fucking killing.

  • Humility is, like, off the charts. You can't even see, you can't even see how high it is up there, man.

  • Camera doesn't go that high.

  • It's that humble, that fucking humble.

  • And as you can see, now we're back to square one.
