Your brain is biased by default. Here’s how to reset it. | David Eagleman
- Why do we accept our reality as the uncontested truth? You are a data-collection machine that moves through the world and vacuums up the little bits of experience you have, and in the end, whatever you've gathered, that's what you assume to be true. But our experiences are limited. We're born on a particular spot on the planet, we have a thin little trajectory of experience, and we construct what we believe the world is made of from there.
And as a result, we all have a very limited view of what's going on out there. The interesting thing about being a human is that we're stuck inside our internal model - it's all we ever see. But with the endeavor of science and literature and philosophy, what we're able to do is step outside of ourselves and understand, "Hey, the way that I see the world isn't the only way to see the world. It's not the only truth."
And the more we can get good at that, the more we can try to build a better society. My name's David Eagleman. I'm a neuroscientist at Stanford, and I run the podcast "Inner Cosmos." The interesting thing about the human brain in particular is that we drop into the world half-baked with a certain set of genetics, and then experience wires up our brains.
And what that means is our brains are extremely flexible. So whatever moment in time you're born in, whatever culture you're born in, whatever deities your culture believes in, whoever your parents are, your neighborhood, and so on, you absorb all that and that crafts who you become. Now, as a result of the genetics being different, your brain wires up in slightly different ways.
My interest in searching out the genetics here is to define a new field called 'perceptual genomics,' which is understanding how slight tweaks in your genome, and yours, and yours, lead to each of us seeing the world in a different way. In other words, how do the genes that you come to the table with change your perception of reality?
For example, how clearly you visualize something on the inside: if I ask you to picture an ant crawling on a red-and-white tablecloth toward a jar of purple jelly, you might perceive that as a movie in your head, or you might perceive it without any picture at all, just the concept of it. People have completely different internal lives.
Your genetics and life experiences might be different from mine, which makes our models somewhat different from each other - and that's true for all 8 billion of us. Our brains are also very predisposed to forming ingroups and outgroups. We develop the psychological habit of trusting and caring about our ingroup, and not so much about the outgroup.
Presumably, this has an evolutionary basis: we evolved in small tribes, where you knew who the people in your tribe were, but that other group across the hill, you had no idea who they were; you didn't know if they were enemies. And so, we constantly form the groups that we belong to, whether that's predicated on our country or our religion or our favorite sports team.
We care more about the people who agree with us, and we're very suspicious of the people who are in the outgroups. One of the amazing parts about human brains is our sense of empathy. Well, it turns out that when you're dealing with somebody in your outgroup, you have less empathy. You just don't care about them as much.
In my lab, some years ago, we did an experiment where we put people in the brain scanner. They see six hands on the screen, and the computer goes around, boop, boop, boop, boop, boop, and randomly picks one of the hands, and then the hand either gets touched with a Q-tip or it gets stabbed with a syringe needle.
And when it gets stabbed, these networks in your brain that are involved in pain come online. But now what we did is we labeled each hand with a one-word label: Christian, Jewish, Muslim, Hindu, Scientologist, Atheist. And now the question is: Does your brain care as much if it's a member of your outgroup? And the answer is: Your brain does not care as much.
If your ingroup member gets stabbed, you have a big response in this area, this pain matrix. If an outgroup member gets stabbed, you have a smaller response. This is true across all groups that we measured. This isn't an indictment of religion. It's just about who's in your ingroup and who's in your outgroups in this case.
When you look at any conflict in the world where two sides hate each other, and they're looking at the other side, they don't think of them as a human, as a person. They think of them as an object, and therefore the medial prefrontal cortex and other areas don't even come online when considering them.
So, the internal model that you form growing up in your life, that's what determines who's in your ingroup and who's in your outgroup. And obviously, we see what happens once people start traveling and going around the world. They expand their internal model, they expand their ingroups greatly.
But it's very easy if you haven't been to every country in the world, which most of us have not, to still feel like, "Oh, that group, that culture, whatever, is totally foreign to me." The first step to expanding our narrow models is to recognize that there are fence lines and that there are things we're not seeing.
So the first thing is to understand our own biases, because we can't help but have biases. And the question is: Is there something we can do, given that situation? Many people are familiar with how orchestras, many years ago, started holding their auditions behind a screen so that they wouldn't be biased by male, female, white, Black, whatever - all there is is the music pouring over the screen, and they make their decision about who's the best orchestra player that way.
There are many ways that you can blind yourself to your biases. The second strategy is learning about the tactics of dehumanization so that you can be more immune to them. For example, there's what's called 'moral pollution,' where you associate members of your outgroup with something repulsive.
Whatever's coming out of their mouths now, everyone is a little less eager to hear it, because that group has already been smeared. And as we learn about these tactics, we can become immune to them. Once we do that, it gives us the opportunity to build a richer model of the other person.
The third strategy is entangling group membership or complexifying your allegiances. If I meet you and I learn that you like surfing and I like surfing, and you like this kind of dog, and I like that kind of dog, and we find all these things about one another that we have in common, that's the stuff that allows people to bond.
And only later something comes out where we realize we have a disagreement, and then I say, "Wow, that's interesting. I wouldn't have expected that. Tell me about that. Let me understand that a little bit better, because we're already bonded based on other pieces."
So this is a way of improving communication across the gaps between each brain and every other brain - a way of making sure we understand that not everyone is experiencing reality the same way on the inside.
And what you need are deeper bonds that hold people together, so that you understand other people as fellow humans in a way that's more fruitful and beneficial for our future.