
Post-Truth: Why Facts Don't Matter Anymore


12 min read · Nov 10, 2024

This is the challenge of a YouTuber, which is, you know, pushing the record button and actually filming something. Because you never know: "Are people going to hate it?" Or "Is it good enough?" Have you thought through what you're going to say? I've not thought through what I'm going to say... (Laughter)

Ok, so I wanted to talk about the question: why is it that right now, when it is really easy to get access to facts and information, when you could just pull up your phone and look up anything in the world... why is it that now, when we have the most access to facts, facts mean the least? Why does fake news spread now? Why are we more polarized now than ever before?

What I have to kind of admit to you is that I was a real optimist. Maybe I was naive about the internet, but here was my thinking about having an international communication system whereby anyone, anywhere can share anything, and anyone, anywhere, regardless of, say, their educational background or their class standing, can get access to real information through Wikipedia. Ok, my thinking was the internet was going to make everyone happier and more informed, more educated, and probably more tolerant of others around them.

The reason why I think that the internet should make people more tolerant is because it should expose people to people who are not like them. Right? I mean, is this crazy? My thinking is that as we have a platform to communicate with each other, with people in diverse places, people with different interests from us, and we all have the ability to sort of debate and agree upon what is true, we would all become more accepting of the facts. Maybe I'm ranting on about this too much. Is it too much? Ok, so we can edit, we can edit (the video).

So, I guess the point is, in summary, I thought that the internet would make the world more connected, more tolerant, more educated, and more true, like more agreed upon the facts and the way things are. For a while, I think there were signs that this was happening. There was the Arab Spring where, you know, people in countries that had been kind of dictatorships, people who had been oppressed, were rising up. There was the approval of gay marriage in a number of countries. There were climate agreements that seemed to suggest the whole international community was coming together.

There are things like my YouTube channel, where there can be four million people who watch or subscribe to a channel that's about learning new things about the universe. Right? The whole rise of smart content on YouTube I took as an indication that things were going in the right direction. Here we can get communities together who are all interested in the same sort of high-level phenomena.

You know, if I made a TV show, I could never convince anyone to make an episode about interpretations of quantum mechanics. I could never convince someone to make a video about pilot wave theory and go into as much depth as I did in my videos. But on the internet, you can find that audience, because you can aggregate across the whole world, so little niche audiences from every country can all come together and say, you know, we want to learn this, and we can support this sort of enterprise.

You guys in agreement?

Ok, now here's the problem. What I didn't think about was that other communities can form: communities where people have particular agendas, ideological or religious, or you know, people with extreme biases. Normally, they're not much of a force to reckon with because they are diluted in our populations. But on the internet, just like the people who love physics, people who hate science, or people with differing ideologies, and people who are intolerant can find each other, and they can come together into these camps.

So, I think that starts to suggest why we have the problem that we have today. And then what do people do once they've found their communities, once they've found their people? Well, we have meetups, but we also share things with each other online. You can think about some communities that are really innocuous, like the communities of people who are cat lovers, who love sharing pictures of kittens.

Then someone gets the idea to write a really cute phrase like "I can haz cheezburger" on a picture of a kitten. That little image tickles our brains in a way that is really strong and really incredible, and then that spreads throughout the internet. What you have, in essence, is kind of evolution taking place; the evolution of ideas on the internet.

So, I guess what I mean is, you know, one person can post this picture of a cat, and then people can vary the phrases that are on it, and the best phrase is going to win out and get itself replicated and shared across the web more than any other. And the same thing happens with ideological arguments, right? If there's some sort of debate that's happening on the internet, different versions of arguments can pop up.

When you think about two groups on the internet debating, they're not really debating each other. Each community creates what they think the opponent's argument is, and then people within that community can tweak that argument to make it worse and worse and worse to make it the most awful version of itself that pushes our buttons. And when it pushes our buttons, then we share it with everyone we know and we say, "Look at what the other side is talking about! Look at how awful it is!"

We keep evolving the ideas to make them really, really bad. Those are the ones that get shared and spread; just like in evolution, we are evolving arguments into the worst forms of themselves, which makes the pushback even worse, and that, I think, is leading to this polarization.

What's worse, the internet is organized by algorithms: algorithms that are designed to take the things that are most engaged with, that most push our buttons, that get us going and get us sharing and liking or hating, and promote those even further.

And this, I think, is where something like fake news comes in. How is it possible that during the last election cycle in the U.S., more fake news was spread and shared than real news? That is extraordinary! And I think it happens because the internet allows groups of people to come together who have really particular views. It allows them to share things with each other—even sharing what they think the opponent's argument is, tweaking it ever so slightly, doing all of these little mutations, little alterations, and finding the ones that push our buttons the most.

Then those are the ones that get shared, those are the ones that get liked and hated, and everything. And those are the ones the algorithm elevates to the very top. So, rather than the internet bringing us together, rather than us converging around the facts, instead the internet allows us to divide ourselves into factions and to have this crazy evolution of arguments which is facilitated by the algorithms pushing us ever further apart.

That is the best I've been able to reason about why we now care less about facts than we ever did before: because, deep down, each one of us runs on confirmation bias. We naturally go out into our world and seek things that agree with what we think rather than looking for things that disagree with what we think. That's an innate human trait.

As you can see in my video "Can You Solve This?", it's upsetting, and I think we need to find a way to overcome this division and find a way to agree again on what is true and what is not. Thank you.

(Applause)

Alright, alright, so are there questions? Anyone have a question or comment? We have a gentleman down the front. Yes, I'm... you know there's a video on cyber balkanization? I heard the term from Vsauce; it's a very useful term.

Say it again, cyber vulcanization?

Balkanization. Balkanization? Yeah, the 'splinternet', you know, everything splitting up, so the 'splinternet'. And I love it! Cyber balkanization. I don't know what balkanization is... I guess it has something to do with the political situation in the Balkans.

Oh, cyber balkanization, yeah, yeah, yeah. Shout-out to CGP Grey's video here. Yes, this idea that you should think of your opinions like papers in a box: if you find another opinion that is better than the opinion that you have in the box, you should switch it out.

You are not your beliefs; you are not your opinions. Opinions are just stuff you carry with you that should be easily changed. Very good point.

I also should cite CGP Grey's video about the evolution of ideas because I was strongly influenced by that video. I think it's one of his best and most important videos.

Yes, so these mechanics involve strong ideas that polarize people. Do you think there's a way to utilize this mechanic for the better, to bring us all together, not to push us apart, but to use it for better things?

Right, I mean, I personally think that the mechanics that have brought us to this division are not, in their current form, going to lead us to convergence. So, I think something has to change about the mechanics.

If I think about one of these options, right? What you were saying about the ideas being the paper, and that you should really swap them out. It's a very sophisticated, mature idea, right? Very sophisticated and mature to think that we can have our beliefs set down, and then, when they get challenged, we can think about them and swap them out for new ones.

That's not something that humans like to do; it's hard. I was recently in the Netherlands, and we were going around shooting a TV series, and the guy who was driving the van was talking to me about electric cars. He was saying to me, "Derek, you know that even if you get, like, a low-emissions or zero-emissions car, an electric car, half of the emissions of any vehicle come from the production of the vehicle in the first place."

So, really, if you think you're cutting down a lot of the emissions of your vehicle by getting an electric car (which, full disclosure, I do have), he's saying you're not really dramatically decreasing the emissions, because half the emissions still happen when the car is made. So, a better strategy than buying all-electric cars would be to not buy any cars at all and just keep driving your car a few years longer, because that's going to save a lot more.

You know, a lot of the waste is upfront, and I did not believe him. I was like, "That is crazy! Half of the emissions? Half? Like, I know it's hard to make a car, but half of the emissions? Even though it's going to drive for, you know, hundreds and hundreds of thousands of kilometers?"

And I went and I looked it up, and it was true. I took his piece of paper and swapped it in and I moved mine out. But I think it's a really hard thing to do, and it is something that, like, I felt repulsed by—like I didn't want to swap my piece of paper out.

That's why I don't know that that can be the solution wholesale, because we can't just say to everyone, "Hang on! A lot of you have beliefs that are not true. Can we just swap some paper out with you?" They'd just say, "NO! THAT'S MY PAPER!"

And so, the thing that needs to change, I would say, is some of the systems, and particularly the algorithms. Because I think Facebook, and I think they are looking at this now, needs to figure out a way not to elevate fake news but to let it languish, and to promote real news.

They have a responsibility because they are curating the feeds of a billion people around the world. They have a responsibility to make sure those feeds are not misleading, and that is something they did badly in the last election.

Yep. If you're determined to find the real news, you go out on a marathon of research, and usually, you dig deeper and deeper past the sort of 'junk' that's being catered to us by the algorithms.

But what you're essentially proposing is some sort of lie detector to be implemented in algorithms, and that's kind of hard to do, right? Unless it's manually altered. But, you know, with companies like Facebook, I don't think you can manually alter the algorithm.

I think you have to find a way, and I think the algorithms can be made to do it. I think it's a big challenge; I agree with you, but I think there's a way to do it.

I mean, if you look at Google search results, there are constantly people trying to game what Google is doing, but you don't end up with fake news sites at the top. You don't end up with spam sites coming up top in the Google search results. That's because they battled and battled and battled to find these signals that separate a valuable site (one that people appreciate, that seems to be linked into a whole bunch of other sites, which are all part of a factual web) from one that isn't, right?

And I think that similarly, Facebook needs to do something like that where, in essence, they look for signals of truth.

I agree, very hard, and it's going to be a constant battle, but I think you need to look for the signal of truth rather than just the signal of what pushes people's buttons.

Yeah, if you look back at, say, the Stone Age, if you had a weird opinion and you're in your little tribe, either you convinced everyone or you changed your mind. Right? So, you know, Facebook is kind of like our tribe, right?

Some people use Facebook more like, "Oh, these are my acquaintances," but a lot of people use it like, "These are my close friends." If a close friend said something weird about something on Facebook, you comment, right? You have a discussion, and you know you don't say, "No, you're wrong, I'm not going to talk to you anymore," right? Because he's your friend. Either you change your mind or you convince the other person, right?

But if you go on, like, Reddit, you're not going to convince, like, ten million, a billion people, so yeah. I don't know, maybe if you've ever tried the controversial tab on Reddit, that's usually where some of the most interesting discussions are. But yes, sometimes it's not so important.

And I think this brings me to one of my kind of points that I think is a little bit counter-intuitive, and it's a point that's difficult for me to think about. I'm still not settled on this idea, but I wonder sometimes whether arguing just makes things worse. You know, as someone who makes science films, if I go out to make a film about something that's controversial, I don't know, maybe I'm just making it worse.

And so, I think we ought to think about that really carefully before we engage in discussions: by making these points, which you think are really clear and really get at the truth, is it possible that you're just causing the other side to become more deeply entrenched in their beliefs?

Down here. Do you think that adapting algorithms to identify true statements could have some issues with, like, separating uncontroversial statements from controversial ones, so that what's uncontroversial will more likely seem true?

Yeah, I get you there. As I said, I don't know the solution; I think it is hard, but I do think we have to figure it out. I think that there are places on the web that have a good record of truth. Wikipedia, for example, for all of its ability to be edited, is still more likely than not to be true.

So I think there's all these places that do signal true things. I think, you know, CNN is more likely to be true than Breitbart News. So, I mean, there has got to be some signal in the noise, some way to trace back references and things, right?

Some of the things we discussed, like manual intervention or deep searching, right, going back and looking at all these different sources: that's something that most humans don't do, won't do, and can't do; they don't have the time to do it.

That's what I'm saying: a machine can look at references; a machine can look at signals of how things spread. You know, and it can do that very rapidly. So you want to find out who these people are. Do they have a record of falsehood? Are they actually accredited at universities?

There are a variety of things that you can do to see whether things have a true signal or not. I don't think it's easy, but I think it's possible.

Yes. There are traditional news media that may present the news or these articles, fake or not, to the public. Don't you think they bear part of the blame, because they only fact-checked those fake articles when the results were already in?

Like, all these statements were made, and then when the results were in, say, in the last election, they just went, "Oh wait, were these statements true? Oh wait, they weren't." But by then it was too late to alter the results.

Yeah, I think there was a lot of fact-checking, but it didn't seem to do anything. Like, people didn't seem to care about facts. And that, deep down, is the real question I want to get the answer to, which is: why don't people seem to care what's true?

And I guess maybe rather than putting it that way, there were different camps which each had their own narratives and their own things that they believed were true, and it was different from the other side. So figuring out what is true was really hard, especially if you were completely in one camp. I mean, there were fake things around on both sides.
