
Why Anecdotes Trump Data


6m read · Nov 10, 2024

Some critics of the TV show Mythbusters claim that the show misrepresents the scientific process. For example, experiments are sometimes conducted only once and without adequate controls, and then those results are generalized into definitive claims, rather than the experiment being repeated and analyzed statistically, as a scientist would do to figure out what is really true. So, ironically, a show that is meant to educate people about science may instead be giving them a misleading impression of how science actually works.

But you know, similar criticisms have been made of Veritasium. For example, when Destin and I performed our experiments to show that water swirls the opposite way in the northern and southern hemispheres, we only showed it once, even though we each did it three or four times in our own hemisphere. And I guess that raises the question: should we change what we're doing? Should Mythbusters and Veritasium really show the repetitive nature of science and use statistical results as evidence for our claims? Well, my answer is no, but to understand why, we first have to dig into something called the helping experiment.

The experiment was performed in New York in the 1960s. Individual participants were placed in isolated booths where they could speak to five other participants through an intercom, but only one mic was live at a time. Each participant was meant to speak in turn for two minutes about their life and any problems they were having, and it would just go around in rounds. What the participants didn't know was that one of them was actually an actor reading a script prepared for him by the experimenters. He went first in the first round. He talked about the problems he was having adjusting to life in New York City, and in particular that he was prone to seizures, especially when stressed.

So everyone else had their turn, and then it came back around to the actor again. This time, as he was speaking, he became more and more incoherent. He said that he could feel a seizure coming on, and he made choking noises. He asked the other participants for help, said he felt like he was dying, and continued to get more and more distressed until his mic went off. The point of the experiment was to see how many of the participants would help. If you were one of the other participants, do you think you would have left your booth and gone to see if he was okay?

In total, about 13 participants took part in this experiment, and the number that helped before his mic was turned off was just four, roughly 30 percent. Now, while that might sound like a disappointing commentary on human helpfulness, you have to keep in mind that there were other people listening to the same distress call, and that may have diffused the responsibility each individual felt; this is something known as the "bystander effect." What's interesting about this experiment, from my point of view, is not how it confirms the bystander effect, but how people view the results: they fail to change their opinion of themselves or others after learning about the experiment.

So, have you changed your opinion about how likely you would be to help in that situation, now that you know only 30 percent of people did? Well, there was a follow-up study in which students who had already learned about the original experiment and its results were shown two videos of individual participants purported to be from that study. The videos themselves were pretty uninformative; they just showed that these were good, decent, ordinary people. At the end, the students were asked, "How likely do you think it is that those two particular participants helped?"

And overwhelmingly, students felt that those two participants would have helped, even though they knew that statistically only 30 percent did. In fact, it would have been a better guess to say that they probably didn't. The students didn't seem to internalize the general result as pertaining to the particular; they kind of assumed it excluded ordinary, good, decent people. So is there a way to get people to really understand how the world works? Well, there was another follow-up study in which the researchers described the experiment but didn't give the results.
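To make that base-rate point concrete, here is a minimal sketch of the arithmetic, assuming the figures reported above (four of about thirteen participants helped, roughly 30 percent) and treating the two filmed participants as independent draws from that group, which is a simplification:

```python
# Base-rate check: if only ~30% of participants helped (4 of ~13),
# what should you guess about any particular participant?
helped = 4
total = 13
p_help = helped / total            # ~0.31

# Guessing "did not help" for a random participant is right ~69% of the time.
p_guess_no_help_correct = 1 - p_help

# Chance that BOTH of two randomly chosen participants helped
# (assuming independence, which is a simplification).
p_both_helped = p_help ** 2        # ~0.09

print(f"P(a given participant helped)      ~ {p_help:.2f}")
print(f"P('did not help' guess is correct) ~ {p_guess_no_help_correct:.2f}")
print(f"P(both of two participants helped) ~ {p_both_helped:.2f}")
```

On those numbers, "did not help" is the better guess for any individual, and the chance that both of the filmed participants helped is under ten percent.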

Then they showed those two participant videos, which again said nothing about the experiment itself, just showing that these were two decent, ordinary people, and then they told the students that those two people did not help in the experiment. The students were asked to guess what proportion of people overall did help. In this case, reasoning from those particular examples of ordinary, nice people who didn't help, the students were much better at generalizing to the overall statistical result. In fact, they got it basically right.

I think this highlights that our brains are much better at working with individual stories and concrete details than they are with statistical results. That is why I think, if you're Mythbusters or Veritasium, it's better to communicate science by telling the story and showing the experiment once, in a dramatic way, rather than three or four times, where each repetition just confirms the result you were already talking about. But if you're actually doing the science, if you're actually trying to establish scientific fact, then of course you need the repetition and the statistical analysis. So I think it really does come down to what your objectives are.

This conclusion, I think, opens up two big potential pitfalls. One is that people without scientific evidence can craft compelling stories that catch on and quickly become what people feel is the truth. The other is scientists who have strong scientific evidence, who have clear statistical results, and yet can't communicate them because they don't have a great story. An example of the first pitfall is the recent spread of the rumor that the outbreak of the birth defect microcephaly in South America was actually caused by a larvicide made by Monsanto. That story caught on like wildfire.

You can see why: it has a clear villain that everyone loves to hate in Monsanto, and it has a simple causal story, that someone is doing something bad to the water and we are poisoning ourselves with it. It's a very emotive, clear story. The other explanation, the scientific consensus that the Zika virus, carried by mosquitoes, is causing the microcephaly, is a bit more statistical: there is some kind of connection. And there are strong indications that that really is what's happening.

If you look at the claims about the larvicide, they really don't hold much weight. The larvicide is so weak that you could drink a thousand liters of water treated with it every day and have no adverse effects. It has even been used in pet collars for dogs and cats. So there really isn't strong evidence for the larvicide connection. In fact, there is no connection between the larvicide and Monsanto at all. But I think the story took hold because it had such a strong narrative.

On the other hand, you have things like climate change, which have very strong statistical evidence to back them up, large-scale results from across the globe. Yet one cold, snowy winter is so much more visceral and meaningful to individual people than something that feels completely data-based. It just depends on how much you trust data, I guess. As scientists, we love data. We feel that if we're trying to communicate with someone, to convince them of something, all we have to do is show more data.

But what experiments demonstrate to us with statistical certainty is that stories work much better. Normally I do these walk-and-talk videos on my second channel, 2Veritasium, but I imagine some of you might not know that it exists, so I thought I'd put one of them here on 1Veritasium. Plus, this one has a fair amount of data and experimental stuff in it, so I figured it could be interesting for you as well. If you want to check out the second channel, go to 2Veritasium. I'll put a card or something for it up here.
