The Black Swan Theory
You are a chicken. Yes, you. You look around and sometimes wonder why your owner takes such good care of you. At first, you're not sure; you're skeptical. What if he sends you to the slaughterhouse? You've never been there, but you know very well none of your friends have ever come out of that place. You remain on high alert for when that fateful day might arrive. But it never does.
Days go by, and then weeks, months, even years. You are now convinced your owner loves you more than any of these other chickens and would never do anything bad to you. Each passing day is additional evidence that you will live for the next thousand days. A thousand days go by like this, a thousand beautiful days, until of course the thousand and first day, when the illusion of safety breaks and you end up on someone's dinner plate. You should have never crossed the road.
Now imagine how betrayed the chicken must have felt when it was being taken to that terrifying part of the farm. Given the thousand days' worth of evidence, the chicken's trust in its owner was ironically at its highest level when it was eventually slaughtered. Perhaps if it hadn't been so foolish as to believe it was special or unique, it would at least have been spared the feeling of betrayal. That one final day completely changed the chicken's outlook on life. That one piece of evidence outweighed the previous thousand days, and it's not even a contest.
This is what's known as a black swan: a single event or observation that comes as a surprise, carries disproportionate consequences, and radically changes our outlook on something. People used to think that swans could only be white until they saw a black swan, which reshaped the way they thought about what's out there. Nassim Nicholas Taleb wrote a book called "The Black Swan: The Impact of the Highly Improbable" to study this very phenomenon and shine a light on how vulnerable we are to black swans, and how we are becoming increasingly vulnerable with each passing day.
In his book, he talks about some fundamentals of epistemology that limit our ability to understand black swans before they happen. But first, let's talk about why our modern society, as technologically advanced as it is, is the perfect nesting place for a black swan event. Let's say we're going to weigh a few thousand people, and the extreme end of that sample contains the heaviest person in the world. So long as that person is subject to the same biological constraints as the rest of us, it doesn't really matter how much he or she weighs. Let's say two thousand pounds.
Now how much do you think that accounts for in the total weight of all the people we weighed? The answer is probably less than half a percent. It shows that even a crazy outlier like a two-thousand-pound person doesn't really overwhelm the average. Taleb calls this ecosystem Mediocristan, because the mediocre, average measurements represent the whole sample quite well.
Now let's conduct the same experiment, but with wealth. Let's take the same sample and include just one of the world's 3,000 or so billionaires in that list. How much do you think that billionaire accounts for in the total wealth of all the people in that sample? An overwhelming majority, almost always close to 99%. Contrary to the first scenario, here the outlier overwhelms everything else. Taleb calls this world Extremistan, as it rewards a few people extremely well and leaves basically nothing for the others.
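To make the contrast concrete, here's a minimal simulation sketch. The specifics (the sample size, a normal distribution for weight, a log-normal one for wealth, and a roughly $100 billion fortune for the outlier) are illustrative assumptions, not figures from Taleb's book.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 3_000  # illustrative sample size

# Mediocristan: body weight is biologically constrained, so the tails are thin.
weights = rng.normal(loc=180, scale=40, size=n).clip(min=80)  # pounds (assumed distribution)
weights = np.append(weights, 2_000)                           # the heaviest person imaginable
print(f"Outlier's share of total weight: {weights[-1] / weights.sum():.2%}")  # roughly 0.4%

# Extremistan: wealth has no such constraint, so one observation can dominate the total.
wealth = rng.lognormal(mean=np.log(60_000), sigma=1.0, size=n)  # dollars (assumed distribution)
wealth = np.append(wealth, 100e9)                               # one Bezos-scale fortune (assumed)
print(f"Outlier's share of total wealth: {wealth[-1] / wealth.sum():.2%}")    # roughly 99.7%
```

Run it with different seeds and the pattern holds: the two-thousand-pound person barely nudges the total, while the single billionaire accounts for nearly all of it.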
Taleb says that the modern world is composed of circumstances geared towards Extremistan, not Mediocristan. Money, for all intents and purposes, is just a number in someone's books; the vast majority of it is completely digital, so there are no laws of physics or biology to constrain it to minimal variance. Sure, most people don't make that much money, but a few people can make an awful lot. Similarly, consider musicians: most musicians don't sell that many albums, but a few artists sell an enormous number.
You can conduct the same thought experiment with book sales, scientific publications, shoe brands, and so on. The point is that the modern economy is very much a winner-take-all system that rewards a very small number of people with a disproportionately large portion of the pie. If it were more like the weight example we just talked about, you wouldn't expect the outliers to be so wild.
But the fact that they are indeed so wild just goes to show how unpredictable the environment we're living in really is. The forecasts we take for granted today often fail to take into account the true nature of this unpredictability: these black swan events. You might be inclined to say that no, these billionaires put in the work day in and day out, and therefore they can enjoy the fruits of their labor. Indeed, most of them probably worked really hard; some of their innovations might later pave the way for a better future for all of us.
I'm not discounting that; however, the system is not rewarding them proportionately. More importantly, it's hard to say how much of their success is the fruit of their labor and how much of it is due to pure chance. If you were to run a few simulations with Extremistan-type circumstances, you would inevitably end up with a few Jeff Bezos-like outliers. We may be biased into thinking that we understand what causes Bezos-like outliers in our society: you know, the usual playbook of thinking outside the box, starting a revolutionary company, working extremely hard for a few years, and then smelling the roses, happily ever after.
We've all read the autobiographies. We've all watched the documentaries. However, when was the last time you read about a person who did all of those things and failed? When was the last time you saw shelves of books about people who failed? Chances are, probably never. These stories just never quite make it. There is an epistemic bias in all of this, Taleb says.
Take a look at the cemetery, he suggests: it is quite difficult to do so, because people who fail don't seem to write memoirs, and if they did, the business publishers he knows would not even consider giving them the courtesy of a returned phone call. This is despite the fact that advice about what not to do is often more useful than advice about what to do. But that's just the economy; that's just one facet of society.
We also don't understand the socio-political aspects. Take 9/11 for example, which is certainly a black swan event. After it happened, you had tons of experts come out and say that they had known for years that it was about to happen. Well, why didn't they say anything? This retrospective distortion of the understanding of a problem is one of the hallmarks of a black swan event. None of them really knew. If they did, cockpit doors would have been bulletproof long ago, pocket knives would have never been allowed in the cabins, and the TSA would have been invented much earlier.
But these things were only instituted after 9/11. If you had suggested such policies in 1991, for example, you probably wouldn't have been taken too seriously, or you would have been shown a spreadsheet suggesting that airlines don't have the money for bulletproof doors. But inevitably, they found it. Thankfully, the likelihood of a 9/11-style event is much lower now than it used to be. Countries around the world are more prepared, more vigilant. However, that also means these precautions gradually lose their relevance.
Yuval Noah Harari, in his book "Homo Deus", cites a paradox about knowledge: knowledge that does not change behavior is useless, but knowledge that changes behavior loses its relevance. The more data we have and the better we understand history, the faster history alters its course, and the faster our knowledge becomes outdated. The measures we have taken after a black swan event like 9/11 do nothing to improve our odds against a future black swan.
If anything, they might lure us into a false sense of security and, in fact, worsen our chances of coping with the impact of the next highly improbable event. We tend to convince ourselves that we understand risk once we have understood a game of dice or blackjack. However, trying to approximate the risks of real life with the same methods used in a closed, artificial game is an oversimplification, a mistake we commit daily. Taleb calls this the ludic fallacy.
We learn simple games and immediately conclude that the stock market works the same way, even though one of these things lives in Mediocristan and the other in Extremistan. If the markets were well understood, do you think something like GameStop or AMC would ever have been allowed to happen? Sure, short squeezes are not a particularly new phenomenon, and yet even a non-black-swan event such as this one left some of the smartest hedge fund managers scratching their heads and practically facing bankruptcy.
This false sense of understanding makes black swans that much more dangerous. There are other reasons why we are increasingly more vulnerable to black swans, Taleb says. Whereas in the past, people might have been studying different kinds of literature and diving deep into a locally developed set of ideas, today, arguably, the most read book is Harry Potter.
That's of course not to say Harry Potter is a bad book or anything, but it goes to show that we are all drawing from much the same pool of ideas, for better or for worse. For the most part, everyone is dealing with generally the same ideas. That, coupled with the rising complexity and reach of technology, means that when something fails, it fails for more people than ever before. The Pakistani government tried to shut down YouTube in Pakistan; it ended up shutting down YouTube worldwide.
We don't understand these things. That's just one way for technology to fail, but it goes to show how interconnected things are. And while that interconnectedness is often touted as a plus, given sufficiently poor luck, it can really spell doom for us all. Take coronal mass ejections as an example. These are eruptions of plasma and magnetic field from the sun that scientists on Earth know about and expect. The most powerful one on record caused the Carrington Event of 1859. Its effects were mostly felt by telegraph operators, some of whose equipment was burnt out by the sudden surge. Most of the world went on without a hitch.
On the other hand, if a Carrington-class event were to occur today, with all the grids, electric cars, and equipment we now have, the damage would be in the trillions of dollars, and repairs could take decades, if they're possible at all. With each passing day, with each little step toward an electric future, we're becoming more and more vulnerable to such an event. The thing is, this isn't even a black swan event.
In 2012, the likelihood of a Carrington-class event occurring within the next decade was estimated at around 12%. And yet, despite that high probability, we're not particularly prepared for such an event, given the nature of its risk: seemingly low probability but high impact. Despite all the mounting evidence, you'll have a very hard time convincing governments to modify power grids to avoid catastrophic failures. So if that's how little we care about an event we know is bound to occur eventually, imagine how unaware we are of a true black swan.
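To get a rough sense of what a 12%-per-decade figure implies (assuming, purely for illustration, that the estimate holds steady and that decades are independent), a one-line calculation puts the odds over a human timescale closer to a coin flip than to a remote possibility:

```python
p_decade = 0.12                        # estimated chance of a Carrington-class event per decade
p_fifty_years = 1 - (1 - p_decade)**5  # chance of at least one such event across five decades
print(f"{p_fifty_years:.0%}")          # roughly 47%
```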
The chicken on the farm, were it to somehow be spared by some miracle, would never trust a human being again after the betrayal it endured. However, few are ever so lucky. Meanwhile, for the owner, the chicken's death comes as no surprise; it is a routine event and therefore no black swan. The idea of a black swan is therefore relative to the knowledge one possesses. Hence, our objective is to try to be in the position of the butcher, not the butchered.
Taleb says, "I worry less about advertised and sensational risks, more about the vicious hidden ones." Of course, the idea of a black swan also incorporates good things, such as wildly unlikely positive outcomes of chance, otherwise known as life. The odds of being born are 1 in 400 trillion, but to be fair, I just unfollowed my own advice. Such a thing can't really be predicted, can it? For all we know, and for all we don't, being born is an unimaginably unlikely event that nobody really predicted.
So if you are alive, you've already beaten those odds, whatever that means. In the end, we're actually all the black swans we've been trying to avoid this entire time. Ironic, isn't it? Black swan events are by nature unforeseen and unavoidable. It's almost anxiety-inducing to know that at any moment a black swan event could happen, changing the landscape of everything as we know it.
The history of over a thousand days tells you absolutely nothing about what is to happen next. Probability is everything, and with Brilliant, you can not only learn about probability but also some of the most pressing and interesting topics in STEM. Recently, Brilliant has improved the interactivity of some of their best courses with their newly redesigned Mathematical Fundamentals course. You can learn the foundational ideas of algebra, number theory, and logic that come up in nearly every single topic in STEM.
After that, I'd recommend you check out their courses on probability. There's a bunch of them, but I honestly believe that learning to think of things in terms of probability will give you a much better perspective on life. We all learn differently, but I think we can all agree that the best way to truly learn and retain something is by doing it. With Brilliant's interactive courses, you can learn a new skill or topic in as little as 10 minutes.
Brilliant has courses in math, science, and computer science for learners of every skill level—beginner or expert. If you're interested in becoming a smarter person overall, visit brilliant.org/aperture to start learning for free. You'll also get 20% off a premium subscription, which will unlock every single course Brilliant has to offer. As always, you'll be supporting yourself and my channel at the same time.