
Algorithms are Destroying Society


3m read · Nov 4, 2024


In 2013, Eric Loomis was pulled over by the police for driving a car that had been used in a shooting, a shooting, mind you, that he wasn't involved in at all. After being arrested and taken to court, he pleaded guilty to attempting to flee an officer and no contest to operating a vehicle without the owner's permission. His crimes didn't mandate prison time, yet he was given an 11-year sentence: six of those years to be served behind bars and the remaining five under extended supervision. Not simply because a judge or a jury of his peers decided so, but because an algorithm said so.

The judge in charge of Mr. Loomis's case determined that he had a high risk of recidivism through the use of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) risk assessment algorithm. Without questioning the algorithm's output, the court denied Loomis probation and incarcerated him for a crime that usually wouldn't carry any time at all. What has society become if we can leave the fate of a person's life in the hands of an algorithm? When we take the recommendation of a machine as truth, even when it seems so unreasonable and inhumane?

Even more disturbing is the fact that the general public doesn't know how COMPAS works. The engineers behind it have refused to disclose how it makes its recommendations, and no existing law obliges them to. Yet we're all supposed to trust and adhere to everything it says. Reading this story, a few important questions come to mind: how much do algorithms control our lives, and ultimately, can we trust them?

It's been roughly ten years since Eric Loomis's sentencing, and algorithms now reach far deeper into our daily lives. From the time you wake up to the time you go to bed, you're constantly interacting with tens, maybe even hundreds, of algorithms. Say you wake up, unlock your phone, and do a quick search for a place nearby to eat breakfast. In that one act, you've triggered Google's complex ranking algorithm, which matches your keywords against websites and blog posts to show you the results it considers most relevant to you.

When you click on a website, another algorithm serves you the ads along the side of the page. Those ads might be products you've searched for before, stores near your location, or, oddly enough, something you've only spoken to someone about. You then message a friend to join you for your meal. When you open any social media app today, your feed no longer simply shows the most recent posts from the people you follow. Instead, what you see is best exemplified by TikTok's For You page: complex mathematical models behind the scenes decide which posts are most relevant to you based on your viewing history on the platform.

YouTube, Twitter, Facebook, and most notoriously TikTok all use these recommendation systems to get you to interact with the content that their machines think is right for you. And it's not just social media—Netflix emails you recommendations for movies to watch based on what you've already seen. Amazon suggests products based on what you previously bought.
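To make "relevance" a little more concrete, here is a minimal, purely illustrative sketch of what engagement-based ranking can look like. It is not any platform's actual system; the tags, function names, and scoring rule are all invented for the example. The only point it shows is the shift described above: posts get ordered by how well they match your viewing history, not by when they were posted.

```python
# Toy illustration of engagement-based feed ranking (not any platform's real system).
# Candidate posts are scored by how well their topic tags overlap with tags
# from the user's recent viewing history, then sorted by that score.

from collections import Counter

def rank_feed(view_history, candidates):
    """view_history: list of tag lists the user recently watched.
    candidates: list of (post_id, tag_list) tuples to rank."""
    # Build a simple interest profile: how often each tag appears in history.
    interest = Counter(tag for tags in view_history for tag in tags)

    def score(post):
        _, tags = post
        # A post scores higher the more its tags match the user's interests.
        return sum(interest[t] for t in tags)

    # Most "relevant" posts come first; recency plays no role at all here.
    return sorted(candidates, key=score, reverse=True)

history = [["cooking", "travel"], ["cooking", "baking"], ["fitness"]]
posts = [("p1", ["politics"]), ("p2", ["cooking", "baking"]), ("p3", ["travel"])]
print(rank_feed(history, posts))  # p2 and p3 outrank p1, whatever was posted last
```

Real recommendation systems weigh far more signals, such as watch time, likes, shares, and predicted engagement, but the basic shape is the same: compute a score per post, then sort.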

And probably the most sinister of all, Tinder recommends the person you're supposed to spend the rest of your life with, or at least that night. These might seem like trivial matters, but it goes further than that. Algorithms are also used to determine who needs more health care, and, when you have your day in court, a computer program can decide whether you'll spend the next decade of your life behind bars for a crime that usually doesn't carry any time.

One of the most dangerous things about algorithms is the data used to power them, because the more data you feed an algorithm, the better its results. So where do companies get this data? From their users: you and me. Most of the time, giving out this information is harmless, but these companies often sell it to data brokers, who then sell it on to other companies that want to sell you stuff. That's why you keep getting targeted ads from random companies you've never heard of before.

And what's worse is that these data brokers are often tar...
