
Algorithms are Destroying Society


3m read · Nov 4, 2024


In 2013, Eric Loomis was pulled over by the police for driving a car that had been used in a shooting, a shooting he wasn't involved in at all. After his arrest, he pleaded guilty to attempting to flee an officer and no contest to operating a vehicle without the owner's consent. His crimes didn't mandate prison time, yet he was given an 11-year sentence: six of those years to be served behind bars and the remaining five under extended supervision. Not simply because a judge or a jury of his peers decided so, but because an algorithm said so.

The judge in charge of Mr. Loomis's case determined that he had a high risk of recidivism using the Correctional Offender Management Profiling for Alternative Sanctions risk-assessment tool, or COMPAS. Without questioning the algorithm's output, the court denied Loomis probation and incarcerated him for a crime that usually wouldn't carry any time at all. What has society become if we can leave a person's fate in the hands of an algorithm? When we take a machine's recommendation as truth, even when it seems so unreasonable and inhumane?

Even more disturbing is that the general public doesn't know how COMPAS works. The company behind it has refused to disclose how it makes recommendations, and no existing law obliges it to. Yet we're all supposed to trust and adhere to everything it says. Reading about the story, two important questions come to mind: how much do algorithms control our lives, and ultimately, can we trust them?

It's been roughly ten years since Eric Loomis's sentencing, and algorithms now penetrate far deeper into our daily lives. From the time you wake up to the time you go to bed, you're constantly interacting with tens, maybe even hundreds, of algorithms. Let's say you wake up, unlock your phone, and search for a place near you to eat breakfast. In that one act, you've triggered Google's complex ranking algorithm, which matches your keywords to websites and blog posts to show you the results most relevant to you.

When you click on a website, another algorithm serves you ads on the side of the page. Those ads might be products you've searched for before, stores near your location, or, oddly enough, something you've only spoken to someone about. You then message a friend to join you for your meal. When you open any social media app today, your feed no longer simply shows the most recent posts from people you follow. Instead, what you see is best described by TikTok's For You page: complex mathematical models behind the scenes decide which posts are most relevant to you based on your viewing history on the platform.

YouTube, Twitter, Facebook, and most notoriously TikTok all use these recommendation systems to get you to interact with the content that their machines think is right for you. And it's not just social media—Netflix emails you recommendations for movies to watch based on what you've already seen. Amazon suggests products based on what you previously bought.
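At their core, these recommendation systems all do a version of the same thing: build a profile from what you've already watched, score every candidate post against that profile, and show you the highest scorers first. Here is a deliberately minimal sketch of that idea; the post catalog, topic tags, and function names are all made up for illustration, and real platforms use vastly more signals than topic overlap.

```python
# Toy engagement-based ranking: profile the user from viewing history,
# then sort candidate posts by how well they match that profile.
# Everything here (posts, topics, function names) is hypothetical.
from collections import Counter

def build_profile(view_history):
    """Count how often each topic appears in the posts the user watched."""
    return Counter(topic for post in view_history for topic in post["topics"])

def rank_feed(candidates, profile):
    """Order candidate posts by their topic overlap with the user's profile."""
    def score(post):
        return sum(profile.get(topic, 0) for topic in post["topics"])
    return sorted(candidates, key=score, reverse=True)

history = [
    {"id": 1, "topics": ["cooking", "travel"]},
    {"id": 2, "topics": ["cooking"]},
]
candidates = [
    {"id": 3, "topics": ["politics"]},
    {"id": 4, "topics": ["cooking", "baking"]},
    {"id": 5, "topics": ["travel"]},
]

feed = rank_feed(candidates, build_profile(history))
print([post["id"] for post in feed])  # → [4, 5, 3]: cooking content ranks first
```

The unsettling part is the feedback loop this creates: whatever you watch raises the score of more of the same, which is exactly why these feeds feel like they know you.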

And probably the most sinister of all, Tinder recommends the person you're supposed to spend the rest of your life with, or at least that night. These might seem like trivial matters, but it goes deeper than that. Algorithms are also used to determine who needs more health care, and when you have your day in court, a computer program can decide whether you'll spend the next decade of your life behind bars for a crime that usually doesn't carry any time.

One of the most dangerous things about algorithms is the data used to power them, because the more data you feed into an algorithm, the better its results. So where do companies get this data? From their users, like you and me. Most of the time, giving out this information is harmless, but often these companies sell it to data brokers, who then sell it on to other companies that want to sell you stuff. That's why you keep getting targeted ads from random companies you've never heard of before.

And what's worse is that these data brokers are often tar...
