Algorithms are Destroying Society
In 2013, Eric Loomis was pulled over by the police for driving a car that had been used in a shooting—a shooting, mind you, that he wasn't involved in at all. After getting arrested and taken to court, he pleaded guilty to attempting to flee an officer and no contest to operating a vehicle without the owner's permission. His crimes didn't mandate prison time; yet, he was given an 11-year sentence, with six of those years to be served behind bars and the remaining five under extended supervision. Not because of the decision of a judge or jury of his peers, but because an algorithm said so.
The judge in charge of Mr. Loomis's case determined that he had a high risk of recidivism through the use of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) risk assessment algorithm. Without questioning the algorithm's output, the judge denied Loomis probation and incarcerated him for a crime that usually wouldn't carry any time at all. What has society become if we can leave the fate of a person's life in the hands of an algorithm? When we take the recommendation of a machine as truth, even when it seems so unreasonable and inhumane?
Even more disturbing is the fact that the general public doesn't know how COMPAS works. The engineers behind it have refused to disclose how it makes recommendations, and no existing law obliges them to. Yet we're all supposed to simply trust and adhere to everything it says. Reading this story, a few important questions come to mind: How much do algorithms control our lives, and ultimately, can we trust them?
It's been roughly ten years since Eric Loomis's sentencing, and algorithms have penetrated far deeper into our daily lives. From the time you wake up to the time you go to bed, you're constantly interacting with tens, maybe even hundreds, of algorithms. Let's say you wake up, tap open your screen, and do a quick search for a place near you to eat breakfast. In this one act, you're triggering Google's complex algorithm that matches your keywords to websites and blog posts to show you the answers most relevant to you.
When you click on a website, an algorithm is used to serve you ads on the side of the page. Those ads might be products you've searched for before, stores near your location, or, oddly enough, something you've only spoken to someone about. You then try to message a friend to join you for your meal. When you open any social media app today, your feed no longer simply displays the most recent posts by the people you follow. Instead, what you see is best exemplified by TikTok's For You page: complex mathematical equations behind the scenes decide which posts are most relevant to you based on your viewing history on the platform.
YouTube, Twitter, Facebook, and most notoriously TikTok all use these recommendation systems to get you to interact with the content that their machines think is right for you. And it's not just social media—Netflix emails you recommendations for movies to watch based on what you've already seen. Amazon suggests products based on what you previously bought.
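The "complex mathematical equations" behind these feeds boil down to a simple idea: score every candidate post against your history, then show the highest scorers first. Here is a deliberately toy Python sketch of content-based recommendation using tag overlap as the relevance signal; the function names and the scoring rule are illustrative assumptions, not any platform's actual system.

```python
def score(post_tags, history_tags):
    """Fraction of a post's tags that also appear in the user's viewing history."""
    if not post_tags:
        return 0.0
    return len(post_tags & history_tags) / len(post_tags)

def recommend(posts, history_tags, k=3):
    """Return the ids of the k posts whose tags best match the history."""
    ranked = sorted(posts, key=lambda p: score(p["tags"], history_tags), reverse=True)
    return [p["id"] for p in ranked[:k]]

# A user who has mostly watched cooking and travel content:
history = {"cooking", "travel"}
posts = [
    {"id": "a", "tags": {"cooking", "baking"}},
    {"id": "b", "tags": {"gaming"}},
    {"id": "c", "tags": {"travel", "cooking"}},
]
print(recommend(posts, history, k=2))  # ['c', 'a']
```

Real systems replace the tag overlap with learned models trained on billions of interactions, but the loop is the same: everything you watch becomes input for ranking what you see next.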
And probably the most sinister of all, Tinder recommends the person you're supposed to spend the rest of your life with, or at least that night. These might seem like trivial matters, but they're more than that. Algorithms are also used to determine who needs more health care, and when you have your day in court, a computer program may decide whether you'll spend the next decade of your life behind bars for a crime that usually carries no time at all.
One of the most dangerous things about algorithms is the data used to power them: the more data you feed into an algorithm, the better its results. So where do companies get this data? From their users, people like you and me. Most of the time, giving out this information is harmless, but often these companies sell your information to data brokers, who then sell that data to other companies that want to sell you stuff. That's why you keep getting targeted ads from random companies you've never heard of before.
And what's worse is that these data brokers are often tar...