
The For You Page Has Ruined Society Forever


22 min read · Nov 4, 2024

Every choice you've ever made is a result of the combination of all the experiences you've had, things you've learned, and people you've met. So, what happens when an algorithm designed to make the most money for corporations decides the experiences you have, things you learn, and people you meet for you?

The For You page on social media platforms like TikTok acts like a warped mirror, reflecting back a distorted vision of reality. As you bask in your reflection, you're slowly drawn into something sinister: a digital mirror that starts with truths but slowly manipulates your desires and beliefs. As trends and viral content shape your perception, you might find yourself wondering where certain thoughts come from, leading you further down a rabbit hole.

Well, we found what's at the end of that rabbit hole, and it isn't pretty. Today, you'll learn the science behind algorithms, the disturbing truth about echo chambers, and the final act of thunderous applause we're all heading toward.

Picture this: you're in a massive control room filled with supercomputers and fancy sensors. In front of you is your trusty smartphone displaying your social media feed. It's the typical setup where your boredom reaches its peak—time to scroll for a bit. Every time you hover over a video for a split second longer than usual, the computer records it. Every time you double-tap, it's recorded. Every share, comment, and even half-scroll is recorded. It's almost like each trace of hesitance or decisiveness is recognized by the machine.

But why? This isn't Skynet; it's just memes and things you like. Well, it's all done to build a profile of you. This profile version of you works based on something instinctive, maybe even primal. Generally, when you interact with your feed, it's not like there's any anxiety or thought going into it. You're just doing what feels natural.
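
The profile-building described above can be sketched as a toy engagement logger. To be clear, this is a hypothetical model—the signal names and weights below are invented for illustration, not TikTok's actual (non-public) internals:

```python
from collections import defaultdict

# Hypothetical weights: how strongly each interaction type signals interest.
SIGNAL_WEIGHTS = {
    "watch_time_s": 0.1,  # per second watched
    "like": 2.0,
    "share": 3.0,
    "comment": 2.5,
    "rewatch": 1.5,
}

def build_profile(events):
    """Aggregate raw interaction events into per-topic affinity scores."""
    profile = defaultdict(float)
    for event in events:
        weight = SIGNAL_WEIGHTS.get(event["signal"], 0.0)
        profile[event["topic"]] += weight * event.get("value", 1.0)
    return dict(profile)

# A short scrolling session: every hesitation and double-tap becomes data.
events = [
    {"topic": "fitness", "signal": "watch_time_s", "value": 45},
    {"topic": "fitness", "signal": "like"},
    {"topic": "cooking", "signal": "watch_time_s", "value": 8},
    {"topic": "fitness", "signal": "share"},
]
profile = build_profile(events)
# fitness accumulates 45*0.1 + 2.0 + 3.0 = 9.5, dwarfing cooking's 0.8
```

The point of the sketch is that no single interaction matters much; it's the accumulation of hundreds of tiny weighted signals that makes the profile eerily accurate.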

Here's the twist, though: soon enough, the computer figures out what makes you tick. Hacking your habits is just the first step. Once that's done, the next phase of the takeover begins, and it's not just about throwing in stuff you like. It's about gaining your trust, keeping you watching, and ushering in the end of free will.

The TikTok For You page, or FYP, is a prime example of modern algorithms and the power they hold in the 21st century. Before the For You page came along, algorithms generally felt like some mysterious back-end way to control users, mostly shrouded in secrecy. We finally got a public idea of how this stuff works in 2021, when we saw major companies like Meta scrambling in U.S. congressional hearings, attempting to explain how their algorithms functioned. Sure, that enlightened the public a bit; however, it wasn't until TikTok's boom that most people got a real look at algorithmic supremacy and the dangers we stand to face.

It's like a new arms race and space race rolled into one, except we are the nuclear arsenal, and the puppet masters can't wait to throw us at ourselves. Algorithms offer an advanced way to blur the lines between personalization and manipulation. First they analyze the stuff you've interacted with; then they predict what you will interact with next. Remember, it's not a human; it's a diligent and scarily efficient machine.

It's easy to fall into the trap of mindless scrolling, consuming content that won't add any meaningful value to your life. But the first step to preventing brain rot is to take control of the content you consume and your online experience in general. Taking control of your online experience means being able to watch what you want, when you want, without being limited by geography or getting blocked by your ISP.

That's where today's sponsor, NordVPN, comes in. With NordVPN, you can take full control of your online experience. If you want to stay "home" online while you're traveling, you can do it in one click. I was recently in the UK for work when I realized Gladiator wasn't available on Netflix there, so I used NordVPN to switch my location back to the U.S. and watched the movie.

NordVPN also keeps all your internet traffic private and safe from malware, thanks to its Threat Protection Pro. So, you don't only have full control; you also have privacy and protection whenever you're online. You never have to worry about passwords, credit card information, your home address, and other personal details getting leaked. With apps for all major platforms—including macOS, Windows, iOS, and Android—and the ability to protect up to ten devices with just one account, you get complete control, freedom, and protection with NordVPN.

To try out NordVPN, go to nordvpn.com/aperture or click the link at the top of the description to get four months free on a two-year plan. If you don't like it, there's a 30-day money-back guarantee, so it's completely risk-free—you might as well try it out.

Back to our story. For TikTok, two main things drive the algorithm: machine learning and behavioral data. There's a lot of underlying engineering behind all of it, but those concepts stand at the forefront. The algorithm isn't just there to react to your actions; it's learning from you, refining its predictions, and improving. Unlike social media a decade ago, TikTok's algorithm is very aggressive. Older platforms like Facebook and pre-dumpster-fire Twitter were comparatively benevolent: they showed users content from people they follow and sprinkled in suggestions once in a while.

Meanwhile, on TikTok, the Following tab simply exists to give you the illusion of choice. Don't believe me? Answer truthfully: how often do you find yourself scrolling through your TikTok Following tab? Probably 5% of the time you spend on the app—maybe less. Even crazier, the Following tab is designed to feel subpar compared to your For You page. Once you've gone through 15 or so videos on your Following tab, it starts to recommend the same videos you saw five minutes ago.

Now, sure, you could refresh it hoping for a change, but at that point, the algorithm knows you'll pick the easier road, which is to swipe left and head back to the For You page. It's like: do you want to go back to the thing that just disappointed you, or give in and confide in the warm embrace of an algorithm that knows exactly what you should be seeing right now?

The For You page's prominence is all-encompassing. You're constantly exposed to videos that are hyper-tailored to certain aspects of its internal idea of you. Slowly but surely, it lulls you into being comfortable with the idea that it has your convenience in mind. TikTok doesn't just care about the videos you like, though. Overt positivity is never going to give advertisers the most money, so trust that they are taking things a step further.

In this case, TikTok learns about your preferences, desires, vices, and even your fears. Let's break down the stuff they're watching. If you're an amateur TikTok creator, take this as a free mini crash course on what you should be looking out for.

The algorithm categorizes videos based on metrics like hashtags, metadata, and most recognizably, sounds. So, just so you know, #FYP is a very redundant hashtag to put on your TikTok. Hashtags are best used for sorting content into categories like gaming, dance videos, cooking, or other sub-niches and interests. Next, we've got metadata. In a nutshell, this is just data about data.

The computer has your information as a user, but it also has to make sense of all of it in order for the learning part of machine learning to actually matter. Metadata sorts all the information it gets from you and puts it into categories that are arranged internally. Another interesting way TikTok groups content is according to a user's emotional state.

If you're feeling sad, happy, anxious, or excited while watching certain types of videos, it records that data and tries to push content that matches your mood. With advanced emotion tracking features, TikTok is able to sense your facial expressions or tone of voice in videos. Think about it: when you watch content with a sad tone, you might sigh or have a solemn expression. TikTok actually keeps track of these moments through your camera and microphone, which makes it easier for the algorithm to push content matching your specific mood.

Finally, there are sounds. This one is pretty simple. Did you enjoy a video using a specific sound? TikTok will recommend even more videos with that sound to you. Why? Because sounds tend to have a certain format behind their use, so it's easier for the algorithm to push successful videos that use the same ones.
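
Taken together, hashtags, metadata, and sounds act as feature tags on every video, and the recommendation step can be sketched as simple feature matching against a user's profile. Again, this is an illustrative toy with made-up tags and scores, not the real ranking system:

```python
def score(video, profile):
    """Score a video by how strongly its tags overlap the user's affinities."""
    return sum(profile.get(tag, 0.0) for tag in video["tags"])

def recommend(videos, profile, k=2):
    """Return the k highest-scoring videos for this user."""
    return sorted(videos, key=lambda v: score(v, profile), reverse=True)[:k]

# Affinity scores learned from past behavior (hypothetical values).
profile = {"gaming": 5.0, "cooking": 2.0, "sound:oh_no": 3.0}

videos = [
    {"id": "a", "tags": ["gaming", "sound:oh_no"]},  # 5 + 3 = 8
    {"id": "b", "tags": ["cooking"]},                # 2
    {"id": "c", "tags": ["dance", "sound:oh_no"]},   # 3
]
top = recommend(videos, profile)
# → videos "a" then "c": the familiar sound outranks an unmatched niche
```

Notice how a tag the user has never searched for ("dance") still surfaces by riding along with a sound they already like—that's the quiet mechanism behind "how did it know?"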

All social media platforms tend to have some type of recommendation algorithm embedded into them; however, TikTok's For You page stands at the top due to some critical differences. The For You page is wicked fast; within just a few swipes, it can start tailoring your content to an eerie degree of efficiency. Meanwhile, platforms like Meta tend to take much longer to fine-tune what they show you.

That's probably because, unlike TikTok's quick-fire algorithm, they're actually obligated to care about the people you follow and the relationships you've built over time. These older platforms have entire legacies based on user-curated selections. Now the formula for content consumption has changed, and it's hard for them to keep up. On platforms such as YouTube and Facebook, users tend to come across suggested content in random bits. There's no rush to it—at least, that's how those platforms were in the past. Nowadays, they're trying to do the TikTok thing, but with inferior algorithms.

Meanwhile, TikTok has condensed the entire experience into one giant endless scroll. Even scarier, every video you see is hyper-focused in its recommendation for you—no distractions, just bull's eyes.

All right, let's take a step back. The algorithm is all-knowing and all-powerful, at least as far as we've explained it thus far. But this content has been tailored towards your preferences. Sure, it's addicting, but that's not necessarily a world-ending type of bad. So, why should anyone care?

Here's the part where things take a darker turn. Your favorite actor just wore the most amazing outfit for a red carpet appearance. You are in awe. In that moment, you decide you want a similar body type, so you hop onto TikTok and check out some dieting tips. You watch a few videos, save info in your notes app—the usual routine. Later that day, you open TikTok again, and it has recorded your diligent research on dieting. Your use of keywords was particularly insightful for the machine—weight loss, healthy diet, home exercises, and the like.

It can sense that you enjoy videos with noticeable changes in a short span of time, and it's ready to make things happen. Your algorithm begins to feed you more intense fitness content—none of the mild stuff. Before you know it, you're in a rabbit hole of extreme dieting content, not because you asked for it, but because you're hooked, and it knows that you can't help but watch.

The algorithm tends to manipulate your preferences by nudging you into content that might be more engaging but not necessarily beneficial. Subconsciously, your desires are reprogrammed without any conscious input. You don't just want to lose weight anymore; you suddenly find yourself interested in extreme weight loss methods from videos on your feed.
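
That nudge toward "more engaging but not necessarily beneficial" can be modeled as a feedback loop: the feed reallocates itself toward whatever pulled slightly more engagement last round. A deliberately simplified simulation, with hand-picked numbers chosen purely for illustration, shows how a small edge compounds:

```python
def run_feedback_loop(rounds=20):
    """Show how a small engagement edge compounds into feed dominance.

    Two content types; 'extreme' content gets marginally more engagement
    per impression, and the feed re-weights its mix proportionally.
    """
    share = {"moderate": 0.9, "extreme": 0.1}       # initial feed mix
    engagement = {"moderate": 1.0, "extreme": 1.3}  # pull per impression
    for _ in range(rounds):
        pull = {k: share[k] * engagement[k] for k in share}
        total = sum(pull.values())
        share = {k: pull[k] / total for k in pull}  # feed re-weights itself
    return share

final = run_feedback_loop()
# After 20 rounds the 'extreme' slice dominates the feed,
# even though it started at just 10%.
```

Even a modest 30% engagement edge per impression is enough to flip a 90/10 feed into one dominated by the more extreme content within a couple dozen iterations—no malice required, just compounding.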

When you open TikTok, you're probably not there to take control. You're there to be entertained, and that's that. Even the act of tapping on the icon lights up your brain with dopamine—the same thing happens when you have a bite of your favorite dessert or ace a tough challenge. Each video you like is a quick hit.

This is where the For You page gets both brilliant and dangerous. The algorithm knows that unpredictable rewards tend to trigger more dopamine. It's like a magician pulling rabbits out of a hat, keeping your brain chasing that next rush, video after video. The longer you stay, the deeper the loop goes.
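
The "unpredictable rewards" mechanic is essentially a variable-ratio reinforcement schedule: the payoff arrives after a random number of actions rather than a fixed one. A minimal simulation (the hit probability here is arbitrary, purely for illustration):

```python
import random

def scroll_session(n_swipes, hit_probability=0.2, seed=42):
    """Simulate swipes where each video is a 'hit' with fixed probability.

    The mean reward rate is predictable, but the *timing* of hits is not—
    that unpredictability is what a variable-ratio schedule exploits.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible session
    hits = [rng.random() < hit_probability for _ in range(n_swipes)]
    gaps = []  # swipes elapsed between consecutive hits
    last = -1
    for i, hit in enumerate(hits):
        if hit:
            gaps.append(i - last)
            last = i
    return sum(hits), gaps

n_hits, gaps = scroll_session(1000)
# Roughly a fifth of swipes pay off, but the gaps between hits vary widely.
```

The average payoff rate is stable, but the gaps between hits are all over the place—and it's precisely that uncertainty, not the rewards themselves, that keeps a brain swiping "just one more."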

And here's where the problem lies: that endless loop doesn't challenge your brain or help it grow. You're being spoon-fed content that keeps you passive and engaged. It feels like you're learning, but there's something missing—you're not actually fulfilled. And worst of all, this loop can lead to deep-seated consequences.

It all starts off innocently enough, and there's no better example than this guy: meet Caleb Cain. In his early 20s, Caleb—like most men his age—was pretty lost. To let off some steam, he turned to YouTube, where he'd watch motivational videos and gaming content. While browsing, he came across Stefan Molyneux, a far-right YouTuber who tends to get recommended when people watch videos from Joe Rogan or Jordan Peterson.

The video from Stefan was pretty tame—some content on self-improvement—and Caleb could relate to his difficult childhood. Slowly, though, things began to change. It wasn't just self-improvement anymore; the bait was cast and bitten, and Caleb found himself drawn to a recommendation algorithm that pushed him towards more radical creators. Eventually, he realized how far he had gone and pulled himself out of that rabbit hole. Not many young men are so lucky.

YouTube's radicalization pipeline has been the subject of numerous studies, and experts have noticed a trend: young men seem to begin with harmless interests, like Caleb did. They start watching innocent-enough content centered on gaming, fitness, and motivation. Eventually, the algorithm learns what keeps them engaged, and they are exposed to more polarizing content.

Some people are able to resist and filter their feeds; however, the overwhelming majority are subject to a feed filled with controversial political ideas. It's not just random; it's made to feel like the natural evolution of each young man's interests. Overall, the alt-right pipeline isn't anything new, especially if you've been on the internet for a while. It's a topic that has been beaten to death, sure, but the horse still has a pulse, and maybe a few more smacks wouldn't hurt.

We've talked about the algorithm and how it keeps you hooked, but there's an even bigger issue beneath it all: the pressure keeps building, and the pot is welded shut. The mental health toll of social media is an even more widespread issue. Dealing with your For You page can feel like being trapped in a cycle where your self-worth is tied to external validation. Even if you're just a lurker or passive user, all the signs on your feed are designed to make your lifestyle feel wrong.

It's like there's an invisible voice claiming that you should be posting consistently, sharing your life with strangers, and craving the external validation tied to views, likes, and shares. Now, if you're the type that enjoys posting, there's a lot to worry about. Plus, the bigger you get, the more pressure builds up. If you think you can handle the feedback loop, then go for it.

Here's the catch, though: external validation is addictive. The more you get, the more you crave, and soon you'll find yourself checking your phone constantly, waiting for the next like or comment to give you that quick hit. You can take it from the experts: according to the Journal of Abnormal Psychology, teenagers who spend more time on social media are more likely to experience symptoms of depression and anxiety. If you're a teenager, that's no surprise in the slightest. It's just another fact, like hearing the sky is blue.

Here's the thing, though: even though it feels obvious, the impact on your mental health is real. That's why we'll explore real things you can try in order to get better.

All right, when was the last time you and your friends were all watching the same stuff? Sports? Nah, one of them's working too late. A Marvel show? Well, Marvel's falling off—maybe later. Severance? Wait, people actually use Apple TV? It's confusing. In real life, trying to stay on the same wavelength as your community feels almost impossible.

There's barely any chance for a water cooler moment anymore—a moment to discuss last night's big TV show or football game. Everyone's watched something different on their own. But you know where you can find people who watch your preferred content? That's right—social media. Cultural monoliths once bound us together, like it or not; everyone had a common topic to discuss. Now, even in our households, it's like there are completely different realities taking place.

Someone's binge-watching cooking tutorials while the person in the next room is drowning in political memes and shocking headlines. Gone are the days when social media was a place to follow like-minded people from your community; now, even basic conversations with old friends feel like trying to decode a cryptic language. There are so many inside jokes you'll never understand, simply because they're scrolling in a different timeline.

You've probably felt it too—a major shift that happened with the internet in 2016: the American presidential elections. Prior to those elections, America's divisiveness was mostly prevalent among older millennials and Baby Boomers. Eight years later, and it's like things have flipped upside down. Now, loud voices from the left and right dominate the discourse, debating ceaselessly over countless issues.

It's no longer about politics, but culture, identity, and everything in between. If we had to pick the most insidious effect of the For You page, echo chambers would be at the top of our list. Take a look at political content—left-leaning, right-leaning, or somewhere else. Once the machine learns your preferences, it starts to narrow your exposure, serving a steady stream of content that aligns with one specific political ideology.

Over time, this makes viewers develop a tough external shell—one that rejects opposing viewpoints and keeps people firmly rooted in their beliefs. It gets even worse, though. People who hold different opinions aren't just to be disagreed with; they're to be distrusted. Along the way, you might start to question everyone's intelligence or morality.

Why does this happen? Well, most likely your For You page has led you toward a worldview that props itself up as the objective truth. In extreme cases, this leads to radicalization; at its least extreme, it leads to division. Make no mistake: the world has always been filled with polarization. Human beings enjoy a good debate—PlayStation or Xbox, Yanny or Laurel, the harmless stuff.

Nowadays, though, everything is at an extreme. With social media algorithms, the problem gets exacerbated, turning harmless differences into battlegrounds of belief.

All right, all right, do we sound a bit too centrist for your tastes? Allow us to paint a picture of radicalization. To do that, we'll have to talk about memes. Yes, memes! Quick-witted, funny, and spreading like wildfire, everybody loves a good meme. In just a few seconds, a meme can strongly convey an idea, thanks to the way they work.

The best memes are easy to digest. Yet memes aren't just light-hearted and fun, at least not all the time. They can be powerful tools of manipulation, and manipulation is a key word that repeats itself down this For You page rabbit hole. Most memes start off with a kernel of truth. They present information in a way that feels factual, even when it's oversimplified or outright false.

If you've played video games for a while, you're probably familiar with the term NPCs—non-playable characters. Around 2018, a meme started off where people who weren't part of a certain community were considered NPCs. It's nice to have your own little bubble where everyone else who doesn't get it feels like a random computer-generated character. It's relatable for gamers, and it fosters a sense of community with your fellow non-NPCs.

And now that there's an inside joke with the community, it's easier to influence the people within. Take it from the incel communities—what are the truths they begin with? Let's be honest: dating in the modern world can be difficult, and many people—particularly guys—tend to deal with a fair amount of rejection, loneliness, and feeling overlooked in society, especially when their social media is filled with idealized relationships and a lot of value placed on physical attractiveness.

These are real emotions, and many people can relate to them. If you encountered a YouTuber who voiced these frustrations, you'd probably nod in agreement and feel inclined to hear what else they have to say. The problem is that those frustrations only get amplified within a community of like-minded, downtrodden young men.

What starts off as feelings of disappointment can quickly turn into a toxic environment—one that reinforces a victim mentality. It becomes an echo chamber, and over time, this worldview becomes more rigid and extreme. Often, you're not even aware you're being lied to. It's like a part of you has found comfort in finally hearing a voice willing to tell the truth instead of dancing around a topic.

It's nice to have a trustworthy source that's willing to spill nothing but facts; it's a comforting feeling to trust a creator. But what if we've lied to you before? Earlier in this video, we discussed TikTok's content categories, highlighting how the platform's algorithm creates the perfect For You page for each user.

One of the things we mentioned was how TikTok is capable of tracking a user's emotional state through their smartphone camera and microphone. That was just a lie. It's not a complete stretch to imagine that the platform is capable of doing such things; however, it's factually incorrect. If we hadn't pointed this out, that little kernel of misinformation could have created a myriad of reactions.

Some viewers might react with increased anxiety and paranoia, while others might begin to form inaccurate judgments about data privacy. And on the extreme end, we might even get a few "birds are CIA robots" fans from the Aperture fan base—which, while entertaining, isn't really what we're going for.

Here's where things get philosophical. One of the most unsettling questions of the digital age is this: are we really making our own choices anymore, or are we simply following fragments of free will, scattered like fireflies leading us deeper into an inescapable maze?

The death of free will is a question that has haunted our species for centuries. Sentience is a privilege that took us to the stars, but the bane of our existence lies in the arrogance of believing we're always in control. In philosophy, there's a concept that perfectly encapsulates this modern dynamic we have with the For You page: it's called determinism.

It's the idea that all events, including human actions, are predetermined by pre-existing causes. Think of your For You page as a result of your past behaviors—your browsing patterns, interactions, and clicks. All those past actions are shaping the digital choices you'll make tomorrow. It's an invisible feedback loop that subtly guides your behavior, all without you knowing.

We've actually made a video on this before, and you should check that out if you'd like a deep dive into the dangers of infinite scrolling. The internet wasn't always this way. For a long time, it was a peaceful beach filled with endless waves of opportunity. Much like what we did to real beaches, the internet is no longer pristine.

The seashells of novelty are gone, and in their place we've got a shoreline littered with digital garbage—subtle, ironic ads that blend into the landscape. It's almost poetic, being reminded of Coca-Cola's existence by a plastic bottle that washed ashore. Back then, you would type random searches into Google, click on fancy new blogs, stumble upon obscure forums, and enjoy the beauty of it all.

The internet was vast, decentralized, and designed for us to get lost in the endless possibility of what we might find. No Reddit thread filled with armchair experts deciding the answers to every question. Sure, modern answers are more useful, but there was something charming about the randomness and the downright ridiculous replies people usually got to questions.

There was no SEO sorting for common sense. It was just people being their authentic, humorous, and stubborn selves. The content wasn't curated for metrics in a boardroom somewhere. Now discovery is gone, and in its place, we have pure consumption.

The algorithm ensures that you are catered to, which sounds great in theory, but it limits your exposure to the unexpected, the uncomfortable, and the unfamiliar. Science has shown that our best learning happens when we confront the unknown. It's no fun being told what's right; it's better to find solutions through trial and error—acknowledging the disappointment of failure, stepping up, and finding the courage to seek answers.

Journey before destination: would it have been satisfying if Frodo and the fellowship had marched straight into Mount Doom without a hitch—a tale of really competent hobbits solving every problem with ease? Who would care about that? The journey, with all of its painful obstacles, is what made their story meaningful.

This raises an important question: if the internet has become so controlled and personalized, is there room left for true autonomy, or are we just passive consumers led through a digital landscape where thoughts and choices are predetermined by the platforms we use? Nowadays, one of the clearest examples of free will's erosion is hyper-targeted advertising.

Tech companies have perfected the art of predicting consumer behavior on a dangerous level. It's to the point where they might be able to influence purchasing decisions without the user ever realizing what's going on. Now, you might want to believe that you're choosing to buy a certain product. Meanwhile, in reality, you've been nudged in that direction for weeks, all thanks to the subtlety of tailored content.

And it's not just products we should be concerned about; algorithms are influencing the way we vote, the media we consume, and even the people we spend our lives with. In today's digital space, tech giants like Google stand at the top of a massive row of dominoes, with roughly 92% of all search traffic going through Google. It's hard to trust results provided by a single organization.

A recent court case against Google has raised serious concerns about how tech companies can manipulate users through their monopoly on information. With Google's search algorithm—much like TikTok's For You page—users aren't just given the most relevant information; they're given the information deemed most likely to keep them engaged. Unfortunately, that isn't always in our best interests.

Where are we headed with all this? The For You page's influence reaches far beyond entertainment, trailing down into our politics, culture, and even how we perceive each other. The insidious machine has forced our society into bubbles of isolation, and we're floating with uncertainty about when they might burst.

Billions of bubbles popping simultaneously, heralding the end of liberty to thunderous applause. Convenience is amazing; who doesn't love access to stuff at the push of a button? However, every warning about dystopia has been cautionary about the erosion of freedom under the guise of comfort and convenience.

Stories like Aldous Huxley's "Brave New World," published in 1932, explored a world sent into dystopia through the manipulation of pleasure instead of overt authoritarian oppression. Huxley's vision was a society largely pacified by drugs and superficial pleasures. On a larger scale, we're being pacified with constant access to modern pleasures that shape our preferences and perspectives without us realizing it.

Recently, ADHD diagnoses have skyrocketed, no doubt helped along by the attention economy we live in. It's hard for minds to stay focused in a world of convenience and instant dopamine rushes. We've seen some seismic shifts in culture and politics since algorithms became more advanced. Think about how quickly internet trends transition into the real world.

Take the infamous "devious lick" trend, where TikTok challenges led teenagers to steal or vandalize school property for virality. What started as a joke ended up costing schools thousands of dollars in damages and disrupting entire communities. And it's not just harmless pranks or petty vandalism that the For You page has shaped. More serious societal movements, from radical political ideologies to the manosphere, have seen questionable ideals gain traction.

The "Little Dark Age" edits of dudes in military gear have slowly transformed into a soundtrack for neo-Nazis, who create edits in approval of dangerous ideologies. This isn't the first time we've seen a harmless idea turn into a vector for misinformation. The most famous example is Pepe the Frog—a once-innocent cartoon character transformed into a symbol of hate.

Somehow, it started off as an internet meme, but now the character is forever attached to a complicated, politically charged history. It's not just Pepe, though; the lines between humor and political discourse have all but faded. Nowadays, it's hard to distinguish a dog whistle or coded message from a regular, harmless joke. This is by design.

The For You page's focus on engagement transcends all, and these are the consequences. When we look at the larger picture, the For You page's true power is clearer than ever. It's not just shaping trends and political beliefs; it's reshaping our very sense of self. With the ability to influence your identity, it slowly reinforces pre-existing biases, limiting the growth of critical thinking skills.

Ultimately, we have to look in the mirror once more. Now that we can see the true mechanisms of our digital selves, we're beginning to realize that the reflection lacks a distinct element of humanity. Sure, this version of ourselves that the algorithm recognizes is brilliantly curated; it's almost indistinguishable from reality. But there's something missing—almost like a gift from our evolutionary ancestors.

The uncanny valley kicks in, and we instinctively recognize the inhuman. So, what does the algorithm get wrong? Potential. This machine has only ever understood humanity based on data it's been fed. Meanwhile, every person on earth is a culmination of an almost infinite, unfathomable amount of coincidence—crazy, odd, and downright ridiculous moments that no algorithm could ever predict.

The chaos, the randomness, is what truly defines us. And if chaos got us this far, why surrender to a machine's shallow understanding of order? As a species, the time has come to break out of these invisible cages. We have to actively seek out new perspectives, challenge our own biases, and take conscious control of our digital consumption. Only then can we avoid becoming mere products of an algorithm.

Standing at this crossroads, it's easy to feel overwhelmed by the sheer power of algorithms that are able to shape our identities. But the truth is that we've evolved beyond the control these machines are capable of establishing. We just need to tap into the confidence we found millions of years ago when we discovered new ways to live.

So, the next time you open up your For You page, remember: it's not just entertainment or a distraction; it's shaping you in ways you might not even realize. The question is, will you let it, or will you step outside the algorithmic bubble, embrace the unknown, and rediscover your true potential? The choice, as always, is yours.
