
How AI, Like ChatGPT, *Really* Learns


2m read
·Nov 7, 2024

The main video talks about a genetic breeding model of how to make machines learn. This method is simpler to explain, or just show: here is a machine learning to walk, or play Mario, or jump really high. Genetic code is an older code, but it still checks out, and I personally suspect genetic models will have a resurgence in the future as compute power approaches crazy pants.
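To make the breeding idea concrete, here's a toy sketch (my own illustration, not from the video, and the target, fitness function, and numbers are all made up): each bot is just a list of numbers, the best bots of each generation survive, and mutated copies of them fill the next generation.

```python
import random

# Toy genetic "breeding" loop. Each bot is a list of genes (numbers), and
# fitness is how close those genes get to a hidden target behaviour.
# TARGET is a hypothetical stand-in for "walks well" / "plays Mario well".
TARGET = [0.2, -0.5, 0.9, 0.1]

def fitness(genes):
    # Higher is better: negative squared distance to the target.
    return -sum((g - t) ** 2 for g, t in zip(genes, TARGET))

def mutate(genes, rate=0.1):
    # The "breeding" step: copy a parent with small random tweaks.
    return [g + random.gauss(0, rate) for g in genes]

def evolve(generations=200, pop_size=50):
    random.seed(0)  # fixed seed so the run is repeatable
    population = [[random.uniform(-1, 1) for _ in TARGET]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fitter half, refill with mutated children of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in survivors]
    return population[0]  # the best bot bred so far

best = evolve()
print(fitness(best))  # fitness climbs toward 0 as breeding goes on
```

Nobody ever tells the bots *how* to improve; bad variants simply don't get copied, which is why this model is so easy to show on screen.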

However, the current hotness is deep learning and recurrent neural networks, and that is where the linear algebra really ramps up and explainability in a brief video really drops off. But if I had to explain how they work in a footnote, just for the record, it's like this: no infinite warehouse, just one student. Teacher Bot has the same test, but this time Builder Bot is 'Dial Adjustment Bot,' where each dial is how sensitive one connection in Student Bot's head is.

There are a lot of connections in its head, so a lot of dials. A LOT, a lot. Teacher Bot shows Student Bot a photo, and Dial Adjustment Bot turns the dials stronger or weaker to get Student Bot closer to the right answer. It's a bit like adjusting the dial on a radio. Is that still a thing? Do cars have radios still? I don't know, anyway.

You might not know the exact frequency of the station, but you can tell if you're getting closer or further away. It's like that but with a hundred thousand dials and a lot of math, and that's just for one test question. When Teacher Bot introduces the next photo, Dial Adjustment Bot needs to adjust all the dials so that Student Bot can answer both questions. As the test gets longer, this becomes an insane amount of math and fine-tuning for Dial Adjustment Bot.
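The warmer/colder dial game above can be sketched in a few lines. This is a hypothetical toy (real networks use calculus, i.e. backpropagation, to work out every nudge at once, rather than trying each dial one at a time): the student's answer is a weighted sum of a photo's features, and Dial Adjustment Bot nudges each dial a notch up or down, keeping the nudge only if the total error over the whole test shrinks.

```python
# "Dial Adjustment Bot" as a warmer/colder loop over every dial.
# The student, the test, and all numbers here are made up for illustration.

def student_answer(dials, photo):
    # The student's guess: a weighted sum of the photo's features.
    return sum(d * p for d, p in zip(dials, photo))

def total_error(dials, test):
    # How far the student's answers are from Teacher Bot's answer key,
    # summed over every question on the test.
    return sum((student_answer(dials, photo) - answer) ** 2
               for photo, answer in test)

def adjust_dials(test, n_dials, rounds=500, step=0.05):
    dials = [0.0] * n_dials
    for _ in range(rounds):
        for i in range(n_dials):
            # Try this dial a notch up, then a notch down;
            # keep whichever change makes the test score better.
            for delta in (step, -step):
                trial = dials[:]
                trial[i] += delta
                if total_error(trial, test) < total_error(dials, test):
                    dials = trial
    return dials

# A tiny two-question "test": (photo features, teacher's answer).
test = [([1.0, 0.0], 0.7), ([0.0, 1.0], -0.3)]
dials = adjust_dials(test, n_dials=2)
print(total_error(dials, test))  # shrinks toward zero as the dials settle
```

With two dials this is quick; with a hundred thousand dials and a test of millions of photos, checking every dial against every question is exactly the "insane amount of math" the footnote is gesturing at.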

But when it's done, there's a student bot who can do a pretty good job at recognizing new photos, though it still suffers from some of the problems mentioned in the main video. Anyway, that's the most 'baby's first' introduction to neural networks you will ever hear. If it sounds interesting to you and you like math and code, go dig into the details; machines that learn are the future of everything.

Maybe, quite literally, the future of everything, and given what we've put them through, may the bots have mercy on us all.
