
Introduction to residuals and least squares regression


5m read · Nov 11, 2024

So I'm interested in finding the relationship between people's height in inches and their weight in pounds. I'm randomly sampling a bunch of people, measuring their height, measuring their weight, and then for each person, I'm plotting a point that represents their height and weight combination.

For example, let's say I measure someone who is 60 inches tall (that's 5 feet) and weighs 100 pounds. I'd go to 60 inches and then up to 100 pounds, and plot the point (60, 100) right over there. One way to think about it: height is being plotted along our x-axis, and weight along our y-axis. So this person's point is (60, 100), representing 60 inches and 100 pounds.

So far, I've done it for 1, 2, 3, 4, 5, 6, 7, 8, and 9 people, and I could keep going. But even with this, I could say: look, there appears to be a roughly linear relationship here, and it looks positive; generally speaking, as height increases, so does weight. Maybe I could try to put a line here that approximates this trend, so let me try to do that.
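Before eyeballing a line, we could quantify the "roughly linear, positive" impression with a correlation coefficient. A minimal sketch, using a made-up sample of (height, weight) pairs invented purely for illustration:

```python
# Hypothetical sample of (height in inches, weight in pounds) pairs,
# invented here just to illustrate checking for a positive linear trend.
heights = [60, 62, 65, 68, 70]
weights = [100, 120, 140, 155, 170]

def pearson_r(xs, ys):
    """Pearson correlation coefficient: strength and direction
    of a linear relationship, between -1 and +1."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(heights, weights)
# A value of r close to +1 supports the "positive, roughly linear" reading.
```

For this invented sample the coefficient comes out close to +1, which is the kind of evidence that would justify fitting a line at all.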

This is my line tool. I could think about a bunch of lines; something like this seems like it would be wrong because most of the data is below the line, so that doesn’t seem right. I could do something like this, but that doesn't seem like a good fit either; most of the data seems to be above the line.

Once again, I'm just eyeballing it here; in the future, you will learn better methods of finding a fit. But something like this looks about right. You could view that line as a regression line. We could describe it as (y = mx + b), where we would have to figure out the slope (m) and the y-intercept (b).

We could figure those out based on what I just drew, or we could think of this as weight being equal to our slope times height, plus whatever our y-intercept is. Or, since the vertical axis is weight, you could think of it as a weight intercept. Either way, this is the model I'm eyeballing; this is my regression line, something I'm trying to fit to these points.
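The eyeballed model is just a linear function of height. A minimal sketch, where the slope and intercept values are hypothetical, chosen only so the line predicts 150 pounds at 60 inches as in the example that follows:

```python
# The eyeballed model: weight = m * height + b.
# m = 2.5 and b = 0.0 are hypothetical values, chosen here only so that
# the line predicts 150 pounds at a height of 60 inches.
def predict_weight(height, m=2.5, b=0.0):
    """Predicted weight (pounds) for a given height (inches)."""
    return m * height + b

predicted = predict_weight(60)  # 2.5 * 60 + 0.0 = 150.0
```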

Clearly, one line won't be able to go through all of these points; for many of them (though not all) there is going to be some difference between the actual value and what the line would have predicted. That difference, between the actual value for a point and what the model would have predicted given, say, the height, is called a residual. Let me write that down: there is a residual for each of these data points.

For example, if I call this right here point 1, the residual for point 1 is going to be: for a height of 60 inches, the actual weight is 100 pounds. From that, we subtract what the model would have predicted. To find the prediction, I substitute 60 into this equation, so it would be (m × 60 + b).

I could write it as (60m + b). Once again, I would take the 60 inches, put it into my model, and ask: what weight would it have predicted? Just for the sake of having a number here, let me get my line tool out and draw a straight line up from that point.

So from this point, let me draw a straight vertical line up to my regression line. It looks like the model would have predicted about 150 pounds. The residual here is going to be 100 minus 150, which is equal to negative 50. A negative residual is when your actual is below your predicted.
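The residual calculation itself is one subtraction. A minimal sketch, using the numbers from the example (actual 100 pounds, predicted roughly 150 pounds):

```python
def residual(actual, predicted):
    """Residual: the actual value minus the model's prediction.
    Negative when the actual falls below the line, positive when above."""
    return actual - predicted

# Point 1: actual weight is 100 pounds, the model predicts about 150 pounds.
r1 = residual(100, 150)  # -50: a negative residual (actual below predicted)
```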

So this right over here, this is (r_1); it is a negative residual. If you tried to find, say, the residual for this point over here, this (r_2), it would be a positive residual, because the actual is larger than what the model would have predicted.

A residual is good for seeing how well your line, your model, fits a given data point, or how a given data point compares to the model. But what you probably want to do is think about some combination of all the residuals and try to minimize it.

Now, you might say, well, why don't I just add up all the residuals and try to minimize that? But that gets tricky, because some are positive and some are negative. A big negative residual could counterbalance a big positive residual; they would add up to zero, and it would look like there's no residual at all. So instead, you could add up the absolute values.

You could say: let me take the sum of the absolute values of all the residuals, and then change (m) and (b) for my line to minimize that. That would be one technique for creating a regression line. But another way to do it, and this is actually the most typical way you will see in statistics, is to take the sum of the squares of the residuals.
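The cancellation problem, and the two fixes, can be seen directly with numbers. A minimal sketch, where the residual values are made up for illustration and chosen so the raw sum cancels to zero:

```python
# Three ways to combine residuals into one number.
# These residual values are invented for illustration; they are
# deliberately chosen so the raw sum cancels to exactly zero.
residuals = [-50, 20, -10, 35, 5]

sum_raw = sum(residuals)                  # 0: positives and negatives cancel
sum_abs = sum(abs(r) for r in residuals)  # 120: absolute values can't cancel
sum_sq = sum(r ** 2 for r in residuals)   # 4250: squares can't cancel either
```

The raw sum is useless as a measure of fit here (it says zero total error despite five misses), while both the absolute-value sum and the squared sum register the misfit.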

When you square something, whether it's negative or positive, the result is positive, so that takes care of the issue of negatives and positives canceling each other out. And when you square a number, large residuals become even larger relative to small ones. Think about it with regular numbers: 1, 2, 3, 4 are each one apart from each other. But if I square them, I get 1, 4, 9, 16, and they get further and further apart.
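The widening gaps are easy to check directly:

```python
nums = [1, 2, 3, 4]                       # consecutive, each one apart
squares = [n ** 2 for n in nums]          # [1, 4, 9, 16]
# Gap between consecutive squares grows: 3, 5, 7, ...
gaps = [b - a for a, b in zip(squares, squares[1:])]
```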

So the larger a residual is, the bigger the share of the sum of squares it represents once squared. What we'll see in future videos is that there is a technique called least squares regression, where you can find an (m) and a (b) for a given set of data that minimizes the sum of the squares of the residuals. That's valuable, and one reason it is used the most is that it really takes into account significant outliers, points that sit pretty far away from the model.

Something like this outlier is going to be weighted a little more heavily by least squares regression, because when you square its residual, it becomes an even bigger factor in the sum. But this is just a conceptual introduction; in future videos, we'll calculate residuals and actually derive the formula for figuring out an (m) and a (b) that minimize the sum of the squares of the residuals.
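The formula derived in those future videos has a well-known closed form for a single predictor: the slope is the covariance of x and y divided by the variance of x, and the intercept follows from the means. A minimal sketch of that standard result:

```python
def least_squares_fit(xs, ys):
    """Simple least squares regression in closed form.
    Returns (m, b) minimizing the sum of squared residuals for y = m*x + b:
    m = cov(x, y) / var(x), b = mean(y) - m * mean(x)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - m * mx
    return m, b

# Perfectly linear data recovers the exact line y = 2x + 1.
m, b = least_squares_fit([1, 2, 3], [3, 5, 7])  # m = 2.0, b = 1.0
```

On real, noisy data like the height-weight sample, the returned line won't pass through every point; it is simply the one whose squared residuals sum to the smallest possible value.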
