Quadratic approximation formula, part 2
All right, so in the last video, I set up the scaffolding for the quadratic approximation, which I'm calling q, of an arbitrary two-variable function, which I'm calling f. The form that we have right now looks like quite a lot: we have six different terms. The first three are basically taken straight from the local linearization formula and written in their full abstractness, which almost makes it seem a little more complicated than it is.
The next three terms are the quadratic parts. The first is essentially an x squared term: we write it as x minus x naught, squared, so that it doesn't disturb anything from before once we plug in x equals x naught, but we think of it as x squared. The next is essentially an x times y term, with each variable offset by the corresponding x naught or y naught, and the last term is the y squared part.
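In symbols, with (x_0, y_0) as the point we're approximating around and a, b, c as the coefficients we still need to determine, that scaffolding is:

$$
q(x, y) = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0) + a(x - x_0)^2 + b(x - x_0)(y - y_0) + c(y - y_0)^2
$$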
The question at hand is how to fill in these constants, the coefficients in front of each of these quadratic terms, so that q hugs the graph of f as closely as possible. I showed what that hugging means in the very first video. In formulas, the goal is for the second partial derivatives of q to match those of f. For example, if we take the partial derivative of q with respect to x twice in a row and evaluate it at the point of interest, the point about which we are approximating, it should equal the corresponding second partial derivative of f evaluated at that same point. I say "corresponding" because there are multiple different second partial derivatives.
Of course, we want this to be true not just for the second partial derivative with respect to x twice in a row, but for the other ones as well. For example, say we took the partial derivative first with respect to x and then with respect to y; this is called the mixed partial derivative. We want its value at the point of interest to be the same as the mixed partial derivative of f, taken with respect to x and then y, evaluated at that same point.
And remember, for almost all functions that you deal with, when you take this mixed second partial derivative, the order in which you take it doesn't matter: you could do it first with respect to x and then y, or first with respect to y and then x, and these are usually equal. There are some functions for which this isn't true, but we're going to assume we're dealing with functions where it holds, so that's the only mixed partial derivative we have to take into account.
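Written in the subscript notation we'll use in a moment, that symmetry (it holds whenever the mixed partials are continuous) says:

$$
f_{xy}(x_0, y_0) = f_{yx}(x_0, y_0)
$$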
So I'll just get rid of that one there. And then the final condition, just to have it on record, is that we want the partial derivative taken with respect to y twice in a row, evaluated at that same point, to match as well. There's a lot of writing that goes on with these things, but that's just par for the course when it comes to multivariable calculus.
You take the second partial derivative with respect to y of both functions and you want them to have the same value at this point. So even though there's a lot going on here, all I'm really saying is that all the second partial derivative information should be the same for q as it is for f. So let's go up, take a look at our function, and start thinking about what its first and second partial derivatives are.
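Collected together, the three conditions we're imposing at the point of interest are:

$$
q_{xx}(x_0, y_0) = f_{xx}(x_0, y_0), \qquad q_{xy}(x_0, y_0) = f_{xy}(x_0, y_0), \qquad q_{yy}(x_0, y_0) = f_{yy}(x_0, y_0)
$$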
To do that, let me first clear up some of the board so we can actually start computing this second partial derivative. Let's start with the partial derivative with respect to x twice. What we'll do is peel off one of those derivatives: we're taking the partial derivative with respect to x of whatever the partial derivative of this entire expression with respect to x is.
We take it one term at a time. The first term is a constant, so it goes to zero. The second term has the variable x in it, and since it's a linear term, its partial derivative is just the constant sitting in front of it: the value of the partial derivative of f with respect to x evaluated at the point of interest. The next term has no x's in it, so it also goes to zero.
The next term is interesting because it has an x squared in it, so when we take its derivative with respect to x, that 2 comes down: we get 2 times a, whatever the constant a is, multiplied by x minus x naught. The term after that also has an x, but it shows up as a linear term, and since we're taking the partial derivative with respect to x, we treat y as a constant. So what we end up with is b multiplied by y minus y naught, which looks like a constant as far as x is concerned.
The last term doesn't have any x's in it, so that completes the first partial derivative with respect to x. And now we do it again: we take the partial derivative of that expression with respect to x. Maybe I should clear up even more of the board first. The term f sub x of x naught, y naught is just a constant, so it goes to 0.
For the 2 times a times x minus x naught term, taking the derivative with respect to x just gives 2 times a. And the last term doesn't have any x's in it, so that also goes to 0. So conveniently, when we take the second partial derivative of q with respect to x, we just get a constant: 2a. And we want this entire thing to equal the second partial derivative of f, taken with respect to x both times.
Over here I was using the Leibniz notation, but here I'll use the subscript notation for the second partial derivative of f with respect to x, and we want 2a to match whatever that looks like when we evaluate it at the point of interest. To make that happen, we set a equal to one half of that second partial derivative evaluated at the point of interest. This is something we tuck away and remember: we have solved for one of the constants.
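Putting that computation into symbols, differentiating q once and then twice with respect to x gives:

$$
\frac{\partial q}{\partial x} = f_x(x_0, y_0) + 2a(x - x_0) + b(y - y_0), \qquad \frac{\partial^2 q}{\partial x^2} = 2a \quad\Longrightarrow\quad a = \frac{1}{2} f_{xx}(x_0, y_0)
$$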
Now let's start thinking about another one of them. I actually don't have to scroll away, because let's say we just want the mixed partial derivative: instead of taking it with respect to x twice, I'll erase this, and we'll take it first with respect to x and then with respect to y.
We can just edit what we have over here: we already took it with respect to x, so on our second go we take it with respect to y. In that case, instead of getting 2a, let's figure out what we get when we take the derivative of this whole expression with respect to y. The first term is a constant; the next one also looks like a constant, since we're differentiating with respect to y and no y's show up; and the partial derivative of the last part just ends up being b. So again we get a constant, and this time it's just b, without the factor of 2 we had before.
Previously it was 2a, but now it's just b, and this time we want it to equal the mixed partial derivative. So instead of writing f sub x x, I'll write f sub x y, which says you take the partial derivative first with respect to x and then with respect to y. We want b to equal the value of that mixed partial derivative evaluated at the point, so we can simply set b equal to that, and this is another constant we can record.
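In symbols, differentiating that same first partial derivative with respect to y this time gives:

$$
\frac{\partial}{\partial y}\left(\frac{\partial q}{\partial x}\right) = b \quad\Longrightarrow\quad b = f_{xy}(x_0, y_0)
$$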
And now for c. When we're trying to figure out what c should be, the reasoning is almost identical; it's pretty much symmetric. We do everything we did for the x case, except we take the partial derivative with respect to y twice in a row.
I encourage you to do that for yourself; it'll definitely solidify everything we're doing here, since it can seem like a lot of computation. You're going to reach basically the same conclusion you did for the constant a: the constant c is equal to one half of the second partial derivative of f with respect to y, differentiating with respect to y twice, evaluated at the point of interest.
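If you do carry it out, the computation mirrors the one for a:

$$
\frac{\partial q}{\partial y} = f_y(x_0, y_0) + b(x - x_0) + 2c(y - y_0), \qquad \frac{\partial^2 q}{\partial y^2} = 2c \quad\Longrightarrow\quad c = \frac{1}{2} f_{yy}(x_0, y_0)
$$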
So this is the third fact, and the way you get to that conclusion is almost identical to the way we found the one for x. Now, when you plug in these values for a, b, and c, and these are constants, even though we've written them as formulas, you get the quadratic approximation.
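Written out in full, the quadratic approximation is:

$$
q(x, y) = f(x_0, y_0) + f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0) + \frac{1}{2} f_{xx}(x_0, y_0)(x - x_0)^2 + f_{xy}(x_0, y_0)(x - x_0)(y - y_0) + \frac{1}{2} f_{yy}(x_0, y_0)(y - y_0)^2
$$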
It has six separate terms: one for the constant, two for the linear part, and three for the various quadratic terms. And if you want to dig into more details and go through an example or two, I do have an article on quadratic approximations, and hopefully you can step through and do some of the computations yourself as you go.
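If you'd rather check the result computationally than by hand, here is a minimal sympy sketch; the sample function e^x sin(y) and the point (0, 1) are illustrative choices of mine, not anything from the video or the article. It builds q with the coefficients we just derived and confirms that its second partial derivatives match those of f at the point.

```python
# Minimal check of the derivation: build q for a sample function and point,
# then confirm its second partial derivatives agree with f's at that point.
import sympy as sp

x, y = sp.symbols('x y')
x0, y0 = 0, 1                      # point of approximation (arbitrary choice)
f = sp.exp(x) * sp.sin(y)          # sample function (arbitrary choice)

def at(expr):
    """Evaluate an expression at the point of interest (x0, y0)."""
    return expr.subs({x: x0, y: y0})

# Coefficients from the derivation: a = f_xx / 2, b = f_xy, c = f_yy / 2
a = at(sp.diff(f, x, 2)) / 2
b = at(sp.diff(f, x, y))
c = at(sp.diff(f, y, 2)) / 2

q = (at(f)
     + at(sp.diff(f, x)) * (x - x0)
     + at(sp.diff(f, y)) * (y - y0)
     + a * (x - x0)**2
     + b * (x - x0) * (y - y0)
     + c * (y - y0)**2)

# Each difference should print 0: the second partials match at (x0, y0).
print(sp.simplify(at(sp.diff(q, x, 2)) - at(sp.diff(f, x, 2))))
print(sp.simplify(at(sp.diff(q, x, y)) - at(sp.diff(f, x, y))))
print(sp.simplify(at(sp.diff(q, y, 2)) - at(sp.diff(f, y, 2))))
```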
But in all of this, even though there are a lot of formulas and it can be pretty notationally heavy, I want you to think back to that original graphical intuition. Let me actually pull it up here. If you're approximating a function near a specific point, the quadratic approximation looks like this surface: if you were to slice it in any direction, the slice would be a parabola, and it hugs the graph pretty closely.
So it gives us a pretty close approximation. Even though there are a lot of formulas that go into getting it, the ultimate visual, and I think the ultimate intuition, is actually a pretty sensible one: you're just hoping to find something that hugs the function nice and closely. And with that, I will see you in the next video.