Symmetry of second partial derivatives
So in the last couple of videos I talked about partial derivatives of multivariable functions, and here I want to talk about second partial derivatives. So I'm going to write down some kind of multivariable function; let's say it's sine of x times y squared, sin(x) * y^2.
If you take the partial derivative, you have two options, given that there are two variables. You can go one way and ask: what's the partial derivative of f with respect to x? For that, x looks like a variable as far as this direction is concerned, and y looks like a constant. So we differentiate by saying the derivative of sine of x is cosine of x, since you're differentiating with respect to x, and the y^2 in front just looks like multiplication by a constant, so it stays there as that same constant.
But you could also go another direction. You could ask: what's the partial derivative with respect to y? In that case, you're considering y to be the variable. So now y^2 looks like a variable expression, x looks like a constant, and sine of x just looks like sine of a constant, which is a constant. So the result is that constant, sin(x), multiplied by the derivative of y^2, which is 2y.
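Written out, the function and its two first partial derivatives are:

\[
f(x, y) = \sin(x)\,y^2, \qquad
\frac{\partial f}{\partial x} = \cos(x)\,y^2, \qquad
\frac{\partial f}{\partial y} = 2y\,\sin(x)
\]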
These are what you might call first partial derivatives. And there's some alternate notation here: instead of writing partial f over partial y, you could write f with a little subscript y, and over here, similarly, you'd write f with a little subscript x. Now each of these two partial derivatives is itself a multivariable function: it takes in two variables and outputs a scalar.
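In symbols, that subscript notation is just shorthand for the same thing:

\[
f_x = \frac{\partial f}{\partial x}, \qquad
f_y = \frac{\partial f}{\partial y}
\]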
So we can do something very similar, where here you might apply the partial derivative with respect to x to that partial derivative of your original function with respect to x. It's just like a second derivative in ordinary calculus, but this time we're doing it with partials. When you do it with respect to x, cosine of x looks like cosine of a variable, whose derivative is negative sine of that variable, and y^2 just looks like a constant, so it stays put as y^2.
Similarly, you could go down a different branch of options here and ask: what if you took the partial derivative with respect to y of that whole function, which is itself the partial derivative with respect to x? If you did that, then y^2 now looks like the variable part, so you take its derivative, which is 2y, and what's in front of it looks like a constant as far as y is concerned, so that stays as cosine of x.
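Written out, the two second derivatives coming from the partial-with-respect-to-x branch are:

\[
\frac{\partial}{\partial x}\left(\frac{\partial f}{\partial x}\right) = -\sin(x)\,y^2,
\qquad
\frac{\partial}{\partial y}\left(\frac{\partial f}{\partial x}\right) = 2y\,\cos(x)
\]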
About the notation here: first of all, just as in single-variable calculus, it's common to do a slight abuse of notation and write this as partial squared f divided by partial x squared. When I first learned about these things, that always threw me off, because with the original notation you have the great intuition of nudging the x and nudging the f, but you kind of lose that when you write it this way.
But it makes sense if you think of this partial-partial-x as being an operator that you're just applying twice. Over here, the way that would look is a little bit funny, because you still have that partial squared f on top, but on the bottom you write partial y, partial x. I'm putting them in that order because it's as if I wrote the operators that way: it reflects the fact that first I took the x derivative, and then I took the y derivative.
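As a notation summary, reading each partial-derivative symbol as an operator applied from the left:

\[
\frac{\partial}{\partial x}\left(\frac{\partial f}{\partial x}\right) = \frac{\partial^2 f}{\partial x^2},
\qquad
\frac{\partial}{\partial y}\left(\frac{\partial f}{\partial x}\right) = \frac{\partial^2 f}{\partial y\,\partial x}
\]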
And you could do this on the other side also. It might feel tedious, but it's actually worth doing for a result we end up seeing here, one that I find a little bit surprising. So if we go down the path of taking the partial derivative with respect to x, thinking of it as applied to your original partial derivative with respect to y, then sine of x looks like the variable part and 2y looks like a constant.
So what we end up getting is the derivative of sin(x), which is cos(x), multiplied by that 2y. And a pretty cool thing worth pointing out here, which maybe you take for granted or maybe you find as surprising as I did when I first saw it, is that both of these turn out to be equal, even though we got to them in very different ways. You first take the partial derivative with respect to x and get cos(x) * y^2, which looks very different from sin(x) * 2y.
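For this particular function, the computation down the other branch lands on the same expression:

\[
\frac{\partial}{\partial x}\left(\frac{\partial f}{\partial y}\right) = 2y\,\cos(x)
= \frac{\partial}{\partial y}\left(\frac{\partial f}{\partial x}\right)
\]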
And then when you take the derivative with respect to y, you get a certain expression, and when you go down the other path you also get that same expression. The way you might write this, copying that mixed partial over here, is to say that the version where you do it the other way around, y first and then x instead of x first and then y, is equal to it: these two guys are equal to each other.
And that's a pretty cool result. Maybe in this case, given that the original function is just the product of two things, you can reason through why it's the case. But what's surprising is that this turns out to be true, well, not for all functions; there's a certain criterion. There's a special theorem, called Schwarz's theorem, which says that if the second partial derivatives of your function are continuous at the relevant point, then this symmetry holds.
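One common way to state the theorem (this is the standard textbook phrasing, not a quote from the video): if the mixed second partial derivatives of f are continuous at a point, then

\[
\frac{\partial^2 f}{\partial x\,\partial y} = \frac{\partial^2 f}{\partial y\,\partial x}
\]

at that point.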
But for all intents and purposes, for the kinds of functions you can expect to run into, this is the case: the order of the partial derivatives doesn't matter, which is actually pretty cool. And I'd encourage you to play around with some other functions. Just come up with any multivariable function, maybe a little more complicated than a product of two separate one-variable pieces, and see that it's true.
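If you'd rather check this symbolically than by hand, here's a minimal sketch using sympy; the function f is the one from the video, while g is just an arbitrary messier example I made up for illustration:

```python
import sympy as sp

x, y = sp.symbols('x y')

# The function from the video, plus a made-up messier one to experiment with.
f = sp.sin(x) * y**2
g = sp.exp(x * y) * sp.cos(x + y**2)   # arbitrary example, not from the video

for func in (f, g):
    # Mixed partial: first with respect to x, then with respect to y.
    d_xy = sp.diff(sp.diff(func, x), y)
    # Mixed partial: first with respect to y, then with respect to x.
    d_yx = sp.diff(sp.diff(func, y), x)
    # If the difference simplifies to zero, the two orders agree.
    print(sp.simplify(d_xy - d_yx) == 0)   # prints True for both
```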
And maybe try to convince yourself why it's true in certain cases; I think that would actually be a really good exercise. Just before I go, one thing I should mention is a bit of notation that people commonly use with this second partial derivative: sometimes instead of writing partial squared f over partial x squared, they'll just write f with a little subscript xx.
And over here, this one would be f with a subscript xy: first you did it with x, then y, so in the subscript you write x first and then y. The order of these looks reversed, because with the subscript you're reading left to right in the order you take the derivatives, but with the stacked partial notation you're kind of reading right to left for the order in which the operators get applied. Which means that this other guy over here, where first you did the y and then you did the x, would be f with a subscript yx.
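Putting the two notations side by side, with the subscripts read left to right in the order the derivatives are taken (this matches the convention described above; some textbooks order the Leibniz version differently):

\[
f_{xx} = \frac{\partial^2 f}{\partial x^2}, \qquad
f_{xy} = \frac{\partial^2 f}{\partial y\,\partial x}, \qquad
f_{yx} = \frac{\partial^2 f}{\partial x\,\partial y}
\]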
So those two are just different notations for the same thing, and that can make things a little more convenient when you don't want to write out the entire partial squared f divided by partial x squared, or things like that. And with that, I'll call it an end.