
Laplacian intuition


5m read · Nov 11, 2024

So here I'm going to talk about the Laplacian. The Laplacian is a certain operator in the same way that the divergence, or the gradient, or the curl, or even just the derivative are operators: things that take in some kind of function and give you another function.

So in this case, let's say we have a multivariable function f that just takes in a two-dimensional input, f(x, y). You might imagine its graph as being something like this, where the input space is this xy-plane here. So each point (x, y) is a point in that plane, and the output is given by the height of the graph.

So the Laplacian of f is denoted with a right-side-up triangle (a capital delta, Δ), and it's going to give you a new scalar-valued function of x and y. It's going to give you a new function that takes in a two-dimensional input and just outputs a number. And it's kind of like a second derivative, because the way that it's defined is that you take the divergence of the gradient of your function f.

So that's kind of how it's defined: the divergence of the gradient of f. A more common notation that you might see here is to take that upside-down triangle (nabla) dotted with nabla of f, written ∇ · ∇f, or ∇²f for short. So remember, if f is a scalar-valued function, then the gradient of f gives you a certain vector field. And the divergence of a vector field gives you another scalar-valued function.
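To make that definition concrete, here is a minimal SymPy sketch that builds the Laplacian exactly this way: first the gradient, then the divergence of that gradient. The particular function f is my own illustrative choice, not one from the article.

```python
import sympy as sp

x, y = sp.symbols("x y")

# Example scalar-valued function (illustrative choice): a smooth bump
f = sp.exp(-(x**2 + y**2))

# Step 1: the gradient of f is a vector field (df/dx, df/dy)
grad_f = [sp.diff(f, x), sp.diff(f, y)]

# Step 2: the divergence of that vector field is the Laplacian,
# which works out to d2f/dx2 + d2f/dy2
laplacian = sp.simplify(sp.diff(grad_f[0], x) + sp.diff(grad_f[1], y))

print(laplacian)
```

Note that the two steps compose into the familiar sum of unmixed second partial derivatives, which is why the Laplacian behaves like a second derivative.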

So this is the sense in which it's a second derivative. But let's see if we can understand intuitively what this should mean. Because the gradient, if you remember, gives you the direction of steepest ascent: it's a vector field in the xy input space, and each one of the vectors points in the direction that you should walk.

Such that, if this graph is kind of like a hill on top of you, it tells you the direction you should go to increase the value of the function most rapidly. And if that seems unfamiliar or doesn't make sense, maybe go take a look at the video on gradients and graphs and how they relate to each other.

So with the specific graph that I have pictured here, at all of the points around the top of a hill, the direction that you should walk is towards that top. Whereas at the bottom of a little gully, the directions you should walk to increase the value of the function most rapidly all point directly away from that valley, which you might call a local minimum.
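You can check that picture symbolically. In this sketch (the dome and bowl functions are my own illustrative choices), the gradient near a peak aims back at the peak, while the gradient near a valley bottom aims directly away from it.

```python
import sympy as sp

x, y = sp.symbols("x y")

# A dome with its peak at the origin (illustrative choice)
hill = -(x**2 + y**2)
grad_hill = [sp.diff(hill, x), sp.diff(hill, y)]

# At (1, 0) the gradient is (-2, 0): it points back toward the peak
print([g.subs({x: 1, y: 0}) for g in grad_hill])

# A bowl with its bottom at the origin (illustrative choice)
valley = x**2 + y**2
grad_valley = [sp.diff(valley, x), sp.diff(valley, y)]

# At (1, 0) the gradient is (2, 0): it points directly away from the valley
print([g.subs({x: 1, y: 0}) for g in grad_valley])
```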

So let's temporarily get rid of the graph just so we can look at the gradient field pretty clearly. And now let's think about what the divergence is supposed to represent. Again, if this feels unfamiliar, maybe go back and take a look at the divergence videos.

But the Divergence has you imagining that this vector field corresponds to some kind of fluid flow. So you imagine little water molecules, and at any given moment, they're moving along the vector that they're attached to. So, for example, if you had a water molecule that started off kind of here, it would start by going along that vector and then kind of follow the ones near it, and it looks like it kind of ends up in this spot.

A lot of the water molecules seem to converge over there, whereas over here the water molecules tend to move away from this point as they follow those vectors. And when you have a whole bunch of vectors pointed away like that, it's an indication that the divergence is positive, because the flow is diverging away.

So over here, the divergence is positive, whereas in the opposite case, where all of the water molecules are coming in towards a point, the divergence is negative. In another area, say this center point, you have some water molecules that look like they're coming in, but others are going out.

At least from this picture, it doesn't seem like the ones going out are doing so at a faster or slower rate than the ones coming in, so this would be roughly zero divergence. So now let's think about what it might mean when you take the divergence of the gradient field of f.

So let me clear up the markings I made on top of it here. Points of high divergence, points where the field diverges a lot: why are those vectors pointing away? If we pull up the graph again, the reason they're pointing away is that those points sit at the bottom of a valley, and everywhere around a valley, the direction of steepest ascent is directly away from that low point.

Whereas in the opposite circumstance, where the divergence is highly negative because points are converging towards it, why are they pointing towards it? Well, this is a gradient field, so they're pointing towards that spot because, anywhere around it, walking towards that spot takes you uphill.

So in other words, the divergence of the gradient is very high at points that are kind of like minima, points where everything around them tends to be higher. But the divergence of the gradient is low at points that look more like maxima, where evaluating the function at all of the points around that input point gives something smaller.

So this Laplacian operator is kind of a measure of how much of a minimum the point (x, y) is. It will be very positive when f evaluated at that point tends to give a smaller value than f evaluated at that point's neighbors. But it'll be very negative if, when you evaluate f at that point, it tends to be bigger than at its neighbors.
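That "compare a point against its neighbors" description is exactly what the standard five-point finite-difference approximation of the Laplacian computes. A small sketch (the bowl and dome functions are my own examples):

```python
def laplacian_stencil(f, x, y, h=1e-3):
    """Five-point stencil: (sum of f at the four grid neighbours
    minus 4 * f at the point itself) / h^2. Positive when the point
    sits below its neighbours, negative when it sits above them."""
    neighbours = f(x + h, y) + f(x - h, y) + f(x, y + h) + f(x, y - h)
    return (neighbours - 4 * f(x, y)) / h**2

bowl = lambda x, y: x**2 + y**2      # local minimum at the origin
dome = lambda x, y: -(x**2 + y**2)   # local maximum at the origin

print(laplacian_stencil(bowl, 0.0, 0.0))  # about 4: positive at a minimum
print(laplacian_stencil(dome, 0.0, 0.0))  # about -4: negative at a maximum
```

For these quadratic examples the stencil is exact, which makes the sign behavior easy to see at the two critical points.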

And this should feel kind of analogous to the second derivative in ordinary calculus, when you have the graph of a single-variable function. The second derivative f''(x) will be low: it'll be negative at points that look like a local maximum.

But over here, the second derivative f''(x) will be positive at points that look like a local minimum. So in that way, the Laplacian is sort of an analog of the second derivative for scalar-valued multivariable functions. And in the next video, I'll go through an example of that.
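The one-dimensional version of that check is quick to run. Here cos(x) is my own example function, with a local maximum at x = 0 and a local minimum at x = π:

```python
import sympy as sp

x = sp.symbols("x")
f = sp.cos(x)              # local maximum at x = 0, local minimum at x = pi
f2 = sp.diff(f, x, 2)      # second derivative: -cos(x)

print(f2.subs(x, 0))       # negative at the local maximum
print(f2.subs(x, sp.pi))   # positive at the local minimum
```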
