
How Physics Affects Your Emotional State | Brian Greene


8m read · Nov 7, 2024

Let me ask you about the idea of entropy a little bit. It's very difficult for me to understand entropy except in relationship to something like a goal. So let me lay out how this might work psychologically. Karl Friston has been working on this; he's the world's most cited neuroscientist, and I interviewed him relatively recently. He has a notion of positive emotion that's associated with entropy reduction.

Our work has run parallel with regard to the idea of anxiety as a signal of entropy. So imagine that you have a state in mind that's a goal. You just want to cross the street; that's a good simple example. Now imagine that what you're doing is comparing the state that you're in now—you're on one side of the street—to the state that you want to be in, which is for your body to be on the other side of the street.

Then you calculate the transformations that are necessary: the energy expenditure and the actions required to transform the one condition into the other. Then you could imagine there's a path length between them, right? Which would be the number of operations necessary to undertake the transformation. Then you could imagine assigning to each of those transformations something approximating an energy and materials expenditure cost.

Then you could determine whether the advantages of being across the street—maybe it's closer to the grocery store, let's say—outweigh the disadvantages.

Okay, now if you observe yourself successfully taking steps that shorten the path length across the street, that produces positive emotion, and that seems to be technically true. And if something gets in your way, or an obstacle emerges, or something unexpected happens, that increases the path length and costs you more energy and resources, and that produces anxiety.
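To make that picture concrete, here is a minimal toy sketch (hypothetical; the step-counting and the names remaining_path and affect are illustrative assumptions, not Friston's formalism) in which positive emotion tracks a shrinking path to the goal and anxiety tracks an obstacle lengthening it:

```python
# Toy model of the goal / path-length account described above (illustrative only).

GOAL = 10  # the other side of the street, measured in steps

def remaining_path(position, detour=0):
    """Steps still required to reach the goal, plus any detour an obstacle adds."""
    return max(GOAL - position, 0) + detour

def affect(previous, current):
    """Positive emotion when the path to the goal shortens; anxiety when it lengthens."""
    if current < previous:
        return "positive emotion"
    if current > previous:
        return "anxiety"
    return "neutral"

path = remaining_path(0)
# take two steps, then a car blocks the crossing and forces a four-step detour
for position, detour in [(1, 0), (2, 0), (2, 4), (3, 4)]:
    new_path = remaining_path(position, detour)
    print(position, affect(path, new_path))
    path = new_path
```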

Now, the problem with that from an entropy perspective is that it seems to make what constitutes entropy dependent on the psychological nature of the target. I don't exactly know how to define one state as, say, more entropic than another—and maybe that doesn't even make sense—except in relationship to a perceived end point. Otherwise, I guess you associate entropy with a random walk through all the different configurations that a body of material might take at a certain temperature.

It's something like that, and I would say it's analogous to that but a little bit different. What we do is look at the space of all possible configurations of a system, whether it's a psychological system or air molecules in a box; how we humans interpret that system doesn't really matter to us.

We simply look at the particles that make up the system, and we divide the space of all possible configurations into regions that, from a macroscopic perspective, are largely indistinguishable. Right? The air in this room: it doesn't matter to me whether that oxygen molecule is in this corner or that corner; the two configurations would be indistinguishable, functionally equivalent.

But if all the air was in a little ball right over here and none was left for me to breathe, then I would certainly know the difference between that configuration of the gas and the one that I'm actually inhabiting at the moment. So they would belong to different regions of this configuration space, which I divide up into blobs that macroscopically are indistinguishable.

We simply define the entropy, in some sense, to be the volume of that region. So high entropy means there are a lot of states that more or less look the same, like the gas in this room right now. But if the gas was in a little ball, it would have lower entropy because there are far fewer rearrangements of those constituents that look the same as the ball of gas.
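That counting is Boltzmann's definition: the entropy of a macrostate is S = k_B ln Ω, where Ω is the number of microstates in the region. A minimal sketch, assuming a toy gas in which a "configuration" is just which half of the room each of N molecules occupies:

```python
from math import comb, log

N = 100  # toy gas: each molecule sits in either the left or the right half of the room

def entropy(n_left):
    """Boltzmann entropy (in units of k_B) of the macrostate
    'n_left molecules in the left half': S = ln(number of microstates)."""
    return log(comb(N, n_left))

print(entropy(50))   # evenly spread: ~66.8, a huge number of indistinguishable microstates
print(entropy(100))  # all the air in one half: ln(1) = 0, the lowest possible entropy
```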

So it's a very straightforward mathematical exercise to compute the entropy of a configuration by figuring out which of the regions it belongs to, but none of that involves the psychological states that you make reference to.

Now, there may be interesting analogies, interesting poetic resonances, interesting rhyming between the things that one is interested in from a psychological perspective and from a physics perspective. But the beauty or the downfall, depending on how you look at it, of the way we define things in physics is that we strip away the psychological. We strip away the observer-dependent qualities; we strip away the interpretive aspects in order to just have a numerical value of entropy that we can associate with a given configuration.

Right? Well, what you're trying to do when you control a situation psychologically is to specify the—I suppose it's something like specifying the entropy, right? Because you're trying to calculate the number of states that the situation you're in now could conceivably occupy if you undertook an appropriate—what would you say? An appropriate course of action.

And as long as, while you're carrying out that course of action, the system maintains its desired behavior, then it's not anxiety-provoking, for example, and you can presume that your course of action is functional. And I'd say, if that proves to be a valuable definition for acquiring insight into human behavior—the psychological reasons for crossing that street, as you were describing before—then that may be valuable within that environment.

The reason we find entropy valuable as physicists is that we like to be able to figure out the general way in which systems evolve over time. And when the systems are very complicated—again, be it the gas in this room or the molecules inside our heads—it's simply too complicated for us to actually do the molecule-by-molecule calculation of how the particles are going to move from today until tomorrow.

Instead, we learned a long time ago, from the work of people like Boltzmann and Gibbs, that if you take a step back and view the system as a statistical ensemble, as an average, it's much easier to figure out on average how the system will evolve over time.

Systems tend to go from low entropy to high entropy, from order toward disorder, and we can make that quite precise in the mathematical articulation. And that allows us to understand overall how systems will change through time without having to get into the detailed microscopic calculations.
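A standard toy illustration of that statistical drift (a sketch under assumed dynamics; the two-half box and the random hop rule are my assumptions, in the spirit of the Ehrenfest urn model) is a gas whose molecules hop at random between the halves of a room. However ordered the starting configuration, the entropy climbs toward its maximum on average:

```python
import random
from math import comb, log

N = 100
n_left = N  # start in a very low-entropy state: every molecule in the left half

for step in range(501):
    if step % 100 == 0:
        # entropy (in units of k_B) of the current macrostate
        print(step, n_left, round(log(comb(N, n_left)), 1))
    # pick a molecule uniformly at random and move it to the other half
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
```

Individual runs fluctuate, but the count drifts toward an even 50/50 split and the entropy toward its maximum, which is exactly the averaged, ensemble-level statement of the tendency described above.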

Okay, so there are some implications of that. As far as I can tell, one is that time itself is a macroscopic phenomenon. And then the other—see, there are times when it seems to me—and correct me if I'm wrong—that you're moving something like a psychological frame of reference into the physical conceptualizations.

Because, for example, you described a situation where, in a room full of air, one of the potential configurations is that all the air molecules are clustered in one corner, or at least it's denser there. Now, it's going to be the case that the vast majority of possible configurations of air molecules in a room are going to be characterized by something approximating random dispersion.

And so that fraction of potential configurations where there are, what would you say, differences in average density is going to be rare. But you did use the term "ordered," and I guess I'm wondering if there is a physical definition of order.

Because the configuration where there are density differences has a certain probability—it's very low, but it has a certain probability. There isn't anything that necessarily marks it out as distinct from the rest of the configurations except its comparative rarity. But you can't define any given configuration as differentially rare, because every single configuration is equally rare.

So how does the concept of order—how do you clarify the concept of order from the perspective of pure physics? Yes, and so you're absolutely right. When you begin to delineate configurations that you describe as ordered or disordered, low entropy or high entropy, it is by virtue of seeing the group to which they belong as opposed to analyzing them as individuals on their own terms.

And when we invoke words like order and disorder, those are obviously human, psychologically developed terms. So where does it come from? It comes from the following basic fact: if you have a situation that we humans would typically call ordered—for instance, books on a shelf that are all alphabetical—there are very few ways that the books can meet that criterion.

In fact, if you're talking about making them alphabetical, there's only one configuration that will meet that very stringent definition of order. You could have other definitions of order, like all the blue covers here and all the red covers there. Then there are a few more possibilities—you can mix up the blues; you can mix up the reds—but you can't mix them together. And the definition of disordered, again, is when you can have any of those configurations at all.
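The counting behind the bookshelf example is easy to make explicit. A small sketch, assuming ten distinct books, a four/six blue/red split, and the two color blocks in fixed positions:

```python
from math import factorial

n = 10            # distinct books on the shelf
blue, red = 4, 6  # assumed color split for the 'grouped by color' criterion

total        = factorial(n)                      # every possible arrangement: 3,628,800
alphabetical = 1                                 # exactly one arrangement is alphabetical
by_color     = factorial(blue) * factorial(red)  # shuffle within each color block: 17,280

print(total, alphabetical, by_color)
```

The stricter the notion of order, the fewer the arrangements that satisfy it, which is the low-entropy/high-entropy distinction in miniature.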

So clearly, an ordered configuration is one that's harder to achieve. It's more special; it differs from the random configuration that would arise on its own if you weren't imposing any restrictions. And so that's why we use those words.

But you're absolutely right; those words are of human origin, and they do require—yeah, it's partly improbability and rarity. And then the emotional component seems to come in, in that it's not only rare and unlikely but also has some degree of functional significance.

I mean, the reason you alphabetize your books is so that you can find them. So it's a rare configuration that has functional utility, and that's not a bad definition of order. But the problem with that, from a purely physical perspective, is that it's a definition involving some subjective element of analysis.

So that's fine; it does. And I should say this has bothered physicists for a very long time, because entropy is unlike most other laws in physics. With Einstein's equations of general relativity, or Newton's equations for the motion of objects, you can write down the symbols; everybody knows exactly what they mean; and you can simply apply them, starting with a given configuration and figuring out definitively what it will look like later.

Entropy, thermodynamics, and statistical mechanics—the area of physics we're talking about here—are of a different character. Take, for instance, the second law of thermodynamics, which speaks about the increase of entropy, going from order to disorder: your books are nice and alphabetized, but you pull them out, you start to put them back, and you're going to lose the alphabetical order unless you're very careful about how you put the books back in.

It's more likely that you end up in the disordered state where they're no longer alphabetized. But that's not a law; it's a statistical tendency. It is absolutely possible for systems to violate the second law of thermodynamics; it's just highly improbable.

If I take a handful of sand and drop it on the beach, most of the time it's just going to splatter and scatter those sand particles all over the surface. But on occasion, is it possible that I drop that handful of sand and it lands in a beautiful sand castle? Statistically unlikely, probabilistically unlikely, but could it happen? Yes!

And if it did, that would be going from a disorder to an ordered state, violating the second law of thermodynamics. So that's why this law is of a different character than what we are used to in physics.
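To put a rough number on "highly improbable" (a back-of-envelope sketch; the toy gas and the independence assumption are mine, not from the conversation): if each of N molecules independently ends up in either half of a room, the chance of finding all of them in one particular half at the same instant is (1/2)^N:

```python
# Odds of one spontaneous 'violation': all N molecules found in one half of the room.
for N in (10, 100, 1000):
    print(N, 0.5 ** N)
# 10 -> ~1e-3, 100 -> ~8e-31, 1000 -> ~9e-302.
# A real room holds vastly more molecules, so the probability is effectively zero:
# possible in principle, never expected in practice.
```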
