
Perception: Chaos and Order | Dr. Karl Friston | EP 298


3m read · Nov 7, 2024


Okay, when you make progress towards a valued goal, let's say we inhabit a shared narrative and we're making progress towards our mutual stated goal. When we see ourselves making progress, we get a bit of a dopamine hit. Could you say that the fundamental reason for the positively rewarding effect of that movement forward is that as I move forward towards a goal, I decrease the entropy that still remains between me and the goal? Is even that reward, is even that movement forward readable as an entropy reduction? I mean, it's almost written into the mathematical meaning of the word.

So, if entropy just is uncertainty, and as I get closer to resolving that uncertainty (getting my fruit juice, pleasing my wife, or, you know, being able to watch the news), if it's an epistemic reward, it just is expected surprise. Uncertainty just is expected surprise, and the closer you get, the less uncertain you are, and all the evidence suggests, exactly as you say, it's dopamine.
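The identity invoked here, that entropy just is uncertainty and that it shrinks as the goal is resolved, can be made concrete with Shannon entropy over a belief distribution. The distributions below are illustrative assumptions, not data from the conversation:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Beliefs about which of four possible outcomes will occur, sharpening
# as progress is made toward the goal.
far_from_goal  = [0.25, 0.25, 0.25, 0.25]  # maximal uncertainty
getting_closer = [0.70, 0.10, 0.10, 0.10]  # evidence accumulating
at_the_goal    = [1.00, 0.00, 0.00, 0.00]  # outcome certain

print(entropy(far_from_goal))   # 2.0 bits
print(entropy(getting_closer))  # about 1.36 bits
print(entropy(at_the_goal))     # 0.0 bits
```

Each step toward the goal sharpens the belief distribution, and the entropy that "still remains between me and the goal" falls monotonically, which is one way to read the reward of forward movement.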


Hello everyone, thank you for tuning in to watch and listen. I have the great privilege today of being able to talk with Dr. Karl Friston. This is, let's say, a signal addition to the recent conversation I had with Andrew Huberman. Dr. Karl Friston is arguably the world's most renowned neuroscientist, a professor at University College London, and one of the world's leading authorities on brain imaging. Ninety percent of the work published in fields employing such imaging relies on methods he pioneered.

Dr. Friston is also well known for his work on many of the topics we will discuss today, work I find even more exciting, at least conceptually speaking, than his work on brain imaging. We will discuss the idea that concepts and percepts (categories, that's another way of thinking about it) bind free energy or entropy; the idea of computation, especially the kind of computation that approximates brain function, as hierarchical; the theory of predictive coding; and active inference.

Welcome, Dr. Friston. It's very good of you to agree to talk to me on this podcast. I'm really looking forward to it.

That's a great pleasure to be here. Thank you.

So let me start maybe by helping people understand this idea of hierarchical computation and the binding of entropy, and so if you could walk through that briefly, then I'll ask some questions if that seems appropriate?

Yeah, sure. The binding of free energy and entropy: that sounds delightfully Freudian, and I don't mean that in a disparaging sense. I think that some of the truisms and the insights of that era have now proved themselves in modern formulations of computation, information processing, and sense-making in the brain.

One nice link there is to think of free energy as surprise. So, one way of looking at the way that we make sense of our world—bringing explanations, concepts, categories, notions—to the table that provide the best explanation for the myriad of sensations to which we are exposed is to see that process as a process of minimizing surprise.
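The link between free energy and surprise that Friston sketches here is usually stated as a variational identity. This is the standard textbook decomposition, not a quote from the conversation: for observations \(o\), hidden states \(s\), a generative model \(p(o, s)\), and an approximate posterior \(q(s)\),

```latex
F = \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(o, s)\right]
  = \underbrace{-\ln p(o)}_{\text{surprise}}
  + \underbrace{D_{\mathrm{KL}}\!\left[q(s)\,\|\,p(s \mid o)\right]}_{\geq\, 0}
```

Because the KL divergence is non-negative, free energy \(F\) is an upper bound on surprise \(-\ln p(o)\), so minimizing free energy minimizes surprise, which is the sense in which "binding free energy" can be read as minimizing surprise.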

So binding free energy, I think, can be read very simply as minimizing surprise. But, of course, to be surprised you have to have something you predicted; you have to have a violation of predictions. So immediately you're in the game now of predictive processing—predicting what would I see if the world out there was like this—and then using the ensuing prediction errors to adjust your beliefs and update your beliefs in the service of minimizing those prediction errors or minimizing that surprise or minimizing that free energy.
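The loop described here (predict what you would see, compute the prediction error, update beliefs to reduce it) can be sketched in a few lines. This is a toy illustration under assumed names and a made-up linear generative model, not Friston's actual formulation:

```python
# Toy predictive-processing loop: a single latent belief `mu` generates a
# predicted sensation; prediction errors drive belief updates that descend
# on squared error (a stand-in for surprise / free energy).

def predict(mu):
    """Generative model: the sensation the belief mu would produce.
    The factor 2.0 is an arbitrary illustrative assumption."""
    return 2.0 * mu

def update_belief(mu, observation, lr=0.05, steps=100):
    """Repeatedly adjust mu to shrink the prediction error."""
    for _ in range(steps):
        error = observation - predict(mu)  # prediction error
        mu += lr * error * 2.0             # step proportional to the gradient
                                           # of error**2 (2.0 = d predict/d mu)
    return mu

mu = update_belief(mu=0.0, observation=6.0)
print(round(mu, 3))  # converges toward 3.0, where prediction matches sensation
```

At the fixed point the prediction `2.0 * mu` equals the observation, the error vanishes, and in the language of the transcript the surprise has been minimized; adding more latent levels, each predicting the one below, gives the hierarchical version discussed next.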

And you artfully introduce the notion of hierarchy, you know, in that question, which I think speaks to another fundamental point that in making sense of the world, in making those good predictions, we have to have an internal model—sometimes called a world model—a model that can generate what I would have seen if this was the state of affairs out there.

And that notion of a generative model I think is quite key and holds the attribute of hierarchy simply in the sense that we live in a deeply structured world. Very dy...
