
Perception: Chaos and Order | Dr. Karl Friston | EP 298


3m read
·Nov 7, 2024


Okay, when you make progress towards a valued goal, let's say we inhabit a shared narrative and we're making progress towards our mutual stated goal. When we see ourselves making progress, we get a bit of a dopamine hit. Could you say that the fundamental reason for the positively rewarding effect of that movement forward is that as I move forward towards a goal, I decrease the entropy that still remains between me and the goal? Is even that reward, is even that movement forward readable as an entropy reduction? I mean, it's almost written into the mathematical meaning of the word.

So, if entropy just is uncertainty, and as I get close to resolving that uncertainty, getting my fruit juice, pleasing my wife, or, you know, being able to watch the news, if it's an epistemic reward, it just is expected surprise. Expected surprise just is uncertainty, and the closer you get, the less uncertain you are, and all the evidence suggests, exactly as you say, it's dopamine.
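The point being made here, that getting closer to a goal just is a reduction in uncertainty, can be illustrated with a small numerical sketch. Shannon entropy of a belief distribution falls as the distribution sharpens toward one outcome; the distributions below are illustrative numbers, not anything from the conversation.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Beliefs about which of four possible outcomes will occur,
# sharpening as the goal gets closer (illustrative numbers).
far    = [0.25, 0.25, 0.25, 0.25]   # maximal uncertainty: 2.0 bits
closer = [0.70, 0.10, 0.10, 0.10]
almost = [0.97, 0.01, 0.01, 0.01]

for label, p in [("far", far), ("closer", closer), ("almost", almost)]:
    print(label, round(entropy(p), 3))
```

Each step toward the goal corresponds to a drop in entropy, which is the sense in which progress is readable as entropy reduction.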

[Music]

Hello everyone, thank you for tuning in to watch and listen. I have the great privilege today of being able to talk with Dr. Karl Friston. This conversation is, let's say, a signal addition to the recent one I had with Andrew Huberman. Dr. Friston is arguably the world's most renowned neuroscientist, a professor at University College London, and one of the world's leading authorities on brain imaging. Ninety percent of the work published in fields employing such imaging relies on methods he pioneered.

Dr. Friston is also well known for his work on many of the topics we will discuss today, work I find even more exciting, at least conceptually speaking, than his work on brain imaging. We will discuss the ideas that concepts and percepts, categories, that's another way of thinking about it, bind free energy or entropy; the idea of computation, especially the kind of computation that approximates brain function, as hierarchical; the theory of predictive coding; and active inference.

Welcome, Dr. Friston. It's very good of you to agree to talk to me on this podcast. I'm really looking forward to it.

That's a great pleasure to be here. Thank you.

So let me start maybe by helping people understand this idea of hierarchical computation and the binding of entropy, and so if you could walk through that briefly, then I'll ask some questions if that seems appropriate?

Yeah, sure. The binding of free energy and entropy—that sounds delightfully Freudian, and I don't mean that in a sort of disparaging sense. I think that some of the truisms and the insights of that era have now proved themselves in modern formulations of computation, information processing, and sense making in the brain.

One nice link there is to think of free energy as surprise. So, one way of looking at the way that we make sense of our world—bringing explanations, concepts, categories, notions—to the table that provide the best explanation for the myriad of sensations to which we are exposed is to see that process as a process of minimizing surprise.

So binding free energy, I think, can be read very simply as minimizing surprise. But, of course, to be surprised you have to have something you predicted; you have to have a violation of predictions. So immediately you're in the game now of predictive processing—predicting what would I see if the world out there was like this—and then using the ensuing prediction errors to adjust your beliefs and update your beliefs in the service of minimizing those prediction errors or minimizing that surprise or minimizing that free energy.
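The loop described here, predict what you would see, compute the prediction error, and adjust beliefs to reduce it, can be sketched in a few lines. This is a deliberately minimal caricature, assuming a single latent belief, an identity generative mapping, and a fixed learning rate; none of these specifics come from the conversation.

```python
def update_beliefs(y, mu=0.0, lr=0.1, steps=50):
    """Adjust the belief mu to minimize squared prediction error,
    a special case of free energy under a simple Gaussian model.

    y  : the observed sensation
    mu : the current belief about the hidden state of the world
    """
    for _ in range(steps):
        prediction = mu          # what would I see if the world were mu?
        error = y - prediction   # prediction error (the "surprise" signal)
        mu += lr * error         # update the belief to reduce the error
    return mu

print(update_beliefs(y=3.0))  # belief converges toward the observation
```

In a full predictive-coding scheme the belief, the mapping from beliefs to predicted sensations, and the precision of the errors would all be richer, but the core gradient-descent-on-prediction-error structure is the same.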

And you artfully introduce the notion of hierarchy, you know, in that question, which I think speaks to another fundamental point that in making sense of the world, in making those good predictions, we have to have an internal model—sometimes called a world model—a model that can generate what I would have seen if this was the state of affairs out there.

And that notion of a generative model I think is quite key and holds the attribute of hierarchy simply in the sense that we live in a deeply structured world. Very dy...
