Eric Weinstein: How 'mental sandboxes' can help humanity reach new heights | Big Think
We have to embrace the inconsistency of our own minds, not as a bug but as a feature. We are, in essence, brought here by the forces of selection. We are the products of systems of selective pressures, and what those pressures seem to do is create the ability to run many, many different programs, often contradictory ones, within the same mind.
And the question is: why have we put such an extraordinary emphasis on intellectual consistency that we are constantly alert to the hypocrisy of others yet seemingly blind to it in ourselves? Our minds are constructed with an architecture that allows us to run various sandboxes in which we can experiment with the ideas of others without actually becoming the other.
Can we run another mind in emulation? Perhaps not as well as its original owner, but can we run that mind well enough to understand it, to empathize with it, and to argue and spar with it, achieving some kind of better outcome in which we turn foes into dancing partners by showing that we have actually understood perspectives different from our own?
The biggest objection to this way of thinking is that it is somehow a kind of cheat, that hypocrisy is being summoned by another name. But I think this is incorrect. I think we have these sandboxes, for example, so that we can more effectively fight a foe we feel we must defeat.
So, for example, recently I talked about the importance of being able to run a jihadi sandbox in our minds if we want to understand the forces that are behind Islamic terror and its effect on what I think are relatively fragile Western sensibilities about life and death. And so if we choose not to empathize with the other, to say that so much is beyond the pale, we are probably not going to be very effective in understanding that the other does not see itself as evil. It does not see itself as an enemy that must be fought.
I don't necessarily need to agree with it, but refusing even to run the program, simply for the purpose of social signaling, seems the height of folly. How do we hope to become effective if we can't guess what the other will do next? There are limits to this. We have to have a certain kind of consistency of mind.
But consider the idea that you can be capable of running a diehard rationalist, materialist, atheist program as well as a program that says, perhaps I will open myself to transcendental states and, if I need to, anthropomorphize those as coming from a deity. Perhaps that architecture is not what Richard Dawkins would suggest is a kind of mind virus but, in fact, a facility that we deny ourselves at our peril. What if we're trapped on a local maximum of fitness and, in fact, we need to get to higher ground? What if the traversal of the so-called adaptive valley, where we have to make things much, much worse before they get much better, cannot generally be attempted rationally, and we need a modicum of faith, a belief that we cannot ground in any sort of information set?
We could end up trapped on local maxima forever. But I think it's really important to consider that some people may be able to traverse the adaptive valley without a belief in a deity. Some may need a temporary belief in a deity. Some may be able to reference some sort of transcendental state and steel themselves in order to make the journey.
But however it's accomplished, there are times when it would appear that all hope is lost, and if we are not to end our days stuck on the local maxima of whatever we have achieved, we have to fundamentally experiment with ways of thinking, if only temporarily, to get to higher ground.
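The local-maximum and adaptive-valley language here maps onto a standard idea in optimization: a greedy search that only ever accepts improvements stays pinned to whatever peak it starts on, while a method such as simulated annealing, which temporarily accepts worse states, can cross the valley to higher ground. The minimal Python sketch below illustrates that contrast on an invented two-peak fitness landscape; the landscape, step sizes, and cooling schedule are all illustrative assumptions, not anything from the talk.

```python
import math
import random

# Toy fitness landscape with two peaks: a local maximum near x = -1
# and a higher global maximum near x = 2, separated by a deep valley.
def fitness(x: float) -> float:
    return math.exp(-(x + 1) ** 2) + 2.0 * math.exp(-((x - 2) ** 2) / 0.5)

def hill_climb(x: float, step: float = 0.1, iters: int = 1000) -> float:
    """Greedy search: only accepts moves that improve fitness,
    so it cannot cross the valley between the peaks."""
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

def anneal(x: float, step: float = 0.5, iters: int = 5000,
           t0: float = 1.0, cooling: float = 0.999) -> float:
    """Simulated annealing: sometimes accepts worse states (making
    things worse before they get better), with decreasing willingness
    as the temperature cools."""
    t = t0
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        delta = fitness(candidate) - fitness(x)
        if delta > 0 or random.random() < math.exp(delta / t):
            x = candidate
        t *= cooling
    return x

random.seed(0)
start = -1.0  # begin on the lower, local peak
print(f"hill climbing ends near x = {hill_climb(start):.2f}")  # typically stays near -1
print(f"annealing ends near    x = {anneal(start):.2f}")       # typically crosses to ~2
```

The design point mirrors the argument in the talk: the annealer's early acceptance of apparently bad moves is not justifiable by the current fitness reading alone, yet it is exactly what lets the search escape the local peak.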