ENSPIRING.ai: How Physics Affects Your Emotional State - Brian Greene

The discussion delves into the understanding of entropy through psychological and physical perspectives. The concept of entropy, particularly related to anxiety and goal setting, is examined in depth, drawing on Carl Friston's work on entropy reduction and positive emotions. The intricate relationship between a desired psychological state and entropy is explored.

From a physics standpoint, the explanation revolves around how systems evolve from order to disorder, or from low to high entropy. The discussion highlights the analytical approach physicists take in stripping away subjective interpretations and reducing entropy to a count of measurable configurations. Statistical mechanics, drawing on the work of notable figures like Boltzmann and Gibbs, helps interpret how systems behave over time.

Main takeaways from the video:

💡
Psychological states can be understood through entropy, aiding in understanding human behavior and developing control strategies.
💡
Entropy in physics is a statistical measure, not influenced by psychological viewpoints, focusing instead on the average behavior of complex systems.
💡
The second law of thermodynamics suggests systems tend towards disorder, with exceptions being statistically improbable but not impossible.

Key Vocabularies and Common Phrases:

1. entropy [ˈɛntrəpi] - (noun) - A measure of disorder or randomness in a system. In psychology, it can relate to a signal of anxiety or disorganization. - Synonyms: (disorder, chaos, randomness)

Let me ask you about the idea of entropy a little bit.

2. transformation [ˌtrænsfɔrˈmeɪʃən] - (noun) - The process of change in form, appearance, or structure. - Synonyms: (alteration, change, conversion)

And then you calculate the transformations that are necessary.

3. transpose [trænˈspoʊz] - (verb) - To transfer or switch positions or places. - Synonyms: (shift, switch, replace)

transpose the one condition into the state of the other condition.

4. configuration [kənˌfɪgjʊˈreɪʃən] - (noun) - An arrangement of elements in a particular form, figure, or combination. - Synonyms: (arrangement, layout, setup)

It's a very straightforward mathematical exercise to enumerate the entropy of a configuration.

5. algorithm [ˈælgəˌrɪðəm] - (noun) - A process or set of rules to be followed in calculations or problem-solving. - Synonyms: (procedure, formula, calculation)

You calculate the transformations that are necessary, the energy expenditure, and the actions.

6. macroscopic [ˌmækroʊˈskɒpɪk] - (adjective) - Visible to the naked eye; large-scale phenomena that encompass groups rather than individual parts. - Synonyms: (large-scale, visible, observable)

We divide up the space of all possible configurations into regions that, from a macroscopic perspective, are largely indistinguishable.

7. indistinguishable [ˌɪndɪˈstɪŋgwɪʃəbəl] - (adjective) - Unable to be identified as different or distinct. - Synonyms: (identical, alike, uniform)

Into regions that from a macroscopic perspective, are largely indistinguishable.

8. statistical mechanics [stəˈtɪstɪkəl məˈkænɪks] - (noun) - A branch of theoretical physics that uses probability theory to study the behavior of systems composed of many particles. - Synonyms: (theoretical physics, statistical physics)

entropy and thermodynamics and statistical mechanics, which is the area of physics that we're talking about here.

9. delineate [dɪˈlɪnieɪt] - (verb) - To describe or portray something precisely. - Synonyms: (describe, outline, define)

When you begin to delineate configurations that you describe as ordered or disordered.

10. articulation [ɑːrˌtɪkjʊˈleɪʃən] - (noun) - The expression of thoughts or ideas in a coherent verbal form. - Synonyms: (expression, formulation, communication)

We can make that quite precise in the mathematical articulation.

How Physics Affects Your Emotional State - Brian Greene

Let me ask you about the idea of entropy a little bit. So it's very difficult for me to understand entropy except in relationship to something like a goal. So let me lay out how this might work psychologically. Carl Friston has been working on this. He's the world's most cited neuroscientist, and I interviewed him relatively recently. And he has a notion of positive emotion that's associated with entropy reduction. And our work has run parallel with regards to the idea of anxiety as a signal of entropy.

So imagine that you have a state of mind in mind. That's a goal. You just want to cross the street. That's a good, simple example. Now imagine that what you're doing is comparing the state that you're in now, you're on one side of the street, to the state that you want to be in, which is for your body to be on the other side of the street. And then you calculate the transformations that are necessary, the energy expenditure and the actions that are necessary to transpose the one condition into the state of the other condition.

Then you could imagine there's a path length between those two states, right? Which would be the number of operations necessary to undertake the transformation. Then you could imagine that you could assign to each of those transformations something approximating an energy and materials expenditure cost. And then you could determine whether the advantages of being across the street, maybe it's closer to the grocery store, let's say, outweigh the disadvantages. Okay?
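A minimal sketch in Python of the comparison being described, purely for illustration; the plan steps, costs, and benefit values below are invented placeholders, not anything from Friston's actual formalism:

```python
# Hypothetical illustration: weigh the cost of reaching a goal state
# against the benefit of being in it. All numbers are made up.

# Each operation in the plan carries an estimated energy/materials cost.
plan = [
    ("walk to the curb", 1.0),
    ("wait for the light", 0.5),
    ("cross the street", 2.0),
]

path_length = len(plan)                      # number of operations in the transformation
total_cost = sum(cost for _, cost in plan)   # summed energy and materials expenditure
benefit_of_goal = 5.0                        # e.g. being closer to the grocery store

# Undertake the plan only if the advantage outweighs the expenditure.
print(path_length, total_cost, benefit_of_goal > total_cost)
```

An unexpected obstacle would correspond to extra operations appended to the plan, lengthening the path and raising the total cost, which is the anxiety signal described next.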

Now, if you observe yourself successfully taking steps that shorten the path length across the street, that produces positive emotion, and that seems to be technically true. And then if something gets in your way or an obstacle emerges or something unexpected happens, then that increases the path length and costs you more energy and resources, and that produces anxiety.

Now, the problem with that from an entropy perspective is it seems to make what constitutes entropy dependent on the psychological nature of the target. Like, I don't exactly know how to define one state as, say, more entropic than another, and maybe it doesn't make sense, except in relationship to something like a perceived endpoint. I mean, otherwise, I guess you associate entropy with a random walk through all the different configurations that a body of material might take at a certain temperature.

It's something like that, and I'd say analogous to that, but a little bit different. So what we do is we look at the space of all possible configurations of a system, whether it's a psychological system or whether it's air molecules in a box; it doesn't really matter to us how we humans interpret that system. We simply look at the particles that make up the system, and we divide up the space of all possible configurations into regions that, from a macroscopic perspective, are largely indistinguishable. Right?

The air in this room, it doesn't matter to me whether that oxygen molecule is in that corner or that corner; the configurations would be indistinguishable, because they're functionally equivalent. But if all the air was in a little ball right over here and none was left for me to breathe, then I would certainly know the difference between that configuration of the gas and the one that I'm actually inhabiting at the moment.

So they would belong to different regions of this configuration space, which I divide up into blobs that macroscopically are indistinguishable. And we simply define the entropy, in some sense, to be the volume of that region. So high entropy means there are a lot of states that more or less look the same, like the gas in this room right now. But if the gas was in a little ball, it would have lower entropy, because there are far fewer rearrangements of those constituents that look the same as the ball of gas.

So it's a very straightforward mathematical exercise to enumerate the entropy of a configuration by figuring out which of the regions it belongs to. But none of that involves the psychological states that you make reference to. So there may be interesting analogies, interesting poetic resonances, interesting rhyming between the things that one is interested in from a psychological perspective and from a physics perspective. But the beauty, or the downfall, depending on how you look at it, of the way we define things in physics is that we strip away the psychological, we strip away the observer-dependent qualities, we strip away the interpretive aspects, in order to just have a numerical value of entropy that we can associate with a given configuration. Right?
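As a rough, hedged illustration of that counting, here is a toy model in Python (the model and numbers are mine, not from the conversation): N particles each sit in the left or right half of a box, a macrostate is "how many are on the left", and the entropy of a configuration is taken, in Boltzmann's spirit, as the log of the number of microscopic arrangements that share its macrostate.

```python
import math

N = 100  # toy gas: each particle occupies either the left or right half of a box

def entropy(n_left):
    """Log of the number of microstates that are macroscopically indistinguishable,
    i.e. that have the same number of particles in the left half."""
    return math.log(math.comb(N, n_left))

# A spread-out configuration (half the particles on each side) belongs to a huge region...
print(entropy(N // 2))   # about 66.8

# ...while "all the gas bunched up on one side" belongs to a region containing a single arrangement.
print(entropy(N))        # 0.0
```

Nothing psychological enters the calculation; the number attached to a configuration depends only on the size of the region it falls in.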

Well, what you're trying to do when you control a situation psychologically is to specify, I suppose, something like the entropy, right? Because you're trying to calculate the number of states that the situation you're in now could conceivably occupy if you undertook an appropriate, what would you say, an appropriate course of action. And as long as, while you're pursuing that course of action, the system maintains its desired behavior, then it's not anxiety provoking, for example. And you can presume that your course of action is functional.

And I'm saying, if that proves to be a valuable definition for acquiring insight into human behavior, or the psychological reasons for crossing that street, as you were describing before, then it may well be valuable within that domain. The reason why we find entropy valuable as physicists is we like to be able to figure out the general way in which systems evolve over time.

And when the systems are very complicated, again, be it the gas in this room or the molecules inside of our heads, it's simply too complicated for us to actually do the molecule-by-molecule calculation of how the particles are going to move from today until tomorrow. Instead, we have learned from the work of people like Boltzmann and Gibbs, a long time ago, that if you take a step back and view the system as a statistical ensemble, as an average, it's much easier to figure out, on average, how the system will evolve over time.

Systems tend to go from low entropy to high entropy, from order toward disorder. And we can make that quite precise in the mathematical articulation. And that allows us to understand overall how systems will change through time without having to get into the detailed microscopic calculations. Okay, so there are some implications from that, as far as I can tell. One is that time itself is a macroscopic phenomenon.

And then the other is, well, there are times when it seems to me, and correct me if I'm wrong, that you're moving something like a psychological frame of reference into the physical conceptualizations. Because, for example, you described a situation where, if there was a room full of air, one of the potential configurations is that all the air molecules are clustered in one corner, or at least are denser there.

Now, it's going to be the case that, on average, the vast majority of possible configurations of air molecules in a room are going to be characterized by something approximating random dispersal. And so that fraction of potential configurations where there are, what would you say, differences in average density is going to be rare.

But you did use the term ordered. And I guess I'm wondering if there is a physical definition for order, because the configuration where there are density differences has a certain probability. It's very low, but it has a certain probability. There isn't anything necessarily that marks it out as distinct from the rest of the configurations, except its comparative rarity.

But you can't define any given configuration as differentially rare, because every single configuration is equally rare. So how do you clarify the concept of order from the perspective of pure physics? Yes. And so you're absolutely right. When you begin to delineate configurations that you describe as ordered or disordered, low entropy or high entropy, it is by virtue of seeing the group to which they belong, as opposed to analyzing them as individuals on their own terms.

And when we invoke words like order and disorder, obviously those are terms developed from human psychology. And where does that come from? It comes from the following basic fact: if you have a situation that typically we humans would call ordered, for instance, if you have books on a shelf that are all alphabetical, there are very few ways that the books can meet that criterion. In fact, if you're talking about making them alphabetical, there's only one configuration that will meet that very stringent definition of order.

You could have other definitions of order, like all the blue covers are here and all the red covers are here. Then there are a few possibilities. You can mix up the blues, you can mix up the reds, but you can't mix them together. So again, you have a definition of ordered.

Disordered is when you can have any of those configurations at all. So clearly an ordered configuration is one that's harder to achieve. It's more special. It differs from the random configuration that would arise on its own if you weren't imposing any other restrictions. And so that's why we use those words.
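A quick enumeration of that bookshelf example in Python, with shelf sizes invented for illustration: exactly one ordering of distinct books is alphabetical, a "grouped by colour" criterion admits a modest number, and "disordered" admits every ordering.

```python
import math

n_blue, n_red = 4, 3
n_books = n_blue + n_red

total = math.factorial(n_books)   # every possible shelf ordering: 5040
alphabetical = 1                  # only one ordering meets the strictest criterion

# Blues together on the left and reds together on the right: 4! * 3! = 144 orderings.
grouped_by_colour = math.factorial(n_blue) * math.factorial(n_red)

disordered = total                # "any configuration at all": no restriction

print(total, alphabetical, grouped_by_colour, disordered)
```

Taking the log of each count recovers the entropy comparison above: the alphabetical macrostate is the lowest-entropy one because it is realized by the fewest arrangements.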

But you're absolutely right, those words are of human origin, and what they require is partly improbability and rarity. And then the emotional component seems to come in, in that it's not only rare and unlikely, but it also has some degree of functional significance. I mean, the reason that you alphabetize your books is so that you can find them. And so it's a rare configuration that has functional utility. And that's not a bad definition of order.

But the problem with that, from a purely physical perspective, is that it's a definition that involves some subjective element of analysis. So that's fine, it does, and I should say this has bothered physicists for a very long time. When you invoke the notion of entropy, it's unlike most other laws in physics. With Einstein's equations of general relativity or Newton's equations for the motion of objects, you can write down the symbols, everybody knows exactly what they mean, and you can simply apply them, start with a given configuration, and figure out definitively what it will look like later.

Entropy, and thermodynamics and statistical mechanics, which is the area of physics that we're talking about here, is of a different character because, for instance, the second law of thermodynamics speaks about the increase of entropy, going from order to disorder. You know, your books are nice and alphabetized, but you pull them out, you start to put them back, and you're going to lose the alphabetical order. Unless you're very careful about putting the books back in, it's more likely that you get to this disordered state where they're no longer alphabetized in the future.

But that's not a law. That's a statistical tendency. It is absolutely possible for systems to violate the second law of thermodynamics. It's just highly improbable. If I take a handful of sand and I drop it on the beach, most of the time it's just going to splatter and move those sand particles all over the surface.

But on occasion, is it possible that I drop that handful of sand and it lands in a beautiful sand castle? Statistically unlikely. Probabilistically unlikely. But could it happen? Yes. And if it did, that would be going from a disordered to an ordered state, violating the second law of thermodynamics. So that's why this law is of a different character than what we are used to in physics.
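To put a hedged number on "improbable but not impossible", here is a small Python estimate using the toy box-of-particles picture from earlier rather than actual sand: the chance that random placement puts all N particles into the same pre-chosen half of a box falls off as (1/2)^N.

```python
# Rough sense of how suppressed a spontaneously "ordered" fluctuation is:
# the probability that all N independently placed particles land in the
# left half of a box is (1/2)**N. Never exactly zero, just absurdly small.

for n_particles in (10, 100, 1000):
    print(n_particles, 0.5 ** n_particles)
```

So, as described above, a second-law "violation" is allowed by the underlying dynamics; it is merely suppressed exponentially in the number of constituents, which is why it is a statistical tendency rather than an inviolable law.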

Science, Technology, Inspiration, Entropy, Physics, Psychology, Jordan B Peterson