## Entropy

“Only entropy comes easy.” – Anton Chekhov

**Entropy**, plainly defined, is a lack of order or predictability, or a gradual decline into disorder. Entropy in our world is ever increasing; with the following framework, I will explain why.

**Entropy**

Entropy is a measure of **disorder** in a system. If, for instance, your room is really tidy and organized, there is little entropy. When everything is lying around everywhere, there is a lot of entropy. In other instances, entropy is used to describe a lack of predictability or order, a decline into disorder. The framework above states that over time information increases; I will argue why information is equivalent to entropy, how these concepts are related, and why both are increasing.

**Arrow of Time**

My argument starts with the **arrow of time**: the travel from past to future. We cannot (in most cases) predict the future, but we can look back into the past. We can take actions to affect the future, but not the past. More practically, we can turn eggs into omelettes, but not the other way around. The arrow of time defines a distinction between past and future, one that holds throughout the observable universe. Over time, information increases in open systems, but let's first see how entropy behaves in closed systems.

**Second Law of Thermodynamics**

The second law of thermodynamics states that the entropy of a closed system will (practically) never decrease. If, for instance, we leave an ice cube in a glass, over time it will melt into water. To see that entropy has increased, we only need to look at the arrangement of its molecules: there are far fewer ways to arrange them into an ice cube than into a puddle of water. But what about putting the water back into the freezer? Won't that decrease entropy? The answer is no: you will burn calories carrying the glass, the freezer turns energy into heat, and overall the entropy of the whole system will still increase. Here are some more examples:
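The ice-cube argument can be made concrete by counting arrangements (microstates), in the spirit of Boltzmann's formula S = k·ln W. A minimal toy sketch, assuming a made-up model of particles in a two-sided box (the numbers are purely illustrative):

```python
import math

# Toy model: N gas particles distributed between two halves of a box.
# Boltzmann's formula S = k * ln(W) says entropy grows with W, the
# number of microscopic arrangements (microstates) of a given state.
N = 20

# "Ordered" state: all particles on the left half -> only 1 arrangement.
W_ordered = math.comb(N, 0)

# "Disordered" state: particles split evenly -> many arrangements.
W_mixed = math.comb(N, N // 2)

print(W_ordered)  # 1
print(W_mixed)    # 184756
```

The evenly mixed state can be realized in vastly more ways than the ordered one, so a system wandering at random among its arrangements is overwhelmingly likely to end up (and stay) mixed, which is exactly what the second law describes.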

**Examples**

- A campfire – the fire and the resulting warmth and ash are more dispersed (in terms of energy) than the original wood
- The Sun – now a big ball of plasma, it will one day (in the far future) expand and dissipate
- You – although your body may reduce entropy in the short term, in the long term your molecules will disperse again

**Quantum Mechanics**

Why does entropy increase? Why is there more entropy now than right after the Big Bang? Quantum mechanics is probabilistic, and every quantum event therefore increases the disorder in the universe. Let me explain: there is no way of predicting where an electron is going to be; you only have probabilities for where it might be. Therefore, if you measure an electron – if you define its position – you add information. But how, then, is information equal to entropy?
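The unpredictability of an individual measurement can be sketched with a simple simulation. Assuming a hypothetical two-outcome quantum state with made-up probabilities of 0.8 and 0.2 (per the Born rule, outcome probabilities are squared amplitudes), each measurement yields one definite but random result:

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the run is reproducible

# Hypothetical two-outcome state: probabilities chosen for illustration.
p0, p1 = 0.8, 0.2

# Each measurement produces one definite outcome at random; only the
# long-run frequencies are predictable, never an individual result.
outcomes = random.choices([0, 1], weights=[p0, p1], k=10_000)
counts = Counter(outcomes)

print(counts[0] / 10_000)  # close to 0.8
print(counts[1] / 10_000)  # close to 0.2
```

No amount of information about the state tells you the next single outcome; the result of each measurement is new information that did not exist before it was made.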

Information is normally associated with order. For example, the tidy room can easily give you the information of where your shirts are, or in which drawer your socks are. For ten different items of clothing, you will have ten points of information. Now consider the messy room: for every individual item you have to remember its exact spot, and there is no logical relationship between one sock and another. So if you have ten pieces of each of the ten items of clothing, you will have 100 different points of information. Randomness or disorder therefore equals information, and when one grows, so does the other. Along the arrow of time, entropy and information in the universe increase.
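The tidy-versus-messy-room intuition matches Shannon's formal measure of information, H = −Σ p·log₂(p). A minimal sketch, assuming a made-up room with ten possible spots per item:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Tidy room: each item is always in its one known spot -> no surprise,
# so locating it requires zero additional bits of information.
tidy = [1.0]

# Messy room: an item could be in any of 10 spots with equal chance,
# so each item needs log2(10) bits to pin down.
messy = [0.1] * 10

print(shannon_entropy(tidy))   # 0.0
print(shannon_entropy(messy))  # ~3.32 bits per item
```

In Shannon's sense, maximum disorder (every spot equally likely) is exactly the state that requires the most information to describe, which is the equivalence the paragraph above argues for.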

**When to Use**

What does this mean for us mere humans? Should we embrace entropy and aim for as much information as possible? My answer is no. A maximum amount of information does not necessarily give you a maximum amount of meaning. Meaning is derived from a balance of order and entropy. This is why we use models, frameworks, and theories: to order information while at the same time leaving room for randomness. This is where I believe we receive the most value and can learn the most.

On an ending note, I love that quantum measurement is not predictable. It means that we cannot predict the future, that all of life is not determined before us. As much as we know that entropy will increase, we do not know how and where. We have the power to shape our own future and to use entropy to increase information.

“Entropy isn’t what it used to be.” – Thomas F. Shubnell

More on **Entropy**:

https://www.youtube.com/watch?v=sMb00lz-IfE – Veritasium on Entropy

https://www.youtube.com/watch?v=G5s4-Kak49o – Vsauce on Entropy

http://en.wikipedia.org/wiki/Entropy – Wikipedia on Entropy