

Entropy is a measure of the randomness of particles and energy in a system. High entropy means high randomness. Low entropy means low randomness.
Imagine building a castle from blocks. The blocks start out in a disorganized pile. The pile has high randomness because there are so many ways you could arrange the blocks and still have just a pile. This is a high-entropy state.
Over time, you pick blocks from the pile and arrange them into turrets and walls. A castle requires a very specific setup. You can only arrange the blocks a few ways to make it. So the castle is a less random, lower entropy state.
Gradually, the messy, high-entropy pile becomes an organized, low-entropy structure. You have to spend time and energy to do this. Maintaining or increasing order always requires energy input.
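The pile-versus-castle picture can be made quantitative with Boltzmann's entropy formula, S = k ln W, where W is the number of arrangements (microstates) that all look the same from the outside. Here is a minimal sketch in Python; the block counts and the idea that any ordering of blocks "looks like a pile" are illustrative assumptions, not figures from the article:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(num_arrangements: int) -> float:
    """S = k_B * ln(W): entropy of a state with W equally likely arrangements."""
    return K_B * math.log(num_arrangements)

# Illustrative counts: suppose any ordering of 20 blocks still "looks like
# a pile" (20! orderings), but only a handful of arrangements make a castle.
w_pile = math.factorial(20)   # roughly 2.4e18 pile-like arrangements
w_castle = 6                  # only a few castle-like arrangements

s_pile = boltzmann_entropy(w_pile)
s_castle = boltzmann_entropy(w_castle)

# More possible arrangements means higher entropy.
assert s_pile > s_castle
```

The exact numbers do not matter; the point is that entropy grows with the logarithm of the number of arrangements, so states with vastly more ways to occur (piles) have higher entropy than highly specific ones (castles).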
Entropy explains much of how the universe works. Scientists long ago identified a basic pattern: entropy tends to increase over time. This pattern, known as the second law of thermodynamics, appears everywhere from the tiniest scales up to the galactic level.
For example, when a massive star dies, it explodes in a supernova. This event increases entropy in two ways. For one, it disrupts the star’s organized structure and flings atoms randomly into the cosmos. Entropy increases because the randomness of the star’s particles has increased.
Second, a supernova releases much of the star's stored energy as heat. What we feel as heat is the random jittering of tiny particles. So you can think of heat as a high-entropy form of energy.
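Why heat counts as high-entropy energy can be seen in the standard thermodynamic relation ΔS = Q/T: when an amount of heat Q flows from a hot region to a cold one, total entropy always goes up. A small worked example in Python, with made-up illustrative temperatures and heat amounts:

```python
# Heat Q flowing from a hot body to a cold body changes total entropy by
#   dS_total = Q / T_cold - Q / T_hot,
# which is positive whenever T_hot > T_cold. (Values below are illustrative.)
q = 100.0       # joules of heat transferred
t_hot = 500.0   # temperature of the hot body, in kelvin
t_cold = 300.0  # temperature of the cold body, in kelvin

ds_total = q / t_cold - q / t_hot  # about 0.13 J/K

# Heat flowing "downhill" from hot to cold always raises total entropy.
assert ds_total > 0
```

This is why energy that ends up as heat spread through cooler surroundings, as after a supernova, represents an increase in entropy: the same joules now sit at lower temperature, in more random particle motion.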
Knowing that everything in the cosmos tends toward chaos can help predict how the universe will end.
Mathematicians have even recently explained an interesting pattern in the rising entropy of shattering objects.






