By Ethan Y. Feng
If you’ve ever taken any sort of chemistry or physics class, then you’ve probably been taught the Second Law of Thermodynamics: the entropy of the universe always increases over time. But what is entropy? You have likely heard that entropy is a measure of “chaos” or “disorder.” But such definitions are vague and unquantifiable—how would one even measure an “amount of chaos”? Furthermore, when presenting the Second Law, professors presumably just told you to accept it as fact, with no explanation of why it is true, leaving curious minds unsatisfied. What the heck is entropy, and why does it increase? Concrete answers to these questions do exist, so this article aims to explain the real definition of entropy as well as provide an intuition for why the Second Law is true.
To start towards our goal, imagine we flipped 4 fair coins. Just based on past experience, you probably know that landing all 4 heads or all 4 tails is much less likely than an outcome with a mix of heads and tails. But why is this true, fundamentally? Each coin has a 50:50 chance of landing heads or tails, meaning that each permutation—HHHH, HHTT, THTH, etc.—is equally likely (with a probability of 0.5⁴, or 1 in 16). Now, suppose we write out every single possible outcome. If we categorize them according to the total number of heads and tails (see below), we see that there are far more permutations that fall into the mixed categories, compared to only one in each of the all-heads and all-tails categories. So, since all the permutations are equally likely, if we randomly picked one of these permutations (which is essentially what’s happening when we flip the coins), it’s far more likely that we would land within a category with a mix of heads and tails. This is why a mixed outcome is more likely.
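The enumeration above takes only a few lines to verify yourself—here is a minimal sketch (not part of the original article, just an illustration) that lists all 2⁴ outcomes and groups them by head count:

```python
# Enumerate all 2^4 equally likely outcomes of flipping 4 fair coins,
# then group them by the total number of heads.
from itertools import product
from collections import Counter

outcomes = list(product("HT", repeat=4))  # 16 permutations, each with probability (1/2)^4
counts = Counter(seq.count("H") for seq in outcomes)

for heads in sorted(counts):
    print(f"{heads} heads: {counts[heads]} permutations")
```

Running this shows 1 permutation with 0 heads, 4 with 1, 6 with 2, 4 with 3, and 1 with 4: the mixed categories hold 14 of the 16 equally likely permutations, which is exactly why a mixed outcome dominates.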
Now, suppose we adjust our example slightly. Instead of coins, imagine a closed box containing 4 gas particles, which move around randomly. Consider the left and right halves of the box: since the particles’ motion is random, each particle is equally likely to be on the left or the right side. Essentially, left vs. right for a particle is analogous to heads vs. tails for a coin. Then, let us ask the question: is it more likely that the particles are all on one side, or that some are on the left and others on the right? To answer this, we can write out every single possible state of the box, just as we did with the coins:
As we can see, there are far more arrangements in which the particles are spread out than all on one side. Since the particles move randomly, each arrangement is equally likely; hence, simply based on chance, an arrangement in which the particles are spread out is intrinsically more likely to occur.
Here, a one-side arrangement still has a 1 in 8 chance of occurring—low, but far from negligible—because we used a modest number of particles. But in the real world, no volume of gas is actually composed of only four molecules. Containers or rooms have trillions upon trillions of molecules—on the order of 10²³ particles! If we imagine scaling up our example, the arrangement in which every gas molecule is on one side quickly becomes overwhelmingly less likely than a spread-out arrangement, to the point that the odds are essentially zero. For those curious, based on this model, the chance that 1 gram of air in a box randomly accumulates on one side is so low that, if you allowed this box to sit for one million times the current age of the universe, you still would not expect it to happen even once, by a very comfortable margin!
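Under the coin-flip model, the probability that all N particles sit on one side is 2 × (1/2)ᴺ (all-left plus all-right). A short sketch of the scaling—assuming, for illustration, an average molar mass of about 29 g/mol for air, a figure not taken from the article itself:

```python
# How unlikely is it for all N gas molecules to gather on one side?
# Under the coin-flip model, P(all on one side) = 2 * (1/2)^N,
# so we work with base-10 logarithms to avoid underflow.
import math

def log10_prob_one_side(n_particles):
    """Base-10 log of the probability that all particles land on one side."""
    return math.log10(2) - n_particles * math.log10(2)

# Sanity check with 4 particles: probability should be 1/8,
# matching the enumeration of box states above.
print(10 ** log10_prob_one_side(4))  # ≈ 0.125, i.e. 1 in 8

# Roughly 1 gram of air: ~2e22 molecules (assuming ~29 g/mol).
n_air = 6.022e23 / 29
print(log10_prob_one_side(n_air))  # a number around -6e21
```

The result for 1 gram of air is a probability of about 1 in 10^(6×10²¹)—an exponent with over twenty digits—which is why no waiting time expressible in ages of the universe makes the event plausible.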
Indeed, you probably know that all of the air in the room you are sitting in will not suddenly accumulate on one side—air naturally spreads out. But the reason it spreads out is not because that is somehow more “disorderly.” Rather, assuming the gas molecules move about randomly, sheer probability dictates that it is so much more likely for the particles to spread out because there are far more permutations that achieve that state.
This is entropy! Formally, the entropy of any given state can be thought of as the number of ways in which that state can be achieved. Furthermore, entropy does not increase because of some mystical force that favors “disorder”—rather, probability alone dictates that the state with more ways of being achieved (i.e. more entropy) is exponentially more likely to occur.
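This counting definition is exactly what Boltzmann made quantitative with his famous formula S = k·ln(W), where W is the number of ways (microstates) a state can be achieved and k is Boltzmann’s constant. A minimal sketch applying it to our 4-coin categories:

```python
# Boltzmann's formula S = k_B * ln(W): entropy grows with the number
# of ways (W) a state can be achieved. Applied to the 4-coin example,
# where the multiplicity of "k heads" is the binomial coefficient C(4, k).
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def entropy(multiplicity):
    """Boltzmann entropy of a state with the given number of microstates."""
    return K_B * math.log(multiplicity)

for heads in range(5):
    w = math.comb(4, heads)  # ways to get this many heads
    print(f"{heads} heads: W = {w}, S = {entropy(w):.3e} J/K")
```

The all-heads and all-tails states, with W = 1, have zero entropy, while the 2-heads state, with W = 6, has the most—the “mixed” state is the high-entropy state, purely by counting.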
An important nuance of this definition is that the Second Law is not prescriptive: unlike the way in which Newton’s laws dictate that F must always equal ma, the Second Law of Thermodynamics does not state that the entropy of the universe must necessarily increase over time. Rather, it describes a probabilistic tendency—an overwhelmingly strong tendency, but a tendency nevertheless. It allows for the possibility that entropy can in theory decrease, just as it is always technically possible to flip all heads or all tails—it is simply very unlikely. In other words, while it is overwhelmingly likely for the universe’s entropy to increase, it is not necessary.
While this statistical definition, first pioneered by physicist Ludwig Boltzmann, provides the foundation for understanding entropy, it is also important to remember the assumption we made to get to this point: we presumed that all permutations of the particles are equally likely. Of course, in reality, this is not always the case. For example, some states might be more energetically favored; or if gravity is strong, particles may prefer the bottom of the box. Nevertheless, the basic scenario we considered is an excellent first approximation, and its appeal lies in its simplicity. While more refined models exist (and are useful), our basic particle-and-box model helps us make the first leap towards grasping entropy.
Although we arrived at an explanation of the Second Law from rudimentary hypotheticals, it is an extremely powerful one. Many of the quantities and principles fundamental to modern chemistry—including enthalpy, Gibbs free energy, and more—are derived from this very principle. More than just abstract theory, these concepts are the foundation of many influential fields of research today, notably computational simulations of proteins. When modeling molecules, computational researchers must specify “rules” that tell the atoms in the simulations how to move and behave—more often than not, part of these rules consist of this definition of entropy and its derivatives. For example, many drug design groups use simulations to virtually screen hundreds of preliminary molecules for promising drug candidates, which reduces the need for researchers to physically synthesize and test the molecules, saving time and resources. Ultimately, this is thanks in part to our rigorous definition: entropy is the number of ways in which a given state can be achieved, and it increases over time simply due to probability.
I’d like to thank Tim Berkelbach, whose physical chemistry course is where I learned many of these concepts; I highly recommend it to anyone interested in the theory behind the sciences. Many thanks as well to Yi Qu, who created the incredible illustrations you see.