Entropy: The Invisible Force That Brings Disorder to the Universe

By: Jesslyn Shields
Entropy describes how disorder happens in a system as large as the universe or as small as a thermos full of coffee. Jose A. Bernat Bacete/Getty Images

You can't easily put the toothpaste back into the tube. You can't expect molecules of steam to spontaneously migrate back together to form a ball of water. If you release a bunch of corgi puppies into a field, it's very unlikely you're going to be able to get them all back into a crate without doing a ton of work. These are the problems associated with the Second Law of Thermodynamics, also known as the Law of Entropy. Entropy shows up in many guises across science, so let's dive into this mysterious term.

The Second Law of Thermodynamics

Thermodynamics is important to a wide range of disciplines, from engineering to the natural sciences to chemistry, physics and even economics. A key thermodynamic concept is the isolated system: one that exchanges neither energy nor matter with its surroundings.

The first law of thermodynamics has to do with the conservation of energy. The energy in an isolated system remains constant ("energy can neither be created nor destroyed," as the Law of Conservation of Energy puts it) unless it's tampered with from the outside. However, energy constantly changes form: a fire can turn chemical energy from a plant into thermal and electromagnetic energy, and a battery turns chemical energy into electrical energy. As the world turns, energy becomes less organized.
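
In symbols, the first law is often written as a simple energy ledger, where ΔU is the change in a system's internal energy, Q is the heat added to the system and W is the work the system does on its surroundings (sign conventions vary between textbooks):

$$\Delta U = Q - W$$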

"The second law of thermodynamics is called the entropy law," Marko Popovic, a postdoctoral researcher in Biothermodynamics in the School of Life Sciences at the Technical University of Munich, told us in an email. "It is one of the most important laws in nature."

Thermodynamic entropy is a measure of the disorder in a closed system. According to the second law, the total entropy of an isolated system can never decrease; as entropy grows, more and more of the system's energy becomes unavailable for useful work. If it isn't harnessed somehow, thermal energy gets dispersed. Because the measure of entropy is based on probabilities, it is, of course, possible for the entropy of a system to decrease on occasion, but that's statistically very unlikely.
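
Stated compactly, for any process in an isolated system the entropy change is non-negative:

$$\Delta S \geq 0$$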

The Definition of Disorder

It's harder than you'd think to find a system that doesn't let energy in or out (our universe is a good example of that), but entropy describes how disorder happens in a system as large as the universe or as small as a thermos full of coffee. This tendency of energy to spread out and disperse is exactly what entropy, usually written S, quantifies.

In the latter example, thermodynamic entropy increases when hot coffee is poured into the thermos. As the coffee's heat gradually leaks away, entropy increases again, because that heat is dispersing beyond the thermos into the surroundings. Eventually, the thermos reaches thermal equilibrium: its temperature settles at the ambient temperature of its surroundings and stops changing.
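
To make that concrete, here's a back-of-the-envelope sketch in Python of the classic ΔS = Q/T bookkeeping. The heat value and temperatures are invented for illustration, and the coffee's temperature is treated as roughly constant during the transfer:

```python
# Entropy bookkeeping for heat leaking from hot coffee into a cooler room.
# The heat value and temperatures below are invented for illustration.
Q = 1000.0        # joules of heat that leak out of the coffee
T_coffee = 350.0  # kelvin (about 77 degrees C), treated as roughly constant
T_room = 293.0    # kelvin (about 20 degrees C)

dS_coffee = -Q / T_coffee  # the coffee loses entropy as it sheds heat
dS_room = Q / T_room       # the cooler room gains more entropy than the coffee lost
dS_total = dS_coffee + dS_room

print(f"coffee: {dS_coffee:+.2f} J/K")  # -2.86 J/K
print(f"room:   {dS_room:+.2f} J/K")    # +3.41 J/K
print(f"total:  {dS_total:+.2f} J/K")   # +0.56 J/K, positive as the second law demands
```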

However, entropy doesn't have to do with the type of disorder you think of when you lock a bunch of chimpanzees in a kitchen. It has more to do with how many possible permutations of mess can be made in that kitchen than with how big any one mess actually is. Of course, the entropy depends on a lot of factors: how many chimpanzees there are, how much stuff is stored in the kitchen and how big the kitchen is.

So, if you were to look at two kitchens (one very large and stocked to the gills but meticulously clean, and another that's smaller, with less stuff in it, but already pretty trashed by the chimps), it's tempting to say the messier room has more entropy, but that's not necessarily the case. Entropy concerns itself more with how many different states are possible than with how disordered the room is at the moment; a system therefore has more entropy if it's larger and holds more molecules and atoms. And if there are more chimps.
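
Here's a toy sketch of that idea using Boltzmann's relation between entropy and the number of possible arrangements; the microstate counts are invented purely for illustration:

```python
import math

# Toy model: entropy tracks how many arrangements (microstates) are
# possible, not how messy one particular snapshot looks.
# The microstate counts below are invented purely for illustration.
k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(microstates: float) -> float:
    """S = k_B * ln(W), where W counts the possible arrangements."""
    return k_B * math.log(microstates)

big_clean_kitchen = 1e30    # lots of space and stuff: many possible arrangements
small_messy_kitchen = 1e12  # cramped and sparse: far fewer arrangements

print(boltzmann_entropy(big_clean_kitchen))   # ~9.5e-22 J/K, the larger entropy
print(boltzmann_entropy(small_messy_kitchen)) # ~3.8e-22 J/K
```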

Entropy Is Confusing

Entropy might be the truest scientific concept that the fewest people actually understand. The concept of entropy can be very confusing, partly because there are actually different types: negative entropy, excess entropy, system entropy, total entropy, maximum entropy and zero entropy, just to name a few! The Hungarian-American mathematician John von Neumann put it this way: "Whoever uses the term 'entropy' in a discussion always wins since no one knows what entropy really is, so in a debate one always has the advantage."

"It is a little hard to define entropy," says Popovic. "Perhaps it is best defined as a non-negative thermodynamic property, which represents a part of energy of a system that cannot be converted into useful work. Thus, any addition of energy to a system implies that a part of the energy will be transformed into entropy, increasing the disorder in the system. Thus, entropy is a measure of disorder of a system."

How the Word Entropy Gets Around

But don't feel bad if you're confused: the definition can vary depending on which discipline is wielding it at the moment. In the mid-19th century, the German physicist Rudolf Clausius, one of the founders of the field of thermodynamics, was working on a problem concerning efficiency in steam engines and invented the concept of entropy to help measure the useless energy that cannot be converted into useful work.
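
Clausius's definition ties entropy to heat flow: transfer a small amount of heat δQ reversibly at absolute temperature T, and the entropy S changes by

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T}$$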

A couple of decades later, the Austrian physicist Ludwig Boltzmann (entropy's other "founder") used the concept to explain the behavior of immense numbers of atoms: even though it is impossible to describe the behavior of every particle in a glass of water, it is still possible to predict their collective behavior when they are heated by using a formula for entropy.
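
That formula, famously engraved on Boltzmann's tombstone, links entropy S to the number of microscopic arrangements W that are consistent with what we can observe, via Boltzmann's constant k_B:

$$S = k_B \ln W$$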

"In the 1960s, the American physicist E.T. Jaynes, interpreted entropy as information that we miss to specify the motion of all particles in a system," says Popovic. "For example, one mole of gas consists of 6 x 1023 particles. Thus, for us, it is impossible to describe the motion of each particle, so instead we do the next best thing, by defining the gas not through the motion of each particle, but through the properties of all the particles combined: temperature, pressure, total energy. The information that we lose when we do this is referred to as entropy."

Entropy and the Heat Death of the Universe

Finally, the terrifying concept of the "heat death of the universe" wouldn't be possible without entropy. Because our universe most likely started out as a singularity (an infinitesimally small, highly ordered point of energy) that ballooned out and continues expanding all the time, entropy is constantly growing in our universe. After all, there's more space and therefore more potential states of disorder for the atoms here to adopt.

Scientists have hypothesized that, long after you and I are gone, the universe will eventually reach some point of maximum disorder, at which point everything will be the same temperature, with no pockets of order (like stars and chimpanzees) to be found. With no temperature differences left, no energy will be available to do useful work anywhere. And if it happens, we'll have entropy to thank for it.
