Entropy might be the truest scientific concept that the fewest people actually understand. It can be very confusing, partly because there are actually several different kinds of entropy. The Hungarian-American mathematician John von Neumann summed up the situation this way: "Whoever uses the term 'entropy' in a discussion always wins, since no one knows what entropy really is, so in a debate one always has the advantage."
"It is a little hard to define entropy," says Popovic. "Perhaps it is best defined as a non-negative thermodynamic property, which represents a part of energy of a system that cannot be converted into useful work. Thus, any addition of energy to a system implies that a part of the energy will be transformed into entropy, increasing the disorder in the system. Thus, entropy is a measure of disorder of a system."
But don't feel bad if you're confused. The definition can vary depending on which discipline is wielding it at the moment:
In the mid-19th century, the German physicist Rudolf Clausius, one of the founders of thermodynamics, was working on the efficiency of steam engines and invented the concept of entropy to quantify the energy that cannot be converted into useful work. A couple of decades later, the Austrian physicist Ludwig Boltzmann (entropy's other "founder") used the concept to explain the behavior of immense numbers of atoms: even though it is impossible to describe the behavior of every particle in a glass of water, a formula for entropy still makes it possible to predict how the particles will behave collectively when the water is heated.
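For readers who like to see the math, both founders' definitions fit on one line. These are the standard textbook forms: Clausius measured the change in entropy as reversibly exchanged heat divided by temperature, and Boltzmann, in a formula later engraved on his tombstone, counted the microscopic arrangements consistent with what we see:

```latex
% Clausius (1865): entropy change from a reversible heat flow \delta Q at temperature T
% Boltzmann (1870s): entropy of a macrostate realizable by W microstates;
% k_B is Boltzmann's constant
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T}
  \qquad\text{and}\qquad
  S = k_B \ln W
\]
```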
"In the 1960s, the American physicist E.T. Jaynes, interpreted entropy as information that we miss to specify the motion of all particles in a system," says Popovic. "For example, one mole of gas consists of 6 x 1023 particles. Thus, for us, it is impossible to describe the motion of each particle, so instead we do the next best thing, by defining the gas not through the motion of each particle, but through the properties of all the particles combined: temperature, pressure, total energy. The information that we lose when we do this is referred to as entropy."
And the terrifying concept of "the heat death of the universe" wouldn't be possible without entropy. Our universe most likely started out as a singularity (an infinitesimally small, highly ordered point of energy) and has been expanding ever since. As it expands, the universe's entropy keeps growing, because more space means more possible disordered arrangements for its atoms to adopt. Scientists have hypothesized that, long after you and I are gone, the universe will eventually reach a point of maximum disorder, at which point everything will be the same temperature, with no pockets of order (like stars and chimpanzees) to be found.
And if it happens, we'll have entropy to thank for it.