It's harder than you'd think to find a system that doesn't let energy in or out — our universe as a whole is one of the few examples — but entropy describes how disorder arises in any system, whether it's as large as the universe or as small as a thermos full of coffee. This tendency for energy to disperse is measured by entropy, usually written with the symbol *S*.

In the latter example, thermodynamic entropy increases when hot coffee is poured into a thermos. As that heat gradually escapes, entropy increases again, because the heat is dispersing beyond the thermos into its surroundings. Eventually the thermos reaches thermal equilibrium: its temperature settles at the ambient temperature of its surroundings and stays there.
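The gradual approach to equilibrium described above can be sketched with Newton's law of cooling, in which the temperature gap between the coffee and its surroundings shrinks exponentially. The starting temperature, ambient temperature, and cooling rate below are made-up illustrative values, not measurements:

```python
import math

def coffee_temperature(t_minutes, ambient=20.0, initial=90.0, k=0.05):
    """Newton's law of cooling: the gap between the coffee's temperature
    and the ambient temperature decays exponentially as heat disperses.
    (Toy model: ambient, initial, and k are hypothetical values.)"""
    return ambient + (initial - ambient) * math.exp(-k * t_minutes)

# The coffee approaches, but never drops below, the ambient temperature.
print(round(coffee_temperature(0), 1))    # starts hot
print(round(coffee_temperature(60), 1))   # noticeably cooler after an hour
print(round(coffee_temperature(600), 1))  # essentially at equilibrium
```

Note that the model never lets the coffee cool below ambient: equilibrium is where the dispersal stops, exactly as the paragraph above describes.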

However, entropy doesn't have to do with the kind of disorder you picture when you lock a bunch of chimpanzees in a kitchen. It has more to do with *how many possible permutations* of mess can be made in that kitchen than with how *big* a mess is possible. Of course, the entropy depends on several factors: how many chimpanzees there are, how much stuff is stored in the kitchen, and how big the kitchen is.

So, if you were to look at two kitchens — one very large, stocked to the gills, but meticulously clean, and another that's smaller, with less stuff in it, but already pretty trashed by chimps — it's tempting to say the messier kitchen has more entropy, but that's not necessarily the case. Entropy concerns itself more with *how many different states are possible* than with how disordered the system is at the moment. A system therefore has more entropy if it contains more molecules and atoms, and if it's larger. And if there are more chimps.
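This counting idea is exactly Boltzmann's formula, S = k·ln(W), where W is the number of possible arrangements. The sketch below uses a deliberately crude, hypothetical counting scheme — each item independently placed in any of a number of positions — just to show that a big, well-stocked kitchen admits more arrangements (and so more entropy) than a small one, clean or not:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(microstates):
    """S = k_B * ln(W): entropy from the count of possible arrangements."""
    return K_B * math.log(microstates)

def kitchen_microstates(items, positions):
    """Toy counting scheme (hypothetical): each item can independently
    sit in any position, so W = positions ** items."""
    return positions ** items

# The large, spotless kitchen still has more possible arrangements,
# and therefore more entropy, than the small trashed one.
big_clean = kitchen_microstates(items=50, positions=200)
small_messy = kitchen_microstates(items=10, positions=40)
print(boltzmann_entropy(big_clean) > boltzmann_entropy(small_messy))  # True
```

The point of the toy model is that entropy grows with the logarithm of the arrangement count, so adding items or space raises it regardless of how tidy the current arrangement happens to be.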