## Entropy and Puppies, like a Hand and a Glove

Ah yes, the good old 2nd Law of Thermodynamics. The idea that the total disorder (entropy) of an isolated system, e.g. the universe, always increases. Or that heat always flows from hot to cold. It's why coffee always gets cold, why money seems to dissipate at a casino, why time flows forward, why Murphy had a law, why cats and dogs don't tend to clean up the house.

Ultimately, due to this rather depressing physical law, the universe will die by "heat death," reaching a state of maximum entropy: no temperature differences left, no usable energy, no way to do work. Don't worry, that's not predicted for another 10^100 (a googol) years. But I always wondered, is it really always the case, or can entropy decrease in certain circumstances?

Got a spare fortnight? Google "violations of the second law of thermodynamics." Personally, I rather like Maxwell's idea that it is a statistical argument, not an absolute one. "Maxwell's Demon" is that hypothetical device that funnels fast (hot) molecules in one direction and slow (cold) ones in the other, thereby reversing the normal flow of heat. Could a nanotech device do that some day? Yes, I know that there has to be energy put into the system for the device to do its work, thereby enlarging the system upon which the 2nd law holds. But even without the demon, aren't there statistical instances of 2nd Law violation in a closed system? Not unlike the infinitesimal probability that someone's constituent atoms suddenly line up in such a manner that they can walk through a door (see recent blog topic), a system could become more coherent as time moves into the future.
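The statistical argument above is easy to put numbers on. Here's a toy Monte Carlo sketch (my own illustration, not anything from the physics literature): assume N non-interacting gas particles, each equally likely to be in either half of a box, and estimate how often a spontaneous "all particles on one side" fluctuation, i.e. a momentary entropy decrease, actually shows up.

```python
import random

def all_on_one_side_probability(n_particles, n_trials=100_000):
    """Estimate the chance that, at a random instant, every particle
    of an idealized gas happens to sit in the same half of the box."""
    hits = 0
    for _ in range(n_trials):
        # Each particle independently picks the left half with p = 0.5.
        left = sum(random.random() < 0.5 for _ in range(n_particles))
        if left == 0 or left == n_particles:
            hits += 1
    return hits / n_trials

# The exact probability is 2 / 2**n = 2**(1 - n): already tiny for a
# handful of particles, and utterly negligible for the ~10**23 molecules
# in a real sample of gas -- which is why we never see it happen.
for n in (2, 5, 10, 20):
    print(n, all_on_one_side_probability(n), 2 ** (1 - n))
```

So Maxwell was right that the law is statistical: such fluctuations are allowed, they're just exponentially suppressed in the particle count.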

What about lowering temperature to the point where superconductivity occurs? Isn't that less random than non-superconductivity? One might argue that the energy it takes to become superconductive exceeds the resulting decrease in entropy. However, I would argue that since the transition from conductive to superconductive occurs abruptly, there must be a time period, arbitrarily small, during which you would watch entropy decrease.

There are those who cite life and evolution as examples of building order out of chaos. Sounds reasonable to me, and the arguments against the idea sound circular and defensive. However, it all seems to net out in the end. Take a puppy, for instance. Evolutionary processes worked for millions of years to create the domestic dog. Entropy-decreasing processes seem to be responsible for the formation of a puppy from its original constituents, a sperm and an egg. But then the puppy spends years ripping up your carpet, chewing the legs of the furniture and shredding your favorite magazines into little pieces; in short, increasing the disorder of the universe. Net effect? Zero.