The second law of thermodynamics states that the entropy of any thermally
isolated system will either increase or remain constant. At the time Rudolf
Clausius first proposed this law (see [5]), there was no known
connection between entropy and any well-understood microscopic concept.
Clausius defined changes in entropy, *dS*, in terms of macroscopic
quantities: the transfer of energy in the form of heat, *dQ*, and the
temperature, *T*:

    dS = dQ/T

When entropy is defined in this way, the second law of thermodynamics may be stated mathematically: for an isolated system,

    ΔS ≥ 0

One important implication of this is that work cannot be extracted from a system at equilibrium. In order for a system to do work, it must be displaced from equilibrium, or must have been out of equilibrium to begin with.
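As a worked illustration of the Clausius definition (a hypothetical example, not from the original text), consider a fixed amount of heat Q flowing from a hot reservoir to a cold one, each held at constant temperature. Each reservoir's entropy change is Q/T, and the total change is always non-negative, in accordance with the second law:

```python
def total_entropy_change(q, t_hot, t_cold):
    """Total entropy change when heat q (joules) flows from a reservoir
    at t_hot to one at t_cold (temperatures in kelvin).

    Each reservoir is large enough that its temperature stays fixed,
    so dS = dQ/T integrates to a simple quotient."""
    ds_hot = -q / t_hot    # hot reservoir gives up heat: entropy decreases
    ds_cold = q / t_cold   # cold reservoir absorbs heat: entropy increases
    return ds_hot + ds_cold

# 100 J flowing from 400 K to 300 K:
ds = total_entropy_change(100.0, 400.0, 300.0)
# ds = 100/300 - 100/400 = 1/12 J/K, which is positive, as the second
# law requires whenever t_hot > t_cold
```

Because the cold reservoir's entropy gain (Q/T_cold) exceeds the hot reservoir's loss (Q/T_hot) whenever T_hot > T_cold, the spontaneous direction of heat flow is exactly the one that increases total entropy.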

This law was found to have accurate and useful consequences, but there was no intuitive notion of what entropy signified until Ludwig Boltzmann proposed a statistical definition of entropy:

    S = k ln W

where *W* is the number of possible microstates the macroscopic system
may be in, and *k* is Boltzmann's constant. The entropy *S* may be seen as a
measure of the degree of randomness present in the state of the system.
Boltzmann's definition of entropy is consistent with Clausius's
understanding if heat is taken to be the change in the energy of
randomly moving particles. [9]
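Boltzmann's formula can be evaluated directly for a simple model system (a sketch under assumed conditions, not part of the original text): N two-state particles, where the macrostate is the number of particles in the "up" state and W is the binomial count of microstates realizing it. Using log-gamma avoids overflowing the factorials:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

def boltzmann_entropy(n_total, n_up):
    """S = k ln W for n_total two-state particles with n_up in the
    'up' state, where W = C(n_total, n_up) counts the microstates
    compatible with that macrostate."""
    ln_w = (math.lgamma(n_total + 1)
            - math.lgamma(n_up + 1)
            - math.lgamma(n_total - n_up + 1))
    return K_B * ln_w

# The ordered macrostate (all particles up) has W = 1, so S = 0,
# while the most disordered macrostate (half up) has the largest W
# and therefore the largest entropy:
s_ordered = boltzmann_entropy(100, 100)
s_mixed = boltzmann_entropy(100, 50)
```

The comparison makes the "degree of randomness" reading concrete: macrostates realizable in more ways carry more entropy, and the perfectly ordered state carries none.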
