Cut and paste from here:
http://creationevolutionuniversity.com/forum/index.php?sid=dc689c528201d321fd712b8ea831c21a
[Adapted from the essay, A Designed Object's Entropy Must Increase for its Complexity to Increase, Part 2]
THERMODYNAMICS AND STATISTICAL MECHANICS BASICS
Classical thermodynamics traces some of its roots to the work of Carnot in 1824 and his quest to improve the efficiency of steam engines. In 1865 Clausius published a paper describing his conception of entropy. I will adapt his formula here:
[tex]\Delta S=\int_{initial}^{final} \frac{\delta Q}{T}[/tex]
where S is entropy, Q is heat, and T is temperature. To make the formula more accessible, suppose we have a 1000-watt heater running for 100 seconds, contributing heat to water that is already boiling (at 373.2°K). What is the entropy contribution due to this burst of energy from the heater? First I calculate the amount of heat energy put into the water:
[tex]\Delta E = \Delta Q=(1000 watts) (100 sec)(\frac{J/sec}{watt})=100,000J[/tex]
Using Clausius' formula, and the fact that the process is isothermal (T is constant and can be pulled out of the integral), I then calculate the change of entropy of the water as:
[tex]\Delta S=\int_{initial}^{final} \frac{\delta Q}{T} =\frac{1}{T}\int_{initial}^{final} \delta Q =\frac{1}{T} \Delta Q=\frac {100,000J}{373.2^\circ K} \approx 268 \frac{J}{^\circ K}[/tex]
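For readers who want to check the arithmetic, here is a minimal Python sketch of the calculation above (the variable names are mine, chosen purely for illustration):

[code]
# Entropy added to boiling water by a 1000 W heater running for 100 s,
# with the water held at a constant 373.2 K (isothermal process).
power_watts = 1000.0        # heater power, J/s
time_seconds = 100.0        # duration of the burst
temperature_kelvin = 373.2  # temperature of the boiling water

delta_Q = power_watts * time_seconds    # heat input: 100,000 J
delta_S = delta_Q / temperature_kelvin  # Clausius entropy change, J/K

print(f"Heat input:     {delta_Q:.0f} J")    # 100000 J
print(f"Entropy change: {delta_S:.1f} J/K")  # about 268 J/K
[/code]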
So how does all this relate to Boltzmann and statistical mechanics? Scientists had the intuition that thermodynamics could be related to classical (Newtonian) mechanics. They suspected that what we perceive as heat and temperature could be explained in terms of the mechanical behavior of large numbers of particles, specifically the statistical aspects of that behavior, hence the name of the discipline: statistical mechanics.
A system of particles in physical space can be described in terms of the positions and momenta of the particles. The state of the entire system of particles can then be expressed as a single point in a conceptual phase space. We can slice this phase space up into a finite number of chunks because of the Liouville Theorem. These chunks correspond to the microstates in which the system can be found, and furthermore the probability of the system being in any given microstate is the same for each microstate (they are equiprobable). Boltzmann made the daring claim that the logarithm of the number of microstates is related to the entropy Clausius defined for thermodynamics. The modern form of Boltzmann's daring assertion is:
[tex]S=k_{B}ln\Omega[/tex]
where Ω is the number of microstates of the system, S is the entropy, and k_B is Boltzmann's constant. Using Boltzmann's formula we can then compute the change of entropy:
[tex]\Delta S = S_{final}-S_{initial}=k_{B} \ln \Omega_{final} - k_{B} \ln \Omega_{initial}[/tex]
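As a toy illustration of how the formula is used (the microstate counts below are made-up numbers, purely for illustration, not a physical system):

[code]
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(omega: float) -> float:
    """S = k_B * ln(omega): entropy from a count of microstates."""
    return k_B * math.log(omega)

# Hypothetical counts: the system goes from 1e20 to 1e24 accessible microstates.
S_initial = boltzmann_entropy(1e20)
S_final   = boltzmann_entropy(1e24)
print(f"Delta S = {S_final - S_initial:.3e} J/K")  # k_B * ln(1e24 / 1e20)
[/code]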
As I pointed out, Boltzmann's equation looks hauntingly similar to Shannon's entropy formula for the special case where the microstates of a Shannon information system are equiprobable.
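To make the similarity explicit: for Ω equiprobable microstates, each with probability p_i = 1/Ω, Shannon's entropy reduces to the logarithm of Ω, differing from Boltzmann's formula only in the constant out front and the base of the logarithm:
[tex]H=-\sum_{i=1}^{\Omega} p_{i} \log_{2} p_{i} = -\sum_{i=1}^{\Omega} \frac{1}{\Omega} \log_{2} \frac{1}{\Omega} = \log_{2} \Omega[/tex]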
Around 1877 Boltzmann published his paper connecting thermodynamics to statistical mechanics. This was the major breakthrough that finally bridged the heretofore disparate fields of thermodynamics and classical mechanics: under certain conditions we can relate Clausius' notion of entropy to Boltzmann's. Here is how I describe symbolically the special case where Clausius' entropy agrees with Boltzmann's:
[tex]\Delta S=\int_{initial}^{final} \! \frac{\delta Q}{T} = k_{B} \ln \Omega_{final}- k_{B} \ln \Omega_{initial}[/tex]
or equivalently
[tex]\Delta S= k_{B} \ln \frac{\Omega_{final}}{\Omega_{initial}} = k_{B} \ln \Delta \Omega[/tex]
where
[tex]\Delta \Omega = \frac{\Omega_{final}}{\Omega_{initial}}[/tex]
(Note that ΔΩ as I use it here is a ratio, not a difference: subtracting the logarithms gives the logarithm of the ratio of the final to the initial number of microstates.)
[It should be noted, I'm not sure the above equality will always hold, because Boltzmann's entropy is more general than Clausius' entropy.]
Taking the boiling water example above, we can now calculate [tex]\Delta \Omega[/tex], the factor by which the number of microstates increases. First we start with the equality relating Clausius' entropy to Boltzmann's:
[tex]\Delta S=\int_{initial}^{final} \! \frac{\delta Q}{T} = k_{B} \ln \Delta \Omega[/tex]
divide both sides by Boltzmann's constant to isolate the logarithm on the right hand side:
[tex]\frac{\Delta S}{k_{B}}=\frac{\int_{initial}^{final} \! \frac{\delta Q}{T}}{k_{B}} = \ln \Delta \Omega[/tex]
now using the above value of [tex]\Delta S[/tex] and the fact that
[tex]k_{B}=1.38 \cdot 10^{-23} \frac{J}{^\circ K}[/tex]
[tex]\frac{\Delta S}{k_{B}}=\ln \Delta \Omega=\frac{268 \frac{J}{^\circ K}}{1.38 \cdot 10^{-23} \frac{J}{^\circ K}}= 1.94 \cdot 10^{25}[/tex]
exponentiating both sides of
[tex]\ln \Delta \Omega=1.94 \cdot 10^{25}[/tex]
we have
[tex]e^{\ln \Delta \Omega}=e^{1.94 \cdot 10^{25}}[/tex]
which reduces to
[tex]\Delta \Omega=e^{1.94 \cdot 10^{25}} = 10^{\log_{10}(e) \cdot 1.94 \cdot 10^{25}} \approx 10^{8.43 \cdot 10^{24}}[/tex]
In other words, the number of microstates increases by a factor of roughly a 1 followed by 8.43 ⋅ 10^24 zeros.
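The number [tex]e^{1.94 \cdot 10^{25}}[/tex] is far too large to evaluate directly in floating point, so in practice one works with its logarithm. Here is a minimal Python sketch of the conversion above (the numbers are simply carried over from the boiling-water example):

[code]
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
delta_S = 268.0     # entropy change from the boiling-water example, J/K

ln_delta_omega = delta_S / k_B  # ln(Omega_final / Omega_initial), about 1.94e25

# Delta Omega itself would overflow a float, so report it as a power of ten:
log10_delta_omega = ln_delta_omega * math.log10(math.e)

print(f"ln(Delta Omega)  ~ {ln_delta_omega:.3e}")          # about 1.94e25
print(f"Delta Omega      ~ 10^({log10_delta_omega:.3e})")  # about 10^(8.43e24)
[/code]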
NOTE:
From the Entropy Website, quoting Boltzmann:
"In order to explain the fact that the calculations based on this assumption ["...that by far the largest number of possible states have the characteristic properties of the Maxwell distribution..."] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered -- and therefore very improbable -- state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state." (Final paragraph of #87, p. 443.)
That slight, innocent paragraph of a sincere man -- but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law -- that paragraph and its similar nearby words are the foundation of all dependence on "entropy is a measure of disorder". Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving 'disorder' and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn't. Boltzmann was the source and no one bothered to challenge him. Why should they?
Boltzmann's concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy's relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, "What is entropy, really?" that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was "Learn how to calculate changes in entropy. Then you will understand what entropy 'really is'."
There is no basis in physical science for interpreting entropy change as involving order and disorder.