_The Definitions of Entropy_

  * [1]Introduction: Entropy Defined
  * [2]Entropy & Classical Thermodynamics
  * [3]Entropy & Physical Chemistry
  * [4]Entropy & Statistical Mechanics
  * [5]Entropy & Quantum Mechanics
  * [6]Entropy & Information Theory
  * [7]Is Entropy a Measure of "Disorder"?

_Introduction: Entropy Defined_

The popular literature is littered with articles, papers, books, and various & sundry other sources, filled to overflowing with prosaic explanations of entropy. But it should be remembered that entropy, an idea born from classical thermodynamics, is a quantitative entity, not a qualitative one. That means that entropy is not something that is fundamentally intuitive, but something that is fundamentally defined via an equation, via mathematics applied to physics. Remember in your various travails that _entropy is what the equations define it to be_. There is no such thing as an "entropy" without an equation that defines it.

Entropy was born as a state variable in classical thermodynamics. But the advent of statistical mechanics in the late 1800's created a new look for entropy. It did not take long for Claude Shannon to borrow the Boltzmann-Gibbs formulation of entropy for use in his own work, inventing much of what we now call _information theory_. My goal here is to show how entropy works, in all of these cases, not as some fuzzy, ill-defined concept, but rather as a clearly defined, mathematical & physical quantity, with well understood applications.

_Entropy and Classical Thermodynamics_

Classical thermodynamics developed during the 19th century, its primary architects being [8]Sadi Carnot, [9]Rudolf Clausius, [10]Benoit Clapeyron, [11]James Clerk Maxwell, and [12]William Thomson (Lord Kelvin). But it was Clausius who first explicitly advanced the idea of _entropy_ (_On Different Forms of the Fundamental Equations of the Mechanical Theory of Heat_, 1865; _The Mechanical Theory of Heat_, 1867). The concept was expanded upon by Maxwell (_Theory of Heat_, Longmans, Green & Co. 1888; Dover reprint, 2001). The specific definition, which comes from Clausius, is shown in equation 1 below.

_S = Q/T_   Equation 1

In equation 1, _S_ is the _entropy_, _Q_ is the _heat content_ of the system, and _T_ is the _temperature_ of the system. At this time, the idea of a gas being made up of tiny molecules, with temperature representing their average kinetic energy, had not yet appeared. Carnot & Clausius thought of heat as a kind of fluid, a conserved quantity that moved from one system to the other. It was Thomson who seems to have been the first to recognize explicitly that this could not be the case, because it was inconsistent with the manner in which mechanical work could be converted into heat. Later in the 19th century the molecular theory became predominant, mostly due to Maxwell, Thomson and [13]Ludwig Boltzmann, but we will cover that story later. Suffice for now to point out that what they called _heat content_, we would now more commonly call the _internal heat energy_.
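Equation 1 is simple enough to check with numbers. Here is a minimal sketch in Python; the heat content and temperature are invented values, chosen purely for illustration.

    # Minimal numeric sketch of equation 1: S = Q/T.
    # The values of Q and T are assumptions, chosen only for illustration.
    Q = 600.0   # internal heat energy of the system, in Joules
    T = 300.0   # the single equilibrium temperature of the system, in Kelvin
    S = Q / T   # classical entropy, in Joules per Kelvin
    print(f"S = {S:.1f} J/K")   # prints: S = 2.0 J/K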
The temperature of the system is an explicit part of this classical definition of entropy, and a system can only have "a" temperature (as opposed to several simultaneous temperatures) if it is in thermodynamic equilibrium. So, _entropy in classical thermodynamics is defined only for systems which are in thermodynamic equilibrium_. So long as the temperature is a constant, it's a simple enough exercise to differentiate equation 1 and arrive at equation 2.

_ΔS = ΔQ/T_   Equation 2

Here the symbol "Δ" is a representation of a finite increment, so that _ΔS_ indicates a "change" or "increment" in _S_, as in _ΔS = S1 - S2_, where _S1_ and _S2_ are the entropies of two different equilibrium states, and likewise _ΔQ_. If _ΔQ_ is positive, then so is _ΔS_, so if the internal heat energy goes up while the temperature remains fixed, then the entropy _S_ goes up. And if the internal heat energy _Q_ goes down (_ΔQ_ is a negative number), then the entropy will go down too.

Clausius and the others, especially Carnot, were much interested in the ability to convert mechanical work into heat energy, and vice versa. This idea leads us to an alternate form of equation 2 that will be useful later on. Suppose you pump energy, _ΔU_, into a system; what happens? Part of the energy goes into the internal heat content, _Q_, making _ΔQ_ a positive quantity, but not all of it. Some of that energy could easily be expressed as an amount of mechanical work done by the system (_ΔW_, such as a hot gas pushing against a piston in a car engine). So _ΔQ = ΔU - ΔW_, where _ΔU_ is the energy input to the system, and _ΔW_ is the part of that energy that goes into doing work. The difference between them is the amount of energy that does not participate in the work, and goes into the heat reservoir as _ΔQ_. So a simple substitution allows equation 2 to be re-written as equation 3.

_ΔS = (ΔU - ΔW)/T_   Equation 3

This alternate form of the equation works just as well for heat taken out of a system (_ΔU_ is negative) or work done on a system (_ΔW_ is negative). So now we have a better idea of the classical relation between work, energy and entropy.
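As a quick check of the bookkeeping in equations 2 and 3, here is a minimal Python sketch; the energy, work and temperature values are assumptions invented only for illustration.

    # Minimal sketch of equations 2 and 3 (all numbers are invented for illustration).
    T       = 350.0    # equilibrium temperature, Kelvin
    delta_U = 1000.0   # energy pumped into the system, Joules
    delta_W = 300.0    # portion of that energy expressed as mechanical work, Joules

    delta_Q = delta_U - delta_W   # the energy left over for the heat reservoir
    delta_S = delta_Q / T         # equation 3 (equivalently, equation 2 applied to delta_Q)

    print(f"dQ = {delta_Q:.0f} J, dS = {delta_S:.2f} J/K")   # dQ = 700 J, dS = 2.00 J/K

The same arithmetic covers the sign conventions mentioned above: a negative delta_U for heat withdrawn from the system, or a negative delta_W for work done on the system.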
Before we go on to the more advanced topic of statistical mechanics, we will take a useful moment to apply this to classical chemistry.

_Entropy and Physical Chemistry_

At the same time that engineers & physicists were laying the foundations for thermodynamics, the chemists were not being left out. Classical entropy plays a role in chemical reactions, and that role is exemplified in equation 4 below.

_ΔS = (ΔH - ΔF)/T_   Equation 4

Of course, this looks just like equation 3 with different letters, and so it is. Here we are not much interested in the physicist's approach of describing the state of a "static" system, as does equation 1. The real interest for the chemist is to predict whether or not a given chemical reaction will go. In equation 4, _H_ is the _enthalpy_, and _F_ is the _free energy_ (also known as the _Gibbs free energy_). Likewise, _ΔH_ and _ΔF_ are incremental variations of those quantities, and _ΔS_ is the incremental change in the entropy of the chemical system, in the event of a chemical reaction. A little algebra, leading to equation 5, may make things a little easier to see.

_ΔF = ΔH - TΔS_   Equation 5

The significance of this equation is that it is the value of _ΔF_ which tells you whether any given chemical reaction will go forward spontaneously, or whether it needs to be pumped.

The enthalpy, _H_, is the heat content of the system, and so the change in enthalpy, _ΔH_, is the change in heat content of the system. If that value is smaller than _TΔS_, then _ΔF_ will be negative, and the reaction will proceed spontaneously; the _TΔS_ term represents the ability to do the work required to make the reaction happen. However, if _ΔF_ is positive, such that _ΔH_ is greater than _TΔS_, then the reaction will not happen spontaneously; we still need at least _ΔF_ worth of energy to make it happen. Note that a positive free energy does not mean that the reaction will not happen, only that it will not happen _spontaneously_ in the given environment. It can still be pushed or pumped into happening by adding energy, or by setting the reaction in a higher temperature environment, making _T_ larger as well as _TΔS_, and perhaps driving it far enough to make _ΔF_ negative.
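Here is a short numeric sketch of equation 5 in Python. The enthalpy and entropy changes are made-up values for illustration only; they describe no particular reaction.

    # Spontaneity check via equation 5: dF = dH - T*dS.
    # All numbers are invented for illustration; they describe no particular reaction.
    def delta_F(delta_H, T, delta_S):
        """Change in free energy for a reaction at temperature T (Kelvin)."""
        return delta_H - T * delta_S

    delta_H = 40_000.0   # change in enthalpy, Joules
    delta_S = 120.0      # change in entropy, Joules per Kelvin

    for T in (300.0, 400.0):                      # try a cooler and a hotter environment
        dF = delta_F(delta_H, T, delta_S)
        verdict = "spontaneous" if dF < 0 else "must be pumped"
        print(f"T = {T:.0f} K: dF = {dF:+.0f} J -> {verdict}")
    # T = 300 K: dF = +4000 J -> must be pumped
    # T = 400 K: dF = -8000 J -> spontaneous

Raising the temperature enlarges the TΔS term, which is exactly the lever described above for driving a reluctant reaction forward.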
_Entropy and Statistical Mechanics_

In the late 1800's, [14]Maxwell, [15]Ludwig Boltzmann and [16]Josiah Willard Gibbs extended the ideas of classical thermodynamics, through the new "molecular theory" of gases, into the domain we now call _statistical mechanics_. In classical thermodynamics, we deal with the system as a single, extended whole, whereas in statistical mechanics we recognize the role of the tiny constituents of the system. The temperature, for instance, of a system defines a _macrostate_, whereas the kinetic energy of each molecule in the system defines a _microstate_. The macrostate variable, temperature, is recognized as an expression of the average of the microstate variables, an average kinetic energy for the system. Hence, if the molecules of a gas move faster, they have more kinetic energy, and the temperature naturally goes up.

Equation 6 below is the general form of the definition of entropy in statistical mechanics, as first derived by Boltzmann. You can see Boltzmann's own derivation in his _Lectures on Gas Theory_ (available as a Dover reprint), but more modern treatments might be easier to follow, such as _Statistical Physics_ by Gregory H. Wannier, or _The Principles of Statistical Mechanics_ by Richard C. Tolman (both also available as Dover reprints).

_S = -k·Σ_i [P_i·log(P_i)]_   Equation 6

In this equation, _P_i_ is the probability that the system will be found in microstate "_i_", and all of the _P_i_ are evaluated for the same macrostate of the system. The symbol Σ (an upper case Greek _sigma_) is a mathematical instruction to add up everything to the right of it; in this case, it means to add up the product of _P_i_ times _log(P_i)_ over all of the "_i_" microstates. The "_k_" out in front is an arbitrary constant, which determines the units of measure of entropy; in thermodynamics it is _Boltzmann's constant_ (1.380658×10⁻²³ Joules/Kelvin), but its value could just as easily be set to 1 without affecting the generality of the arguments presented here. The negative sign is there because each probability is a number between 0 and 1, so its logarithm will always be negative; the negative sign out front cancels the negative sign induced by taking the log of a number less than 1.

In contrast to the definition seen in [17]equation 1, neither the temperature nor the heat energy appears explicitly in this equation. However, the restriction that all of the microstate probabilities must be calculated for the same macrostate ensures that, as in the earlier case, the system must be in a state of thermal equilibrium.

Equation 6 treats the microstate probabilities individually. However, if all of the probabilities are the same, then we can simplify equation 6 to equation 7.

_S = k·log(N)_   Equation 7

In this simplified form, the only thing we have to worry about is "_N_", which is the total number of microstates available to the system. Be careful to note that this is _not_ the total number of particles, but rather the total number of microstates that the particles could occupy, with the constraint that all such microstate collections would show the same macrostate.

_Entropy and Quantum Mechanics_

I have separated quantum mechanics from statistical mechanics to avoid confusion, and to avoid the implication that something important may have been overlooked. However, since the two both deal intimately with statistics & probabilities, it should come as no surprise that entropy is handled by the two disciplines in much the same way. The quantum mechanical definition of entropy is identical to that given for statistical mechanics in [18]equation 6. The only real difference is in how the probabilities are calculated. Quantum mechanics has its own, peculiar rules for doing that, but they are not relevant to the fundamental definition of entropy. As in the previous cases, the _P_i_ are microstate probabilities, and they must all be calculated for the same macrostate.

_Entropy and Information Theory_

The work done, primarily by [19]Boltzmann & [20]Gibbs, on the foundations of statistical mechanics is of a significance that can hardly be overestimated. In the hands of [21]Clausius and his contemporaries, entropy was an important, but strictly thermodynamic property. Outside of physics, it simply had no meaning. But the mathematical foundations of statistical mechanics are applicable to any statistical system, regardless of its status as a thermodynamic system. So it is by the road of statistical mechanics that we are able to talk about entropy in fields outside of thermodynamics, and even outside of physics _per se_. Perhaps the first major excursion of entropy into new domains came at the hands of [22]Claude Shannon, widely recognized as the father of modern communication & information theory (his classic 1948 paper [23]A Mathematical Theory of Communication is on the web).

_S = -k·Σ_i [P_i·log(P_i)]_   Equation 8

If this looks familiar, it's not an accident: it's quite the same as [24]equation 6 above, the definition of entropy in statistical mechanics. In _A Mathematical Theory of Communication_, appendix 2, Shannon proves his Theorem 2, that this Boltzmann entropy is the only function which satisfies the requirements for a function to measure the uncertainty in a message (where a "message" is a string of binary bits). In this case, the constant _k_ is recognized as only setting the units; it is arbitrary, and can be set equal to exactly 1 without any loss of generality (see the discussion in Shannon's paper, beginning with section 6, "Choice, uncertainty and entropy"). Here the probability _P_i_ is the probability for the value of a given symbol in the message (usually a binary bit, but not necessarily). In Shannon information theory, the entropy is a measure of the uncertainty over the true content of a message, but the task is complicated by the fact that successive bits in the string of a real message are not random, and therefore not mutually independent. Also note that "information" is not a subjective quantity here, but rather an objective quantity, measured in bits.
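The sketch below, in Python, evaluates equation 6 (equivalently Shannon's equation 8 with k = 1) for an assumed set of probabilities, and confirms that it collapses to equation 7 when all the probabilities are equal. The probability values are invented purely for illustration.

    import math

    def entropy(probs, k=1.0):
        """Equation 6 / 8: S = -k * sum_i P_i * log(P_i), skipping zero probabilities."""
        return -k * sum(p * math.log(p) for p in probs if p > 0)

    # An assumed, non-uniform distribution over four microstates (illustration only).
    p_nonuniform = [0.5, 0.25, 0.125, 0.125]
    print(entropy(p_nonuniform))          # about 1.2130 (natural log units, k = 1)

    # With N equally probable microstates, equation 6 reduces to equation 7: S = k*log(N).
    N = 4
    p_uniform = [1.0 / N] * N
    print(entropy(p_uniform))             # about 1.3863
    print(math.log(N))                    # the same value: log(4) is about 1.3863

With the logarithm taken in base 2 instead of the natural log, the same function returns Shannon's entropy in bits.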
_Generalized Entropy_

So far, we have looked at entropy in its most common and well known forms, and most ordinary applications will use one of these entropies. But there are other forms of entropy beyond what I have shown. For instance, Brazilian physicist [25]Constantino Tsallis has derived a generalized form for entropy which reduces to the Boltzmann-Gibbs entropy of our [26]equation 6 as a special case, but can also be used to describe the entropy of a system for which our equation 6 would not work (see "[27]Justifying the Tsallis Formalism"). Hungarian mathematician [28]Alfréd Rényi constructed the entropy appropriate for fractal geometries (see "[29]The world according to Rényi: thermodynamics of fractal systems"). There are others besides these, but the entropies of Tsallis & Rényi are the ones that seem to be under the most active current consideration. These generalized forms of entropy, of relatively recent origin, serve to show that "entropy" is not just an old friend that we know quite well, as in classical thermodynamics, but also a concept that is rich in new ideas & scientific directions.
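The text above does not quote the generalized formulas themselves, so the sketch below supplies the standard published forms as an assumption: the Tsallis entropy S_q = k·(1 - Σ_i P_i^q)/(q - 1) and the Rényi entropy S_α = k·log(Σ_i P_i^α)/(1 - α). Both reduce to the Boltzmann-Gibbs entropy of equation 6 as their parameter approaches 1, which this short Python check illustrates with an invented distribution.

    import math

    def gibbs_entropy(probs, k=1.0):
        """Equation 6: the Boltzmann-Gibbs-Shannon entropy."""
        return -k * sum(p * math.log(p) for p in probs if p > 0)

    def tsallis_entropy(probs, q, k=1.0):
        """Standard Tsallis form (not quoted in the text above): S_q = k*(1 - sum p^q)/(q - 1)."""
        return k * (1.0 - sum(p**q for p in probs)) / (q - 1.0)

    def renyi_entropy(probs, alpha, k=1.0):
        """Standard Rényi form (not quoted in the text above): S_a = k*log(sum p^a)/(1 - a)."""
        return k * math.log(sum(p**alpha for p in probs)) / (1.0 - alpha)

    p = [0.5, 0.25, 0.125, 0.125]          # an invented distribution, for illustration only
    print(gibbs_entropy(p))                # about 1.2130
    print(tsallis_entropy(p, q=1.001))     # approaches the same value as q -> 1
    print(renyi_entropy(p, alpha=1.001))   # approaches the same value as alpha -> 1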
_Is Entropy a Measure of "Disorder"?_

Let us dispense with at least one popular myth: "_Entropy is disorder_" is a common enough assertion, but commonality does not make it right. Entropy _is not_ "disorder", although the two can be related to one another. For a good lesson on the traps and pitfalls of trying to assert what entropy is, see _Insight into entropy_ by [30]Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Styer uses liquid crystals to illustrate examples of increased entropy accompanying increased "order", quite impossible in the _entropy is disorder_ worldview. And also keep in mind that "order" is a subjective term, and as such it is subject to the whims of interpretation. This too militates against the idea that entropy and "disorder" are always the same, a fact well illustrated by Canadian physicist Doug Craigen in his online essay "[31]Entropy, God and Evolution".

The easiest answer to the question, "_What is entropy?_", is to reiterate something I said in the introduction: _Entropy is what the equations define it to be_. You can interpret those equations to come up with a prose explanation, but remember that the prose & the equations have to match up, because the equations give a firm, mathematical definition for entropy that just won't go away. In classical thermodynamics, the entropy of a system is the ratio of heat content to temperature ([32]equation 1), and the change in entropy is set by the amount of energy input to the system which does not participate in the mechanical work done by the system ([33]equation 3). In statistical mechanics, the interpretation is perhaps more general, where the entropy becomes a function of statistical probability. In that case the entropy is a measure of the probability for a given macrostate, so that a high entropy indicates a high probability state, and a low entropy indicates a low probability state ([34]equation 6).

Entropy is also sometimes confused with _complexity_, the idea being that a more complex system must have a higher entropy. In fact, that is in all likelihood the opposite of reality. A system in a highly complex state is probably far from equilibrium and in a low entropy (improbable) state, whereas the equilibrium state would be simpler, less complex, and of higher entropy.

Move on to [35]the second law of thermodynamics
Go back to [36]The Collected Writings of Tim Thompson
Go all the way to [37]Tim Thompson's Home Page

_Page dated 19 February 2002_

References

 1. file://localhost/www/sat/files/tim_thompson/entropy1.html#defintro
 2. file://localhost/www/sat/files/tim_thompson/entropy1.html#thermo
 3. file://localhost/www/sat/files/tim_thompson/entropy1.html#chem
 4. file://localhost/www/sat/files/tim_thompson/entropy1.html#stat
 5. file://localhost/www/sat/files/tim_thompson/entropy1.html#quant
 6. file://localhost/www/sat/files/tim_thompson/entropy1.html#info
 7. file://localhost/www/sat/files/tim_thompson/entropy1.html#what
 8. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Carnot_Sadi.html
 9. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Clausius.html
10. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Clapeyron.html
11. http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Maxwell.html
12. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Thomson.html
13. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Boltzmann.html
14. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Maxwell.html
15. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Boltzmann.html
16. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Gibbs.html
17. file://localhost/www/sat/files/tim_thompson/entropy1.html#eq1
18. file://localhost/www/sat/files/tim_thompson/entropy1.html#eq6
19. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Boltzmann.html
20. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Gibbs.html
21. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Clausius.html
22. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Shannon.html
23. http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html
24. file://localhost/www/sat/files/tim_thompson/entropy1.html#eq6
25. http://www.santafe.edu/sfi/publications/Bulletins/bulletinFall00/features/tsallis.html
26. file://localhost/www/sat/files/tim_thompson/entropy1.html#eq6
27. http://xxx.lanl.gov/abs/cond-mat/0107441
28. http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Renyi.html
29. http://xxx.lanl.gov/abs/cond-mat/0108184
30. http://www.oberlin.edu/physics/dstyer/index.html
31. http://www.escape.ca/~acc/reading/evol.html
32. file://localhost/www/sat/files/tim_thompson/entropy1.html#eq1
33. file://localhost/www/sat/files/tim_thompson/entropy1.html#eq3
34. file://localhost/www/sat/files/tim_thompson/entropy1.html#eq6
35. file://localhost/www/sat/files/tim_thompson/entropy2.html
36. file://localhost/www/sat/files/tim_thompson/faqs.html
37. file://localhost/www/sat/files/tim_thompson/index.html