Entropy Is Not Disorder
    By Steve Donaldson | January 4th 2011

    The Mystery of Entropy (1)

        What is entropy and what does entropy have to do with order and disorder? We know what order is. The concepts of order and disorder have been part of our consciousness since long before the notion of entropy was ever invented.

    What Order Is

        Order is having everything in its proper place, behaving in its proper manner. Disorder is the opposite. Order is trains running on time, people getting to where they need to go, and shipments arriving on schedule. Order is troops reporting to their proper posts to perform their proper duties in accordance with their commander's orders. Doing otherwise causes disorder in the ranks. Order is a well-tuned machine with all its parts moving in perfect coordination with all other parts. A machine with parts not behaving as they should is a machine that is out of order.

        Order does not necessarily involve movement. Sometimes "doing the proper thing" means remaining in place, as when items are categorized and stored. Books in a library are in order when each is resting in its proper place, on the proper shelf. Likewise, cans of soup in the grocery store and files in a file cabinet are in order when each is resting in its proper place. In other words, order can be dynamic or static.

    Pop Quiz

        So what is entropy? Probably the most common answer you hear is that entropy is a kind of measure of disorder. This is misleading. Equating entropy with disorder creates unnecessary confusion in evaluating the entropy of different systems. Consider the following comparisons. Which has more entropy?
        - a stack of cards in perfect order or a stack of cards in random order?
        - a Swiss watch with intricate internal workings or a sundial?
        - ten jars of water stacked neatly in a pyramid or the equivalent mass of water in the form of 10 blocks of ice flying randomly through space?
        - a living, breathing human being or a dried up corpse turning to dust?
        - the universe at the moment of the Big Bang or the universe in its present state?
    If you think of entropy as disorder, then the answers to these questions may trouble you.

    Entropy According to Classical Thermodynamics (2)

        Let's take a look at where the idea of entropy actually came from. The concept of entropy originated around the mid 19th century, from the study of heat, temperature, work and energy, known as thermodynamics. This was the era of the steam locomotive. The study of how heat could be most efficiently converted to mechanical work was of prime interest. It was understood that there was a relationship between heat and temperature. Generally speaking, the more heat you applied to an object, the hotter it got. It was also understood that heat and work represented different forms of energy and that under the right circumstances, you could convert one into the other. Furthermore, it was observed that the only time heat would spontaneously flow out of one body was when it was in contact with another, colder, body. That is, heat always flowed from hot to cold. The challenge was to find the most efficient way to harness heat flowing out of a hot reservoir toward a cold reservoir and use it to do mechanical work.

        One of the difficulties was knowing how much heat energy was stored in the hot reservoir. What was the maximum heat that you could theoretically withdraw from the reservoir? You couldn't measure the heat content directly. What you could measure was the reservoir's temperature. If you knew the relationship between the temperature and the heat content for that reservoir, you could use the temperature to calculate the heat content. Furthermore, if you used a temperature scale that decreased to zero as the heat content decreased to zero, then the relationship between temperature and heat content could be represented as a simple ratio. This became the operational definition of a newly conceived property of systems, a property which came to be known as entropy. (The term was coined in 1865 by Rudolf Clausius, who thought of it as representing a kind of "internal work of transformation".) Simply stated, entropy is the relationship between the temperature of a body and its heat content (more precisely, its kinetic heat energy). Entropy, S, is the heat content, Q, divided by the body's temperature, T.
                    S = Q/T
    Stated another way, the heat, Q, stored in an object at temperature, T, is its entropy, S, multiplied by its temperature, T.
                    Q = T x S
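        As a toy numerical check (the heat content and temperature below are invented for the example, not measured values), the definition can be applied directly:

```python
# Toy illustration of the classical definition S = Q/T.
# The figures are invented for the example, not measured data.

def entropy_classical(heat_joules, temperature_kelvin):
    """Entropy as heat content divided by absolute temperature."""
    return heat_joules / temperature_kelvin

# A body holding 3000 J of heat at 300 K:
S = entropy_classical(3000.0, 300.0)
print(S, "J/K")  # 10.0 J/K
```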

        That is it. The definition of entropy, as originally conceived in classical thermodynamics, had nothing to do with order or disorder. It had everything to do with how much heat energy was stored or trapped in a body at a given temperature. Think of it this way. If you removed all the heat energy possible from an object by cooling it down as far as possible (down to absolute zero), and then kept track of the heat you had to put back into it to bring it back to a given state, that amount of heat supplied divided by the final temperature in kelvin would be the entropy of that object in that state. The entropy of a system is the heat capacity of the system averaged over its absolute temperature.

    The Significance of Entropy in Classical Thermodynamics

        The significance of entropy in the study of heat engines and chemical reactions is that, for a given temperature, a system can hold only a certain amount of heat energy - no more and no less - depending on the entropy of the system. If the entropy of the system changes, some energy will be released or absorbed in one form or another (like a sponge that suddenly changes how much liquid it can hold). For heat engines that meant that if you wanted to convert heat into mechanical work, you needed to make sure that more heat flowed out of the hot reservoir than could "fit" into the cold reservoir. You did this by not letting the cold reservoir heat up as heat flowed in and by not letting the hot reservoir cool down as heat flowed out. As long as you maintained a temperature difference, more heat would flow out of the hot body than could be absorbed by, "fit into", the cold body. The surplus heat flow could be used to do mechanical work.

        In chemistry, entropy meant that calculating the change in chemical energy, the energy represented by the making and breaking of chemical bonds, was not enough to predict how much useful energy would be released during a reaction. The amount of energy "freed" by a reaction was the energy generated by the chemical reaction minus any additional energy trapped by changes in the system's entropy. The additional energy trapped was just the change in entropy, delta S, times the temperature of the system, T. In 1876, J. Willard Gibbs named this useful released energy "free energy" and provided the formula to calculate it. The free energy, delta G, was the change in chemical energy, delta H, minus the trapped, thermal energy, T times delta S.
                delta G = delta H - (T x delta S)
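        A minimal sketch of this bookkeeping in code (the reaction values are invented for illustration, not real thermochemical data) shows how an entropy increase frees additional energy:

```python
# Sketch of Gibbs' relation: delta G = delta H - T * delta S.
# delta_H, T and delta_S below are invented values for illustration.

def gibbs_free_energy(delta_H, temperature, delta_S):
    """Useful energy released: chemical energy change minus the
    thermal energy trapped (or released) by the entropy change."""
    return delta_H - temperature * delta_S

# An exothermic reaction (delta_H = -100 kJ) at 298 K whose entropy
# increases by 0.25 kJ/K frees more energy than delta_H alone suggests:
dG = gibbs_free_energy(-100.0, 298.0, 0.25)
print(dG)  # -174.5 (kJ)
```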

    Entropy According to Statistical Thermodynamics

        So where then did the association between entropy and disorder come from? With time, more was learned about the role of molecules in determining the classical thermodynamic variables such as pressure, temperature, and heat. Pressure, it turned out, was just the total force exerted by individual molecules, colliding with one another and with the walls of the container, averaged over the surface area of the container. Temperature was determined to be the average kinetic energy of all the different ways the molecules could move, tumble or vibrate. This more detailed, molecular, perspective of thermodynamics and the mathematics associated with it became known as statistical thermodynamics.

        The person most responsible for working out the mathematical relationship between entropy and molecular movement was Ludwig Boltzmann. From the molecular description of heat content and temperature, Boltzmann showed that entropy must represent the total number of different ways the molecules could move, tumble or vibrate. The idea was that heat was just kinetic energy on a scale that could not be observed directly but that manifested itself in the aggregate as the thermodynamic properties that could be observed. Heat flowed from a hot body to a cold body as kinetic energy was transferred through molecular collisions occurring at the boundary between the two bodies and further distributed throughout the body as molecules collided with each other within the body. At each collision, kinetic energy was exchanged. On average, molecules with more kinetic energy lost kinetic energy as they collided and molecules with less kinetic energy gained kinetic energy as they collided, until, on average, the kinetic energy was evenly distributed among all the molecules and their various modes of movement.

        The net result was that the more ways a system could move internally, the more molecular kinetic energy the system could hold for a given temperature. This was because temperature was just the average kinetic energy per mode of movement.  You could think of these modes of movements as "pockets" that can hold kinetic energy. (You could also think of them in more technical terms as molecular oscillators or modes of thermal oscillation.) If each pocket, on average, could hold the same amount of kinetic energy, then the more pockets a system had, the more total kinetic energy the system contained. The greater the number of kinetic energy pockets a system had, the greater its entropy. So, on the molecular level, entropy was just a measure of the total number of molecular kinetic energy pockets contained in the system.
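        The "pockets" picture is the equipartition theorem in disguise: at equilibrium, each quadratic mode of movement holds, on average, kB x T/2 of energy. A minimal sketch (the molecule and mode counts are chosen only for illustration):

```python
# Equipartition sketch: every mode ("pocket") holds k_B*T/2 on average,
# so at a fixed temperature, more modes means more stored kinetic energy.

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def thermal_energy(n_molecules, modes_per_molecule, temperature):
    """Total energy when each mode holds k_B*T/2 on average."""
    return n_molecules * modes_per_molecule * 0.5 * K_B * temperature

T = 300.0
mono = thermal_energy(1e23, 3, T)  # monatomic gas: 3 translational modes
diat = thermal_energy(1e23, 5, T)  # diatomic: + 2 rotational modes

# Same temperature, but the system with more "pockets" holds more energy:
print(diat / mono)  # ratio is 5/3
```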

    Entropy As Disorder

        It was Boltzmann who advocated the idea that entropy was related to disorder. In Boltzmann's mind, the more ways a system could move internally, the more disorderly the system was. A system in "perfect order" was one in which all the molecules were locked in perfect array without any freedom of movement whatsoever. A dynamic system in perfect equilibrium represented, according to statistical thermodynamics, a system in "perfect disorder". The idea of entropy as a measure of disorder was embraced and perpetuated by his colleagues in the field of statistical thermodynamics.

    Problems With Entropy As Disorder

        But is disorder really the best word to use to define entropy? I don't think so. There are several problems with using disorder to define entropy. The first problem has to do with systems having multiple levels of organization. A system might be more or less "orderly" on one level and not at all on another. Take the example of the ice cubes flying around in space. On the level of the ice cubes, the system is disorderly, but on the molecular level, the ice molecules are locked in place, neatly in order.

        There are two ways to deal with this ambiguity. One is to limit the application of the term to only one clearly specified level at a time. In doing so, we need to be careful as to what significance we attribute to entropy at the higher levels. These "higher entropies" cannot be taken as the total entropy of the system.

        The other solution would be to reduce the whole system to its most fundamental level. The problem with this approach is knowing what the most fundamental level of organization is. At the time of Boltzmann and Clausius, molecules and atoms were considered to be the most fundamental level of organization. Now of course we know atoms have their own internal structure, and even protons and neutrons have internal structure. So it gets very complicated to apply the statistical definition of entropy to any level of organization other than the original molecular level for which it was intended.

        The second problem with disorder as a definition for entropy, in my mind, even on the molecular level, is that disorder implies things are not where they should be. This is not the case. Movement on the molecular level is still governed by Newtonian mechanics. If this were not the case, the equations correlating molecular movement with the observable variables of classical thermodynamics, such as temperature and pressure, could not have been derived as they were. The molecules are, in fact, exactly where they should be. Where else could they be? They are not free to make any random turn or jump between collisions. The rules are clear - continue straight between collisions and then strictly obey the laws of conservation of energy and conservation of momentum during the collisions.

        Even if we limit ourselves to observable order, a system with high entropy can also have a high degree of order. Order depends not on how much movement there is in a system or the complexity of that movement, but on what significance the system's movement, or non-movement, has in the eye of the observer. If we could observe the individual sequence of moves of each molecule in a system and if a particular sequence had particular significance, for instance because it led to a kind of replication or evolution, then we might perceive that combination of moves as having more order than some other combination.

        Entropy should not and does not depend on our perception of order in the system. The amount of heat a system holds for a given temperature does not change depending on our perception of order. Entropy, like pressure and temperature, is an observer-independent thermodynamic property of the system that does not depend on our observation.

    Entropy As Diversity

        A better word that captures the essence of entropy on the molecular level is diversity. Entropy represents the diversity of internal movement of a system. The greater the diversity of movement on the molecular level, the greater the entropy of the system. Order, on the other hand, may be simple or complex. A living system is complex. A living system has a high degree of order AND a high degree of entropy. A raccoon has more entropy than a rock. A living, breathing human being, more than a dried up corpse.

    Answers to Pop Quiz

        With this clearer understanding of entropy, let's take a look at those troubling entropy questions posed earlier. Those stacks of cards? They both have the same entropy. On the molecular level, the molecules are not behaving any differently in one stack than in the other. Even on the card level, there is no difference. None of the cards are moving. There is no kinetic energy present on the card level in either stack. There is no difference between the stacks except our subjective sense of order.

        As for the watch and the sundial, it depends. If they are both made of similar metals and they are at the same temperature and pressure, then on a molecular level they would have about the same entropy. The molecules in the watch would have about the same diversity of movement in the solid metal parts as the molecules in the metal of the sundial. Ounce for ounce, the heat content would be about the same for both.

        On the higher system level, you could say the watch has more entropy than the sundial because it has a greater diversity of internal movement. The watch has more internal kinetic energy than the sundial. What significance you could give this "higher level" entropy is not clear to me.

        The water in the stacked jars has more entropy than the flying ice cubes because liquid water molecules have more modes of movement than ice molecules. Again, the heat trapped in the liquid water per degree is greater than the heat trapped in the ice per degree. Certainly, the ice cubes have more kinetic energy observable on the macro scale and so could be assigned a kind of macro entropy, but what would that mean really? You could also calculate a kind of macro temperature along the same lines, as the average kinetic energy of the flying ice cubes, but why bother?

    The Big Picture

        So that brings us to the universe as a whole. This is very problematic. At the time of the Big Bang, there were no molecules. Is it really appropriate to talk about entropy, temperature and heat at this level? Does undifferentiated plasma have kinetic energy? What about the universe today? What is the temperature of the universe? What is the temperature of any system that is not homogeneous and not at thermal equilibrium? These are not trivial questions. The temperature and entropy of a system are only well defined for systems that are homogeneous and in thermal equilibrium. The easier way to answer the entropy of the universe question is to accept the 2nd law of thermodynamics and extrapolate backwards. The 2nd law says entropy is always increasing in the universe, so the entropy of the universe at the time of the Big Bang must have been much less than the entropy of the universe now.

        This does not mean there was more structure or order back then. It does mean there was less diversity and less space to move around. The evolution of the universe has been characterized by an on-going transformation from a simple, restricted, highly condensed, homogeneous state to an increasingly complex, widely dispersed, dynamic, multipotent, granular diversity. In other words, the universe is not winding down, like a giant soulless machine slowly running out of steam. On the contrary, she is just waking up.

    Update: 11/16/2012

    1. I would like to thank all the readers who have responded positively to this article. I am pleased if I have succeeded in bringing you a little clearer understanding on the subject of entropy. I am also pleased to have found that I am not the only one trying to dispel the notion that entropy is disorder. Since first posting this article in January of 2011, I have discovered a collection of articles online by someone who has been arguing this very point far longer and with greater expertise than I. The author's name is Frank L. Lambert and he is a Professor Emeritus of Chemistry at Occidental College. You can find his articles on his web site. For those of you seeking further explanation as to why shuffled decks and flying ice cubes do not represent higher entropy, I especially recommend Professor Lambert's Shuffled Cards, Messy Desks and Entropy is not "Disorder".
    2. In my description of thermodynamic entropy I state that entropy is "the heat content, Q, divided by the body's temperature, T." This is not quite accurate. Entropy is certainly closely related to the total Q divided by the final T (see Frank Lambert and Harvey Leff, Correlation of Standard Entropy with Enthalpy). However, the latter quantity, more accurately called the temperature averaged heat capacity, is not mathematically the same as entropy. Thermodynamic entropy is mathematically defined according to the Clausius relationship:

                    S = integral from 0 to T of [ C(T)/T ] dT
      The temperature averaged heat capacity, on the other hand, is mathematically defined as:

                    (1/T) x integral from 0 to T of C(T) dT
      where C(T) is the heat capacity as a function of temperature, T. The difference between the two expressions is in the placement of 1/T inside the integral as opposed to outside the integral. For constant temperature processes, 1/T is constant and can be moved outside the integral without affecting the computation. Thus for constant temperature processes such as phase changes, the melting of ice for example, the change in entropy is indeed the same as the change in the average heat capacity. However, the standard entropies, S0, one finds tabulated in physics and chemistry textbooks are not temperature averaged heat capacities, since they are calculated by integrating starting at 0 K up to the specified temperature. As Frank Lambert and Harvey Leff show, there is a strong correlation between the two quantities, but they are not the same thing. Unfortunately, thermodynamic entropy as defined by the above integral does not represent a simple, easily identifiable macro property of a substance. This is one reason the concept of entropy is so hard to teach. The closely related property of averaged heat capacity is much more intuitive and can be used as a stand-in for entropy in making the argument that entropy is not disorder, without invalidating the logic of the argument. Nevertheless, in hindsight, perhaps it would be better for me to borrow Professor Lambert's language and refer to thermodynamic entropy as "an index" of the averaged heat capacity rather than conflate the two concepts as one and the same.
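      The difference between the two expressions can be checked numerically. As a sketch, assume a hypothetical Debye-like heat capacity C(T) = a*T^3 (typical of solids at low temperature, chosen so both integrals have known closed forms): by hand, the Clausius integral gives a*T^3/3 while the temperature averaged heat capacity gives a*T^3/4, so the two differ by a factor of 4/3.

```python
# Compare the Clausius entropy integral with the temperature averaged
# heat capacity for a hypothetical heat capacity C(T) = a*T**3
# (an assumption for the sketch; both integrals then have closed forms).

def clausius_entropy(a, T_final, steps=100_000):
    """S = integral 0..T of C(T')/T' dT', midpoint rule, C = a*T'**3."""
    dT = T_final / steps
    return sum(a * ((i + 0.5) * dT) ** 2 for i in range(steps)) * dT

def avg_heat_capacity(a, T_final, steps=100_000):
    """(1/T) * integral 0..T of C(T') dT', midpoint rule, C = a*T'**3."""
    dT = T_final / steps
    total = sum(a * ((i + 0.5) * dT) ** 3 for i in range(steps)) * dT
    return total / T_final

a, T = 2.0e-4, 50.0
S = clausius_entropy(a, T)      # analytic value: a*T**3/3
avg = avg_heat_capacity(a, T)   # analytic value: a*T**3/4
print(round(S / avg, 4))  # 1.3333: the two quantities are not the same
```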


    S = -k_B SUM_i [P_i ln(P_i)]
    ... the most general interpretation of entropy is as a measure of our uncertainty about a system
    Sascha, you are absolutely correct with that formula. The Pi would be the probability of any of the "moving parts" being in a particular "place". So the formula, roughly speaking, is a measure of how many "moving parts" a system has and how many "ways each part can move". This is what I was describing as "diversity of movement".

    This general formula, in fact, addresses one of the two objections I raised above. It includes all levels of organization of a system, by reducing everything to the most fundamental level. We are no longer just talking about how molecules move, but all the possible states a system can have, including different states within the molecule and within the atoms that make up the molecule. Determining how many different states are possible below the level of molecules and atoms is problematic. This is not something that can be measured with a calorimeter. (Likewise, large movements, including planetary movement and galactic movement and upward, would have to be included as possible states.)
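    The formula can be evaluated directly for small toy distributions (the probabilities below are invented for illustration). Spreading probability over more equally accessible states, that is, more "diversity of movement", raises the entropy:

```python
import math

# Evaluate S = -k_B * SUM_i [P_i ln(P_i)] for toy distributions.
# The probabilities are invented for illustration only.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Gibbs/Boltzmann entropy, skipping zero-probability states."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

two_states  = gibbs_entropy([0.5, 0.5])      # equals k_B * ln 2
four_states = gibbs_entropy([0.25] * 4)      # equals k_B * ln 4
skewed      = gibbs_entropy([0.97, 0.01, 0.01, 0.01])

# More equally accessible states means higher entropy:
print(four_states > two_states)  # True
print(four_states > skewed)      # True
```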

    I like the fact that you did not use the words, "order" or "disorder", in your definition. I do believe there is movement in that direction in the scientific community. I think it will be harder to dislodge this "entropy is disorder" meme from the minds of the general public.

    However, I am also concerned by the use of the term, "uncertainty". Entropy, like temperature and pressure, is supposed to represent a property of the system, not the observer. If we say entropy is a measure of OUR uncertainty, then we are saying we are measuring something about us, not the system. (What would that say about the entropy of the universe before man arrived or after man is extinct?)

    We could try refining this definition from "OUR uncertainty about a system" to "THE uncertainty about the system". That would remove the observer problem, but brings us to my second objection I raised above, which was that entropy on the molecular level is not about uncertainty (nor order) at all. It is not about how molecules are "free" to move. It is not about freedom in the sense most people understand freedom. It is not about how molecules COULD move. It is about how molecules DO move. In other words, it is not about PROBABILITY. It is about FREQUENCY. (That could be a topic for another time - Does probability exist in nature or is it all just frequency?)
    Citizen Philosopher / Science Tutor
    I have read your position now and give my immediate response.
    Thank you for making this subject entirely clear and accessible.

    The specific heat which is the basis for the definition of entropy is the key for me. On that basis I perceive a bias in Boltzmann's interpretation toward a cultural or even religious notion of Order. I believe that order, Chaos and randomness are ill-defined conceptions and not relevant to the issue of available energy or even the state of matter.

    On reflection I think that the dynamic specific heat would be best served by being defined by Schroedinger's wave equation, or Dirac's Twistor Equation, or on a more mundane level the good old idea of degrees of freedom!
    The uncountable nature of the concepts of order, chaos, and randomness makes them unsuitable for a measure. The notion of complex is also uncountable, but is a more suitable notion of what is being observed.

    I would expunge the notion of order from the notion of specific heat, concentrate on the kinetic motion description and characterise state by particulate relative position or even more generally by regional relative position. This makes the Dirac and Schroedinger formulations so relevant.

    Concentrating on degrees of freedom we may utilise orthogonal axes plus rotational axes with rotational direction. These can form a basis for a probability distribution dependent on how many degrees of freedom we wish to model. This implies the possibility of using a Lagrangian and/or a Laplacian form. In any case we need a measure of possibility or potential, not disorder.

    The second law now becomes: the complexity in the universe tends to a maximum.

    We could also state: the degrees of freedom in the universe tends to a maximum.

    None of which implies disorder, and one implies a complexification of order! Both, however, imply that all useful energy is being soaked up out of a system.

    I don't agree with your comment.
    Statistical physics has taught us that how molecules DO move is a statistical consequence of the number of ways molecules COULD move. Everything related to entropy is related to the number of possibilities. The more possibilities a system has, the greater the entropy. If we state that 'more possibilities' equals 'more uncertainty', then entropy can very well be a measure for the uncertainty of a system.

    Thank you, Mathieu, for your comment. I will say I am more sympathetic to the use of the word "possibility" in referring to entropy, if by possibilities, we mean the number of ways the "parts can move" (states a system can have). The possibilities are determined by the design or structure of the system. I am less sympathetic to the use of the word "uncertainty". The two are not the same.

    Consider two Swiss watches, each keeping perfectly good time, but not necessarily having the same internal designs. Let's say we open watch A and inspect its design. From the design we can tell how each part can move, even if it is not running at the moment. We can see that this part can move this way or that way, and some other part can move some other way. There is no uncertainty here. Now, if we let it run and watch the movements, we can see exactly what state the watch is in at each moment. Again, no uncertainty. More kinetic energy, yes, but uncertainty, no.

    Now consider the second watch. We are very uncertain about its internal workings. We do not know where the parts are or how they move. Watch B may or may not have the same internal design as watch A. If they have the same design, they should have the same entropy, regardless of our uncertainty. If they are designed differently, if they have a different internal structure, then they may well have different entropy. This does not depend on our certainty or uncertainty of the internal states of each. Internally, the workings of the two watches are not uncertain.

    From a philosophical perspective, it is also interesting to consider how the notions of possibility and uncertainty might relate to the universe as a whole. Equating increasing entropy with increasing uncertainty would suggest that as the universe moves toward perfect equilibrium, the state of the universe will be more and more uncertain. That sounds like a contradiction. Are we increasingly certain or uncertain about the final state of the universe? Equating increasing entropy with increasing possibilities paints a different picture. It suggests that the evolution of the universe is one in which the possibilities are forever increasing.
    Citizen Philosopher / Science Tutor
    I agree with Feij. Boltzmann defined the distinction in terms of possible states of a system. This article, as presented, contains a number of misunderstandings of physics. I stopped counting after 4.

    Misuses of “entropy” stem from applying it to static and artificially isolated objects. For example, a clean room may appear to be of low entropy; however, the physical processes and concomitant heat losses involved in achieving and maintaining a clean room far exceed the net entropy gain in leaving the room to decay into a more relaxed state of disorder.

    The confusion is a bit like looking at an electric car in isolation and saying it has zero emissions. One needs to take a wider look and consider the thousand-and-one energy conversions involved to make the car, maintain it, dispose of it and also to provide for the electricity and infrastructure that it uses. Each conversion is far less than 100% efficient. The original studies on thermodynamics and heat engines were motivated in trying to understand why even theoretical maximum efficiencies are so low.

    There is a huge increase in entropy in creating and maintaining the gleaming product that is in a show room (be it a car or a designer drug). All by itself, it is a highly improbable assemblage of electro-mechanical and chemical wizardry. If you include the full and wide picture as to its origins, the shiny package on the turntable is the most likely thing that could have occurred; it was inevitable, considering what went into its carefully planned formation.

    The thing to watch is not absolute entropy, but instead Entropy Gradients. And not just changes with time, but changes in entropy across spatial dimensions, and in particular gradients perpendicular to horizons, equipotential surfaces and “skins”. This is where we can begin to see how entropy gradients are the origin of force.

    A better understanding of how the number of microstates (complexions) can be constrained or enhanced should be at the beginning of one’s education in physics. Much of the rest of physics could be derived from these modest beginnings. It would give people a finer appreciation of the limits to efficiency and information processing … and perhaps … remove unrealistic expectations.

    .. just a thought ..
    Greetings, Brother Blue-Green. Good to hear from you, as always.

    You cover a lot of ground, but let me highlight a few points that stand out for me. Yes, I agree, the misuse of the term entropy pops up everywhere. Let's just blame it all on entropy. Entropy is spreading entropy everywhere and it is causing a lot of confusion. :-)

    You also point out the distinction between absolute entropy and entropy change. What I wrote about was primarily absolute entropy. I intentionally avoided tackling the "arrow of time" implications of entropy change at this time, so as to lay down a solid foundation of what entropy is, before talking about what entropy does. I intend to tackle that can of worms (now there is an entropy metaphor for you) in a later blog/article. (They say the only way to get the worms back in the can, after you open it, is to use a bigger can.)

    As a preview, I'll just say you are on the right track to talk about gradients. The 2nd law can be stated in many different, supposedly equivalent, ways. The form I most agree with is that spontaneous flow is from high concentration to low concentration. That applies to energy as well as matter. Where things get interesting is when there are competing gradients in opposite directions.

    I would be interested in hearing more about how entropy gradients are the origins of force and what force or forces create the entropy gradients that create the forces. Sorry if my cosmic ignorance horizon is showing. It is just that the more I learn the more ignorant I become. It is not a zero sum game. Order and entropy both increase together, and so do knowledge and ignorance.
    Citizen Philosopher / Science Tutor
    I think there is a wider use for the term entropy, now used beyond its thermodynamic origins, not necessarily relating to order/disorder, but to change and changeability

    In politics, for example, it is the 'entropy' in the system which prevents change, often whilst spurious arguments are generated to allow market shares to be manipulated prior to regulatory changes.

    It is this 'entropy' which is possibly leading to the association with disorder, as part of the fear-mongering and spurious argument generated to allow 'backroom dealings' to be made, for profit.

    How often is 'That will lead to Anarchy' used, for example?

    A certain Climate Change problem springs to mind

    What is masqueraded as 'Stable' is far from it, due to a falsely generated 'entropy within the system'

    Yep, Aitch, it does seem like the word entropy has crept into just about every field of study (and non-field of study), with its meaning morphing along the way. I think it has something to do with the appeal of scientificism (not sure that is a word), by which I mean sounding more scientific than you really are. I am sure that appeals to many politicians and conspiracy theorists alike. Actually, I think it is a testament to the respect people have for science as an authoritative source of knowledge.

    In the politics example you cite, I think the better word for "resistance to change" would be inertia, another term borrowed from science. Politicians and others scaring folks by saying things will fall apart and lead to anarchy if we don't maintain strict control would fit with the notion that entropy leads to disorder. Interestingly, the opposite argument is also prevalent when it comes to the economy. What you hear there, often, is that if we only left the economy alone, that is, left it to free market forces, then everything would work out harmoniously. Ironically, this is actually closer to what entropy means in physics. In physics, maximum entropy corresponds to perfect equilibrium - not what we normally think of as disorder.

    So will the world end in harmony or chaos? Both are associated with high entropy. Still a hot topic leading to heated debate in cosmology, so to speak :-)
    Citizen Philosopher / Science Tutor
    In the politics example you cite, I think the better word for "resistance to change" would be inertia, another term borrowed from science
    Yes, I concur, as I know Patrick often refers to inertia in politics, but entropy must be a new 'buzzword', methinks....perhaps it goes with the spread of confusion?

    As Henry says, it’s IGNORANCE that rules, even makes the rules.

    It may be soothing to think of increasing entropy in terms of an expansion of the arena of possibilities. Alas, with increased freedoms comes uncertainty. One man’s certainty can be for another IGNORANCE if he is over the horizon, out of the loop or locked out of closed door sessions.

    The collective weight of increasing ignorance has been weighing ever harder on us with every turn of a calendar’s page. What good is the information age if so much critical information is hidden, uncertain or beyond our reckoning?

    The situation is so dire that the bulk of the total energy of the universe is now deemed to be dark and hidden to us. The mysterious “energy” is a mirage generated by information being hidden from us behind cosmic horizons, be they around black holes or at the limits of the observable universe. Dark energy can be derived from entropy gradients … It is the events that we cannot know with certainty that drive the large-scale structure of space-time … so say a dozen papers following Verlinde’s paper a year ago deriving F=ma and much more from entropy gradients. It has been proposed that even the probabilistic foundations of Quantum Mechanics can be based on fundamental ignorance.

    Ignorance rules. We always knew this. Funny that.
    The spontaneous flow from high concentrations to low concentrations is familiar enough, however, it does not help one to understand how gravitational clumping (from low densities to high) can also be attributed to entropy gradients. Seems to be contradictory at first glance.

    Verlinde explains it well enough in his Jan 6, 2010 paper.

    There has been quite a bit of follow up activity, mostly supportive and broadening the field, yet not as clear as Verlinde’s original paper.

    I think it has legs // fertility.
    Thanks for the Verlinde link, brother blue. I have downloaded it and am working my way through it. I have also tracked down your E is for Elephant and I am working my way through that as well. You are a poet as well as a citizen scientist, it seems.
    Citizen Philosopher / Science Tutor
    Fred Phillips
    Steve, the topics that are guaranteed to generate impassioned comments are religion, evolution, sex, politics, and entropy. (Not necessarily in that order.) You were brave to pick one of them for your debut column!

    Entropy in the form of Sascha's equation above also has an unambiguous meaning in statistics, in which no confusing physical or psychological interpretations are necessary. I'm glad of it! It got me through my dissertation defense back in '78 - I refused to take the bait when my degree committee wanted to take the discussion down wacky sidetracks.

    Entropy also has a clear meaning in Claude Shannon's communication theory, which underlies the Internet technology we're using here at Science 2.0.
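Shannon's measure is concrete enough to compute directly: for symbol probabilities p_i, the entropy is H = -sum p_i * log2(p_i), in bits per symbol. A minimal sketch (the function name and sample strings are just illustrative):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Shannon entropy in bits per symbol: H = -sum p_i * log2(p_i),
    with p_i estimated from symbol frequencies in the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("abab"))      # -> 1.0: two equally likely symbols
print(shannon_entropy("abcd"))      # -> 2.0: four equally likely symbols
print(shannon_entropy("aabbbbbb"))  # between 0 and 1: a biased source
```

A message of one repeated symbol has zero entropy (no uncertainty, no information), which parallels the single-microstate case in the thermodynamic discussion above.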

    Anyway, welcome, Steve - we'll look forward to more of your thoughts on these pages.
    Fred, of course you are right. There is no ambiguity in the mathematics. That is why I told Sascha he was absolutely right with Boltzmann's mathematical formulation. The difficulty, as always, is in the interpretation of the mathematics, that is, what does it mean - what intrinsic property of the system does it measure? On the molecular level it is clear what it means. On any other level, it is not so clear what it means from an observer-independent point of view. Whatever it does mean intrinsically, I argue that it does not mean "disorder".

    I also agree with your implication that science cannot let itself get sidetracked/delayed by philosophical questions. Science has a full plate and cannot be put on hold until everything is philosophically crystal clear. Being mathematically clear is usually sufficient to continue with the work. In due time I am sure progress will be made on the philosophical questions as well. Of course, science and philosophy inform each other and it is good that they should both strive to make progress concurrently.

    I appreciate your encouragement. I hope to be sharing more thoughts soon. Stay tuned. :-)
    Citizen Philosopher / Science Tutor
    Steve, sorry for not answering you earlier, but entropy usually racks the brain. I think you are right with your assessment, and Boltzmann, in light of current views, cannot have been right with his linkage of entropy with disorder. Thanks, by the way, for showing the historical origins of this association. I would like to add that it was Sadi Carnot, a French engineer, who initiated thinking about entropy. Academia seems to have a tendency to honour only academics and forget the contributions of engineers and other craftsmen - it's a weak spot of mine, so forgive me for wanting to honour those to whom honour is due ;). Back to entropy and disorder. In research on complexity you currently see two mainstreams working on this topic. One follows the more mathematical interpretations of chaos theory, non-linearity, etc. (e.g. the Santa Fe school), and you'll find that much of this research considers complexity to be a mix of 'order and disorder', which often translates to convergence and randomness. In other words, disorder is often randomness. Systems thinker Gerald Weinberg has been very helpful for me in demonstrating that randomness is something different than disorder. Randomness, in a way, is very orderly in its probabilistic distribution of the various options (i.e. the range of diversity, in your words), and so there is a certain predictability there. Weinberg therefore makes a distinction between organised simplicity, for which analytical approaches work well, and, at the other extreme, unorganised complexity (randomness), which yields to statistical approaches. The plane in between these is that of 'organised complexity', for which we are only now trying to find some tools. Sociologist Edgar Morin therefore considers the complex-systems researchers mentioned earlier to be working with 'restricted complexity', by which he means that they are only capturing a subset of complex systems.
If you make a distinction between ontological and epistemological complexity, then the latter occurs when an observer experiences 'uncertainty' when looking into a subject. Uncertainty means that the observer does not know certain aspects of a theme/subject, and also has no means of telling whether this uncertainty will turn out to be something very simple/straightforward once knowledge catches up, whether it equates to randomness, or whether there is something very ambiguous at work. From this point of view, disorder basically implies an inability of an observer to make an adequate model of that uncertainty. This is something different than randomness, and therefore (epistemological) disorder is something different than the randomness encountered in gases and other manifestations of unorganised complexity. We only need to look at the way 'disorder' is used in psychology and psychiatry to get an idea of what epistemological disorder means: an inability to predict the behaviour/motivation/actions of the person who has the disorder. This behaviour is usually NOT random, but inspired by certain internal models the patient has that do not match other people's expectations (hearing voices, being someone you are not, etc.). Having said that, I do think that there is a correlation between ontologically complex systems and entropy, in the sense that it becomes harder to maintain a complex system in a high-entropy environment. This means that the higher the entropy, the lower the complexity will be, and therefore the less chance of epistemological disorder (but more randomness). I think that beyond a certain point a LOW entropy will also decrease the complexity of a system (rigidness) and decrease the disorder of a complex system.
That would explain why in the early stages after the Big Bang relatively simple forms were made, which increased the (potential) complexity of newer forms, until an optimum level of entropy was achieved that allows the most complex forms to be formed. This would also be the place where the most disorderly phenomena are encountered, as the uncertainty is also maximal here.
    Kees, you raise many interesting points and I appreciate your comments. You give us a great deal to think about.

    One quick note on Sadi Carnot, you are right to point out that it all started with him and he should be recognized for his important role. Rudolf Clausius was very indebted to him and historians speculate that the reason Clausius chose the letter S to stand for entropy was to honor Sadi.
    Citizen Philosopher / Science Tutor
    Thank you for showing me the difference between entropy and order/disorder. Many things begin to fit together more easily now.

    A question. If I understand the above discussion, it would seem that the entropy of our universe was very low just prior to the big bang. True?

    Yes, Frank, that is the general consensus. The 2nd law of thermodynamics says (in one of its forms) that the entropy of the universe is always increasing. That would mean that the entropy at the moment of the big bang would have been extremely low. (To be clear, there is no generally accepted time before the big bang. One says time began with the big bang event. There is one theory that interests me that does extend time before the big bang, called Loop Quantum Cosmology, built in part on Lee Smolin's Loop Quantum Gravity.) Understandably, this low entropy at the beginning of time raises questions concerning how something we call a big bang could be considered "orderly" - not only "orderly", but the most "ordered" the universe will ever be?

    Jerry Decker raised this very question here at Science 2.0 as one of his Science Questions for the New Year:
    Ninth I would like to see an explanation of how the big bang creation
    event could produce an orderly universe with low entropy. How does the
    big bang differ physically from other explosions where disorder is

    I think the path to understanding is to realize that at the moment of the big bang, not only was there tremendous energy (all the energy of the universe, actually), but all that energy was confined to an extremely small volume, in fact infinitely small according to the mathematics. In any case, it is the constraint of space that prevents the big bang from being the kind of explosion we usually picture from our everyday experience. Similarly, it is impossible to have a riot inside a typical Tokyo subway car during rush hour. When everyone is squeezed up against everyone else like sardines, there just is not enough room to have a riot, no matter how agitated you make the crowd. (Forgive this silly example, but it is what came to mind.)

    As space expanded and continues to expand, there is more room to move and this is one explanation of why entropy is increasing. This increased space also allows for more possibilities, including the evolution of more complex structures. So as I explained above, I think it is more helpful to think of entropy not in terms of order or disorder but rather in terms of increasing diversity. The time of the big bang was a time of great restriction but with time, we get more room and more diversity.
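The "not enough room" argument can be made quantitative with a toy lattice model (my own illustration, with arbitrary cell and particle counts): put N indistinguishable particles into V distinct cells, at most one per cell, so the number of microstates is W = C(V, N) and the entropy (in units of Boltzmann's constant) is S = ln W. When V = N - the packed subway car - there is exactly one arrangement and S = 0; as space (V) grows, so does the entropy:

```python
from math import comb, log

def configurational_entropy(cells, particles):
    """S = ln(W) with W = C(cells, particles): the number of ways to put
    indistinguishable particles into distinct cells, at most one per cell."""
    return log(comb(cells, particles))

n = 10  # a fixed amount of "stuff"
for cells in (10, 20, 100, 1000):
    print(cells, configurational_entropy(cells, n))
# cells == particles gives W = 1 and S = 0: the packed subway car,
# where no rearrangement (and no riot) is possible
```

The energy is unchanged throughout; only the available room grows, which is exactly the point about expansion opening up more possibilities.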

    One last item to mention is that I think one should not think of the big bang as being over. We are still in the big bang. The event is not over. The universe is still expanding. It is just that all that energy is now greatly dispersed. "The dispersal of energy" is another way that entropy and the 2nd law are often described. I find this description also very acceptable. The energy in the universe continues to become more and more dispersed.
    Citizen Philosopher / Science Tutor

    Not quite. Prior to the Big Bang, there is no adequate theory. From the moment just after the Big Bang (from what I understand) entropy was extremely low, and it has been increasing ever since.  I thought that Brian Greene discussed this really well in:
    The Fabric of the Cosmos: Space, Time, and the Texture of Reality

    For me the eye-opener was his statement that we can only delay the increase of entropy, but not decrease it. So if there are pockets of relatively low entropy in our universe, it is only because the increase was delayed with respect to the average level of entropy (at a certain frame of reference, say our current situation in this part of the Universe).
    If leaps of insight are to be achieved, “we” must first get over the old barriers ….

    Consider the following innocuous seeming statement from Steve (my evil twin):

    “at the moment of the big bang, not only was there tremendous energy (all the energy of the universe actually), but all that energy was confined to an extremely small volume, in fact infinitely small according to the mathematics … As space expanded and continues to expand, there is more room to move."

    Whoa horsey. Whoa. When matter curves space, it means that there is more VOLUME within an enclosing surface than would be dictated by flat Euclidean geometry. For a 2-dimensional embedding image of this effect, picture a tall spire. The shortest path from one side to the other may be short and easy, compared to instead climbing up to the top of the peak and back down ...

    The “volume of the space” around the big bang or a black hole is actually immense, with lots of room (Raum in German). In contrast, the AREA of the surrounding horizon may be quite small. The missing area is sometimes called the area deficit. From the deviations from Euclidean expectations (the area deficit), one can calculate the matter contribution given by the Ricci Tensor. For a black hole (and also, I think, the big bang), the enclosed volume is infinite by classical reckoning; it is a finite number via quantum reckoning. The holographic principle encapsulates this. It may well be that the “volume” of our universe is constant, in the same sense that its total information content is constant and conserved.

    As I warned further up, ten thousand teachers professing that increases in entropy can be due to dispersal and expansion caused us to miss the simple fact that gravitational free-fall and clumping is also in the direction of increasing entropy. We need to stop thinking in terms of space, room, and volume and instead focus on area, surfaces and foliations which are in direct proportion to the available microstates … writ on the subway walls …

    Entropy is a measure of the number of internal variations a system can have with no observable difference.

    By observables “we” mean macroscopic parameters that can be felt by a detector … concrete and measurable kicks that fall along the lines of John Wheeler’s and Bohr’s dictate that “no elementary phenomenon is a phenomenon unless it is brought to a close by an irreversible act of registration …”

    No need to get anthropic about this … yet now that I’ve opened it, you’ll never get the worms back into the can … Get a bucket.

    Just sticking my neck out here ….
    The “volume of the space” around the big bang or a black hole is actually immense with lots of room
    Not sure what you mean by "around the big bang". The theory is that the whole universe was condensed down to what is referred to as the big bang. There was nothing "around the big bang". There was just the big bang and it was all very dense. Infinitely dense according to the mathematics. High density implies small volume.

    Johannes, The Hammock Physicist, has a nice discussion about how to picture the Big Bang "explosion" here: Big Bang, Big Bewilderment

    P.S. How did I turn into the evil one? You must have switched universes when I wasn't looking >:-)
    Citizen Philosopher / Science Tutor
    hmmmm. Seems I've stepped right out on to the slippery slope.

    Thanks folks. It's time for thoughtful consideration of these items.

    While Steve is over the rainbow, I thought I would try to give at least one textbook formula to back up this notion that matter deepens the volume of the very space it occupies. The effect is pronounced only at astronomical extremes, and yet, those extremes do exist with each and every instance of Gravitational Collapse and Singularity. Popular statements that the Big Bang was once no larger than an English pea … raise the hairs on my back.

    What’s the point of using General Relativity to do cosmology if one is going to ignore how it deepens the well around actual matter? If a Stress Energy Tensor Tuv can be directly related to a Ricci Tensor Ruv and its trace R, then it’s not far fetched to have the presence of Matter directly coupled to the amount of available Space.

    For particles like fermions that naturally resist being crowded, a little more space generated by the density of matter is welcomed to relax things. There could be a Principle of Least Action at work in this natural deepening of space associated with matter. Hilbert realized right away that one could derive Einstein’s field equations by minimizing the action of R. This is all very old hat; yet I could not Google a clear exposition. The basic formula that I am going to put down is not an integral relation between volume and area. It is instead a relation for the purely spatial part of R, defined at each point.

    The Euclidean relation for the Volume/surfaceArea of a sphere is (4/3)(pi)R^3 over 4(pi)R^2 = R/3, which equals 1/3 for a unit sphere (in whatever units for length one chooses to use).

    If one starts cramming conventional matter into this unit sphere, the Volume/surfaceArea ratio is going to start increasing (per Einstein's Theory of Invariants).

    I would like to see a calculation of the V/A ratio for the core of a neutron star. I am under the impression that V/A rises to infinity when gravitational collapse is attained. My reason for believing this is because the curvature scalar R goes to infinity when the density of matter goes to infinity at a singularity.

    In principle, for any observer, one can foliate a spacetime into a sequence of space-like slices. The scalar R can be calculated at each point on each slice using the following formula:  equation 21.50 in Gravitation by Misner, Thorne and Wheeler.

    The equation is a bit difficult for me to post here, so I am going to break it up into parts. In the numerator there is a calculation of the area deficit between the matter-free Euclidean expectation and the actual area (which is smaller) due to the presence of matter. From the (non-singular) point P at which one wishes to calculate R, one fans out geodesics in every direction a distance r to outline a sphere of radius r around P. Let the physical area of this “sphere” be given the symbol A(r). The Euclidean calculation for its area is E(r) = 4(pi)r^2. Equation 21.50 in Gravitation now reads as follows:

    R(at the point under study) = limit, as r shrinks to zero, of 18[E(r) - A(r)] over 4(pi)r^4.

    Not much room here now to explain what this has to do with available microstates and stuffing people into subway cars. Sorry about that.
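The limit formula above can be sanity-checked numerically against a case where everything is known in closed form: on a 3-sphere of radius a, a geodesic sphere of radius r has area A(r) = 4(pi)a^2 sin^2(r/a), and the curvature scalar is R = 6/a^2 (both standard textbook facts; the code itself is just my sketch, not from the comment):

```python
import math

def ricci_scalar_estimate(area_fn, r):
    """Area-deficit estimate of the curvature scalar (MTW eq. 21.50):
    R = lim_{r->0} 18*[E(r) - A(r)] / (4*pi*r^4), with E(r) = 4*pi*r^2."""
    euclidean_area = 4 * math.pi * r ** 2
    return 18 * (euclidean_area - area_fn(r)) / (4 * math.pi * r ** 4)

a = 2.0  # 3-sphere radius; its curvature scalar is known to be R = 6/a^2

def three_sphere_area(r):
    """Area of a geodesic sphere of radius r on a 3-sphere of radius a."""
    return 4 * math.pi * a ** 2 * math.sin(r / a) ** 2

for r in (0.5, 0.1, 0.01):
    print(r, ricci_scalar_estimate(three_sphere_area, r))  # tends to 1.5
```

Expanding sin^2(r/a) shows why the factor 18 is right: E(r) - A(r) grows like 4(pi)r^4/(3a^2), so the ratio converges to 6/a^2 as r shrinks, matching the known curvature of the 3-sphere.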
    Wow, blue-green! (what an interesting user name that is)

    That last entry practically begs for more discussion. However, since I’m the ignorant one here, I’m going to pass on your invite to go fishing. Even if you do have a whole bucket of bait.

    I think I should stop using the term ‘big bang’. I can’t quantify ‘big’. And, don’t believe there was a real ‘bang’. So, I’m going to use the term Initialization instead.

    So, now I have a problem of where to start my journey. I have a choice of Before Initialization, At Initialization, or After Initialization. There seems to be a camp that advocates something Before. Lots of folks talk about the first few instants After. I’m going to focus on At Initialization.

    Just for the sake of having someplace to start, let’s say that At Initialization, S=1. Now, let’s say we have a sphere, of whatever size you desire, containing matter. Matter made up of both fermions and bosons seems logical to me. Hence, there was mass and therefore energy.

    Something happened At Initialization. That something started a chain reaction. The result of which is everything that we experience today.

    Disorder <--> Chaos --> Entropy


    How am I doing so far?


    If you are saying these three all require S>1 I could go along with that. But probably it is better to consider disorder a poor choice of words for what can better be described with 'randomness' or so.

    If you would want to formalise disorder, my guess would be

    1 < Sc < Smax, where Sc is that entropy at which complexity is maximal (and with it disorder)

    Yes, it was a poor choice of wording but you caught my drift.

    "High density implies small volume."

    You still sure about this Steve?

    .. your evil twin .. first born.
    Not sure I follow you, big blue. For a fixed amount of energy/matter, higher density means smaller volume. Are you thinking of "space surrounding the universe"? There is no such thing.

    The question that was raised was how can a big bang have low entropy. Big bang sounds like some kind of explosion and we ordinarily think of explosions as having high entropy. My answer was that the big bang was no ordinary explosion. For one thing, the whole universe was condensed down to a very high density state - a very, very, very high density state. You are not going to have a whole lot of entropy in that state, not for lack of energy, but just for lack of "room". There are not many different states the universe can have under those conditions.

    Are you suggesting there was no high density, or that there was high density but it wasn't due to smaller volume? Are you saying there was more mass and energy? What is your explanation for the low entropy?

    Of course, things might work differently in your parallel universe. That would explain everything. :-)
    Citizen Philosopher / Science Tutor
    If it was a very low density situation, then the only thing I can think of to explain the low entropy is that the matter was very close to zero Kelvin.

    Even though the big bang made its own space, that does not prevent mathematicians (or poets) from embedding it in ever larger and unphysical arenas so that one can in a sense step outside of the whole shebang and look back ... retrospectively.

    blue-green here with all of the normal formatting difficulties in reaching Steve's universe ... details to follow per popular (or unpopular) demand ..... been out of town ...

    I had hoped that the mathematics would speak for itself ...(yet it's always mute).

    At initialization, LOTS of Room because high density creates "Room", Raum, Space.

    At initialization, high temperature extreme.

    At initialization, Ricci Tensor components are infinite (using purely classical reasoning).

    At initialization, contorting Weyl Tensor components are near zero (Penrose Hypothesis).

    At the far temporal extremes of the big bang singularity, for each black hole singularity, the Weyl components are infinite (along with the Ricci components). Quantum reasoning (aka Holographic Principle) puts a finite bound on these components, because even for a black hole, the entropy can never be infinite (unless the area of its horizon were infinite).

    No real, actual or physical infinities in physics. Just mathematical ones.

    At Initialization: entropy S = 0.  Since S = k ln(W), that means the number of available microstates W = 1.

    We were once ONE United State of absolute certainty, 100% probability, AND with an infinite amount of room or potential for expansion, thus, extremely high temperature right after initialization.

    "We" do not know the exact nature of this initial cosmic egg.

    It would be interesting to see a proof that it cannot be recreated within its own universe.

    If we had such a proof (I think it is possible), then a corollary of it would be that no particle accelerator here or in the heavens can be at risk of creating a "big bang". Another corollary would be that since it cannot be recreated within itself, then it cannot be studied using the standard paradigm of the scientific method. No repeatable trials etc.

    Yes Steve, I know the big bang makes its own space. Immediately after initialization, one could mathematically draw an enveloping surface around ALMOST everything. In the limit, its Area would be quickly inflating from zero ... The Area of this cosmic horizon would be in sync with the same cosmic measure for entropy that starts at zero and increases exponentially.

    “Entropy quantifies the number of different microscopic states that a physical system can have while looking the same on a large scale.… The new entropy calculation also highlights a cosmic puzzle, Carroll says. The entropy was relatively small in the early universe (10^88), bigger now (10^104), but still falls far short of the maximum (10^122).”

    Sidebar: the usnews link gives more than enough "entropy is disorder" insinuations to make Steve bang his head on a wall. The arXiv article that it cites does not make these errors.
    You rule blue. I agree with your math construct.

    I don't think S=0, because theory says there would be no quantum activity. Well, it could be 0, and an external force caused the reaction to start. Two fledgling universes colliding?

    Carroll. “The universe is incredibly more orderly than it has any right to be."

    And yet, the cosmic egg is not yet broken.

    I agree completely. Entropy is a quantitative property that is measured and tabulated in the thermodynamic tables, while no one knows what disorder is. So, I guess that first one has to define disorder quantitatively. Another argument against such a comparison (I will take it from my recent post): if we say that entropy is disorder, then its derivative with temperature characterizes how disorder changes with temperature, for example Cp/T = dS(T, p)/dT. Thus heat capacity has something to do with disorder as well. Now we should remember that Cp = dH(T, p)/dT. Hence the enthalpy and the internal energy have to characterize disorder as well, as they are related to heat capacity.
    After sleeping on it, I would like to add that at “initialization”,
    there is no a priori reason for entropy S to be right at zero.
    It could just be a low number, a potentially very important low number, as it would tell us the minimum number of accessible microstates for the whole shebang. There could be a way of deriving this number for certain toy models (a challenge perhaps for some Grand Unified Theory). Absolute values are tricky; 'tis enough usually to work with gradients ...
    The Third Law states that the change of entropy in any process approaches zero as the temperature approaches zero Kelvin. Well, it also contains the statement that the derivatives of DeltaS approach zero as well. In any case, one can choose an arbitrary constant at zero Kelvin for the entropy and this will change nothing in chemical thermodynamics. Yet with S = 0 at 0 K everything is just simpler, but this is not what the Third Law says. I am afraid, however, that you cannot use the laws of thermodynamics for a Grand Unified Theory. Conventional thermodynamics (classical and statistical) is based on the assumption that energy and entropy are additive: U = Sum_i U_i and S = Sum_i S_i. I guess that for a Grand Unified Theory this is too restrictive.
    Yes, Evgenii, I agree. One must be very careful about applying equations derived from thermodynamics to the universe as a whole. In thermodynamics there is always assumed to be a system and its surroundings. (Even an "isolated" system is defined as being isolated from something - the surroundings - and there are boundary conditions established because of this.) The universe as a whole has no surroundings and no boundaries. (And great avatar, btw.)
    Citizen Philosopher / Science Tutor
    @ Evgenii. Thank you for your comments. I read your article; it is very good. You and I make essentially the same point - entropy as defined by thermodynamics is an objective property of the system and not observer dependent. You present your case from a slightly different perspective. I very much like your distinction between thermodynamic entropy and information entropy. I was not aware of the history behind the concept of information entropy and its unabashed subjectivity. Perhaps it would help if we had different words for the two different ideas of entropy. I would suggest "t-entropy" and "i-entropy" for thermodynamic entropy and information entropy, respectively. I encourage readers here to read your article for this added perspective.

    @ blue-green. Sorry, my friend, but I am not an expert on the big bang, so I will leave it to others here more knowledgeable on that particular subject, to check your reasoning. As far as I can tell, it does seem to follow the general consensus that the beginning was very dense, hot, and homogeneous and that the entropy was at a minimum, but not necessarily 1. Perhaps our Hammock Physicist would like to comment. Johannes has a number of articles posted here that may be of interest including, Big Bang, Big Bewilderment.
    Citizen Philosopher / Science Tutor
    Evgenii’s article makes the unfounded charge that “information” as measured in the information sciences and statistical mechanics is “subjective”. The equations that I have seen (including the one given by Sascha from wiki in the first comment here) do not look subjective to me. They are no more observer-dependent than is, say, the PV=nRT equation of thermodynamics. On a more cosmic scale, there is nothing observer-dependent in whether or not matter has plunged into a black hole and added to its entropy (in the strictly additive way that it does). One can warn that applying thermodynamics to the cosmos as a whole is misguided, yet that isn’t going to stop the flow of such papers being published at arXiv … ‘Tis better to pick battles one can win, than to dissipate one’s energy ... Foreign Affairs: 101. Johannes is my man, and yet our paths are skewed and interaction-free, neither parallel nor intersecting.
    You are right that, for example, Shannon defined information as an objective quantity, although in the papers that I have cited people have talked about subjectivity. So I do not understand why you call my statement unfounded. It might be incomplete, though, I agree. However, whether information is objective or subjective, this changes nothing. The thermodynamic entropy has nothing to do with it. Please just open the JANAF Tables and show what Shannon's information has to do with the entropy values there.
    What does Shannon entropy have to do with the entropy values published in chemical tables? Well, for one, the full theory can be used to derive specific heats. There are experimental values of entropy (derived from calorimetry and measures of specific heat), and then there are theoretical values going all the way back to Boltzmann. As I recall, more than a little of Einstein’s fine work in 1905 was devoted to deriving experimental values for specific heat using one form or another of an atomic hypothesis. Or to put it differently, he used experimental values of specific heat, entropy and viscosity to glimpse into a granular and quantized underworld, which in his day one could only see in one’s imagination with the aid of statistical mechanics. I do not know how dedicated Steve will be in trying to keep thermodynamic entropy “t-entropy” apart from informational entropy “i-entropy”. I am aware that others have tried to do so. My thinking is that the theoretical methods for measuring entropy are always going to look and feel different from the experimental ones, and yet they are simply different approaches to the same deal, same animal.
    >Well, for one, the full theory can be used to derive specific heats. I am puzzled. Do you mean that one will compute specific heats from information? In my view we should first make definitions. There is the entropy in statistical thermodynamics (the Boltzmann equation) and the entropy introduced by Shannon. Please note that Shannon just used the term entropy without claiming that this is the thermodynamic entropy. Let me quote him: "The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics, where pi is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann's famous H theorem." He just shows that the equation is similar, but he does not make a statement about the meaning of such a similarity. After all, there are many cases where the same equation describes completely different phenomena. For example, the Poisson equation for electrostatics is mathematically equivalent to the stationary heat transfer equation. So what? Well, one creative use is for people who have a thermal FEM solver and do not have an electrostatic solver. They can solve an electrostatic problem with a thermal FEM solver by means of the mathematical analogy. Could you please clarify your claim? What did you mean? If you speak about using statistical thermodynamics to evaluate thermodynamic properties, then this is already in the JANAF Tables, where especially for gases experimental information about molecular constants has been used to estimate entropy. Yet what does the Shannon entropy have to do with this?
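    The formal similarity Shannon points to can be seen by simply evaluating his H for a few distributions. A minimal sketch (the distributions and the choice of log base are purely illustrative; nothing in this calculation involves heat or temperature):

```python
import math

def shannon_entropy(probs, base=math.e):
    """H = -sum(p * log p): formally the same shape as Boltzmann's H."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over 4 states: maximum uncertainty.
uniform = [0.25, 0.25, 0.25, 0.25]
# A peaked distribution: less uncertainty, hence lower H.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform, base=2))  # 2.0 bits
print(shannon_entropy(peaked, base=2))   # well below 2 bits
```

    Note that the result depends on the chosen base of logarithms (bits vs nats), which is one of the conventions this discussion keeps returning to.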
    Evgenii asks: “Do you mean that one will compute specific heats from information?” It is more like one can compute specific heats from a full accounting of the energetic processes involved in forming states of information. Steve asks: “When you talk about entropy and the amount of information that can be stored in a computer, are we talking about how hot the computer will get?” Yes, in the sense that to get that information onto a hard drive, it has to be plugged into a source of power and there are big heat sinks involved -- thermodynamic limits to the maximum attainable efficiency that hearken all the way back to Carnot. As I already quoted, “no elementary phenomenon is a phenomenon without an irreversible act of registration.” I defer to Maxwell's demon .... I need to play the weakling now and insist that I do not have time to hash out the details in a comprehensible manner. I already have a day job and a night job outside of physics. On a more humorous note, last night I clicked on Evgenii’s website link and right at the top was a Las Vegas link to a site on “Quantum Jumping”. I did not realize that it was a momentary link put up by Ads by Google. It is now gone. Here is what I saw: Jump Into a Universe of Infinite Possibilities. The billionaire you. The inventor you. The athletic you. In alternate universes, everything you desire has already taken place. Tap into this infinite potential with Quantum Jumping. Enter your email address to get instant online access to the free 6-part Quantum Jumping Introductory Course. No Credit Card Required

    Speaking of ads, it is really a creative industry. At Science 2.0 a couple of times I have seen an ad saying "You have one email. Click here." To entertain you a bit with thoughts about information, I will quote Jim Holt (I took it from …): "Take that rock over there. It doesn't seem to be doing much of anything, at least to our gross perception. But at the microlevel it consists of an unimaginable number of atoms connected by springy chemical bonds, all jiggling around at a rate that even our fastest supercomputer might envy. And they are not jiggling at random. The rock's innards 'see' the entire universe by means of the gravitational and electromagnetic signals it is continuously receiving. Such a system can be viewed as an all-purpose information processor, one whose inner dynamics mirror any sequence of mental states that our brains might run through. And where there is information, says panpsychism, there is consciousness. In David Chalmers's slogan, 'Experience is information from the inside; physics is information from the outside.' But the rock doesn't exert itself as a result of all this 'thinking.' Why should it? Its existence, unlike ours, doesn't depend on the struggle to survive and self-replicate. It is indifferent to the prospect of being pulverized. If you are poetically inclined, you might think of the rock as a purely contemplative being. And you might draw the moral that the universe is, and always has been, saturated with mind, even though we snobbish Darwinian-replicating latecomers are too blinkered to notice."
    I don't think there is a conflict between entropy in classical thermodynamics and entropy in statistical thermodynamics. It's still thermodynamics. Boltzmann and Einstein were working on explaining molecular motion and its relationship to heat and temperature. The question that I think Evgenii raises, and I also wonder, is whether, when you start using the term entropy in other contexts, you are really talking about the same thing. When you talk about entropy and the amount of information that can be stored in a computer, are we talking about how hot the computer will get? Are we asking how much heat is stored on the computer at a given temperature and how much of it can be converted to mechanical work? I don't think so. Can it be confusing to use the same word for both situations? Yes, I do think so.
    Citizen Philosopher / Science Tutor
    Fred Phillips
    I do think there's a conflict, Steve. There's no ambiguity about entropy in classical thermodynamics. All later applications of the entropy concept, including Boltzmann's and Shannon's, require the delineation of states. That delineation is judgmental and hence anthropic. Boltzmann may have thought his definitions of states were obvious, but Mother Nature may have had a different idea of obvious. And we know old Ludwig was a frustrated and disturbed man.

    Shannon knew his measurement of information transfer said nothing about conveying meaning. He was careful not to imply otherwise. In the case where 'states' were letters of the English alphabet, he ultimately treated cases of interchangeable letters, the differing probabilities of one letter following another, etc. Entropy was relative to the code used, and relative to the chosen base of logarithms.

    Sascha's equation is unambiguous on its face, but begs the question of defining the states "i". For a given coding, we can use the equation to design a telecomm system that provides the needed bandwidth at minimum cost. However, as Shannon knew, the motives people have for sending messages, how they interpret messages, etc., are important to the system but not analyzed by the entropic equation.

    One answer to your computer question - that it would depend whether it was a vacuum tube computer, a transistor computer, or a quantum computer - would be glib, hiding what is probably a more profound question. However, it's not necessary to answer it, because the heat generation (that is now a major environmental problem for us) comes from accessing the data, not from storing it. Or sometimes, from making it ready to access, e.g. by keeping the hard drives spinning in a server.
    The same can be said of “power”, “spin”, “charge” and so many other fundamental words in physics. The only way to be rigorous about it all is to pretty much ignore popular usage and Aristotelian philosophy and stick to mathematical physics and experimental measurements. The irony here is that Sascha, the drama queen of “no mathematics please”, is the only one here whose comment is purely mathematical. Go figure. You were right, Steve, to focus on the position of the center of mass of the falling plank in Sascha’s rant. As it is shifted, depending on the plank’s construction, from the hinge to the far end, the full range of possibilities is covered.
    So I came to this site because of the second law of thermodynamics, and I found your analysis.

    My question was: How does "spontaneous" self assembly and self ordering "fit" with the second law of thermodynamics?

    Or to put it another way: is disorder increasing in the universe, or are we ignoring the patently obvious electromagnetohydrodynamics plus gravitational dynamics?

    It seemed to me that rather than interpreting entropy as disorder we needed to look for something that is obviously missing in the thermodynamic analysis which explains the manifest order and transformations of order within our observable universe.

    I wondered why, apparently, those who are investigating heat energy do not join up with those who are investigating other forms of energy to form a complete cybernetic "picture" of the whole.

    I will read your article with interest and then make further comments, but at this time I came to ask my question(s), and that I have done.

    However, as my wife observes, there is my "order" and her Order! I must admit her order somehow seems more ordered! lol! It is a perceptual thing.

    In order to compare entropy and order (disorder) you need to define the latter first. Otherwise this is a tautology. Recently I was recommended an interesting book, Entropy and Art by Arnheim, and I really enjoyed it. You may want to read Arnheim as well, by the way; it should be available on the Internet.

    These are very interesting points you are making, and they have been one of the focuses of my own research in complexity. Complexity recognises the formation of order (which is a tautology, I know, because 'form' implies 'order') despite entropy. Stuart Kauffman in 'Investigations' addresses your point quite extensively, and demonstrates that the increase of order always has a local character (e.g. solar system -> earth -> biological organisms -> social structures). Physicist Brian Greene tells us that the increase of entropy can at best be 'delayed', which means that complex forms basically 'harvest' low(er) entropy resources and concentrate them. This does not mean that complex forms can actually reverse the second law of thermodynamics.
    I have a question about the definition of complexity. Let us consider a simple exercise from classical thermodynamics. Problem: given temperature, pressure, and the initial number of moles of NH3, N2 and H2, compute the equilibrium composition. To solve the problem one should find the thermodynamic properties of NH3, N2 and H2, for example in the JANAF Tables, and then compute the equilibrium constant. From the thermodynamic tables we take, for each of NH3, N2 and H2, the values Del_f_H_298, S_298 and Cp (all values are molar values for the standard pressure 1 bar; I have omitted the symbol o for simplicity, but it is very important not to forget it). For the reaction
    2NH3 = N2 + 3H2
    Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)
    Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)
    Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)
    To make life simple, I will assume below that Del_Cp_r = 0, but it is not a big deal to extend the equations to include heat capacities as well.
    Del_G_r_T = Del_H_r_298 - T Del_S_r_298
    Del_G_r_T = - R T ln Kp
    When Kp, the total pressure and the initial number of moles are given, it is rather straightforward to compute the equilibrium composition. So, the entropy is there. What about the complexity, order and disorder?
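    The recipe above can be put in numbers. A minimal Python sketch computing Kp for 2NH3 = N2 + 3H2 with Del_Cp_r = 0; the Del_f_H_298 and S_298 values below are rounded literature numbers, not exact JANAF entries, so the resulting Kp is only indicative:

```python
import math

R = 8.314  # J/(mol*K)

# Approximate standard molar values at 298.15 K (illustrative, not exact JANAF numbers).
dHf = {"NH3": -45.9e3, "N2": 0.0, "H2": 0.0}      # Del_f_H_298, J/mol
S = {"NH3": 192.8, "N2": 191.6, "H2": 130.7}      # S_298, J/(mol*K)

# For the reaction 2NH3 = N2 + 3H2:
dH_r = dHf["N2"] + 3 * dHf["H2"] - 2 * dHf["NH3"]  # about +91.8 kJ/mol
dS_r = S["N2"] + 3 * S["H2"] - 2 * S["NH3"]        # about +198 J/(mol*K)

def Kp(T):
    """Del_G_r_T = Del_H_r_298 - T * Del_S_r_298 = -R*T*ln(Kp), with Del_Cp_r = 0."""
    dG_r = dH_r - T * dS_r
    return math.exp(-dG_r / (R * T))

print(Kp(298.15))  # far below 1: NH3 decomposition is unfavourable at room temperature
print(Kp(800.0))   # far above 1: decomposition is favourable at high temperature
```

    From Kp one would then solve for the equilibrium mole numbers at the given total pressure, as the comment describes.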
    Here we enter the wonderful and very ambiguous realms of definitions ;)
    Some complexity theorists will define the complexity of a system in terms of information content, where 'order' is related to the minimal description that is needed to describe the (outcome of the) system.

    In other words, the complexity of the system you describe would equate, or be reciprocal, to the equations you need to describe the process. The outcome of the process would have a 'high' amount of order in equilibrium and, depending on the definition you choose, in the initial 'chaotic' stage as well, because randomness can also be considered fairly orderly, in the sense that every random situation is replaceable with others. As the process starts, the complexity of the system then increases until the system starts to stabilise. In equilibrium the complexity is low.
    Order/disorder is more interesting. Some definitions will say that the initial, chaotic state is disorderly, and the order increases as the system moves into equilibrium (high order).
    I said earlier that I think probabilistic disorder (i.e. randomness) is not really disorderly, so I would prefer to consider this stage to be orderly as well. Gerald Weinberg has a great graph where order and randomness are functions of each other, and complexity lies in between these.

    I myself follow Weinberg's idea, but with a detour. In my definition, complexity is related to the amount of uncertainty an observer has in precisely describing the system at hand. In an initial stage, a system will have a high degree of randomness, and this is a fairly evident situation for the observer: all the different states (descriptions) of the system are comparable, so the system is orderly (for that observer). The same applies for the equilibrium state. In the process in between, the problems start, because the complexity / uncertainty of the process will depend very much on what the observer can describe. If there is a linear progress from randomness to equilibrium, or another pattern that can be captured in equations, then the system will not become more disorderly/complex. If, however, the equations only approximate the factual progress or give a description of the progress on average, then the complexity / disorder (these are not the same!) will increase (the observer does not know exactly what the process is doing at some point, until the system moves towards equilibrium). The system becomes even more complex if the various routes the process can take result in different final situations (for instance different types of equilibria). At this point, small chance effects may influence the outcome, and thus the uncertainty, and hence the complexity, of the progress increases until one of the possible final states starts to emerge.

    As you can see, this definition holds a relationship between the process/system you are observing and the abilities of the observer to know that system. For this reason, some prefer to make a distinction between ontological and epistemological complexity. Ontological complexity depends on the characteristics of what you are studying (e.g. my mortgage is a complex financial product), while epistemological complexity relates to the abilities of the observer (the mortgage is more complex for people with little education in financial matters than for a trained financial advisor). This kind of complexity / order does not necessarily say anything about the process under investigation.
    That was the theory. In order to understand it better, could you please apply it to my problem with N2 + 3H2 = 2NH3?
    When I did chemistry at secondary school we never had the advantage of electro-thermodynamic analysis a la Feynman, and when I was being taught to be a financial advisor the rule was "Keep It Simple, Stupid!" lol!

    So complexity and order/disorder/chaos remind me of the observation an anthropologist made about the mathematical ability of a certain tribe: "they have no concept of number beyond 2; once they count to 2, the word they use for any larger value denotes 'many'..."

    I have always remembered that, particularly in the light that further studies showed how cultural bias and presupposition actually affect the perception of the observer.

    If there is no real definition of entropy in terms of order, then we flat out need to say so and be done with it. I am quite happy for the entropy of a closed system to increase ad infinitum if that is the case, as what that means appears to be guesswork. But for an open system, surely we need to look for the other components in the system and see how they relate.

    I am interested in this observation that locally self assembly in a system occurs without deflecting the overall 2nd law. However, where is the symmetric observation that self assembly proceeds undeflected by the 2nd law?

    Has anyone put a table of entropy of sets of self assembled units up for view, so a comparison can be made between the entropy level and the self assembled units? Can we actually observe/calculate a lower entropy in self assembled units?

    Rather than defining an abstract notion define a state of Avogadro numbers of "particle-waves", and vary the entropy of the state to record what states result.

    For example, the Bose-Einstein condensate was achieved in stages. At each stage calculate the entropy and establish a scale of state to entropy.

    This may be a better way to discuss this topic; I do not know. What do you think?

    [quote]the system, T. In 1876, J. Willard Gibbs named this useful energy released as "free energy" and provided the formula to calculate it. The free energy, delta G, was the change in chemical energy, delta H, minus the trapped, thermal energy, T times delta S.
    delta G = delta H - (T x delta S)[/quote]

    The short Question is why not delta T?

    The analysis to this point is dogged by the foreknowledge that S is not going to be a constant. We have a constant called a specific heat for a material. Why are the 2 distinguished or confused?

    If the basis is a kinetic theory then some kinetic change is being intimated, and this naturally leads to an increased kineticism to explain temperature. Why do we need an increased entropy/specific heat?

    There seems no useful purpose under this analysis for entropy being different to specific heat.

    Now if the tautological goal from the outset was to explain order from disorder then we have an ulterior sabotaging motive.

    However, how do we explain differences in specific heats for substances? Do we need entropy for that? In which case conservation of energy rules apply and the total kinetic energy would have to change with the increase or decrease of specific heat; thus a temperature change will ensue, meaning we cannot change one without the other. We would therefore experience a delta T with any delta S for a fixed heat content. The extra or missing heat would be absorbed or emitted by bond energies.

    The equation Del G = Del H - T Del S is considered for a chemical reaction at constant temperature by definition (G = H - TS is valid when all values are related to the same state). The relationship between the heat capacity and the entropy is as follows:
    dS(T, p)/dT = Cp/T
    dS(T, V)/dT = Cv/T
    I should say that I have not got your point.
    I think you have answered my point, which was that the analysis of the free energy assumed the temperature was constant, but I was thinking of the dynamic situation which produces the free energy, in which of course nothing is constant. Thus, why choose the change in entropy; why not the change in temperature?

    I have not completed my analysis of the article or the topic, so my responses are as they come to me, and sometimes daft questions throw light on assumptions not stated eh?

    The fuller equations you have given are what i would expect to see.

    My question about specific heat remains, because it seems to me that specific heat is what is being called entropy. The statistical machinations of various states are a bit obtuse, to say the least. If we want to apply a probability description we need to know what the probability is about, and if we want a statistical one we need to know what the population is. These do not at this moment seem well defined.

    However a kinetic theory seems to adequately explain the specific heat, and the issues of pressure and temperature change also. Why do we need entropy?

    I think the suspect idea is that specific heat is a constant for an element instead of dependent on state, i.e. phase. Thus the specific heat for water will differ as it changes phase, and thus may give a marker of state or phase for a compound.

    For the more dynamic situation of a chemical reaction with reaction products we may need a complex form of specific heat to properly account for the energy transformations. The point is that it is just specific heat, not some other notion we may be calling entropy.

    Free energy is by definition F = U - TS, or G = U + pV - TS (depending on which one you mean by free energy). The equation is related to a given state. It does not forbid us to apply such a definition to a process where the temperature changes; if necessary this could be done without a problem. Heat in general is not a state function, hence in a way there is no such thing as specific heat. A system has energy that could be released either as heat or work. Here is a big difference with the entropy, which is a state function. This means that state 1 has some entropy S1 and state 2 has some entropy S2. The change in entropy between the two states does not depend on the process by which you transfer the system from 1 to 2: Del S = S2 - S1. The heat, on the other hand, depends on the process, Q = Integral from 1 to 2 (dQ/ds) ds (where s is some variable that parameterizes the process), and Q does depend on the way you go from 1 to 2.
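    The state-function distinction can be made concrete with a toy calculation: take one mole of a monatomic ideal gas between the same two states along two different reversible paths (a sketch; the state points chosen here are arbitrary, and only standard ideal-gas relations are used):

```python
import math

R = 8.314            # J/(mol*K)
Cv = 1.5 * R         # monatomic ideal gas, constant volume heat capacity, 1 mol

T1, V1 = 300.0, 1.0  # state 1
T2, V2 = 600.0, 2.0  # state 2

# Path A: heat at constant volume (T1 -> T2), then expand isothermally at T2 (V1 -> V2).
Q_A = Cv * (T2 - T1) + R * T2 * math.log(V2 / V1)
# Path B: expand isothermally at T1 first (V1 -> V2), then heat at constant volume.
Q_B = R * T1 * math.log(V2 / V1) + Cv * (T2 - T1)

# Entropy change, integrating dQ_rev/T along each path:
dS_A = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)
dS_B = R * math.log(V2 / V1) + Cv * math.log(T2 / T1)

print(Q_A - Q_B)    # nonzero: the heat absorbed depends on the path taken
print(dS_A - dS_B)  # zero: the entropy change depends only on the endpoints
```

    The same two endpoints, yet the two paths absorb different amounts of heat while producing identical Del S = S2 - S1, which is exactly the point of the comment.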
    Should we be pessimistic because of the 2nd law of Thermodynamics?
    I think not! The formulations, though interesting, are premised on motion being caused by heat. However, the kinetic theory is premised on heat being an attribute of motion, as is temperature. We can attribute no causative relation between heat and motion which is not of itself symmetric. Thus equilibrium always results, but there is no upper bound on motion, thus no stable equilibrium obtains universally.

    In this case we need not be pessimistic as somewhere motion will continue to effect relativistic change and complexity will continue to increase.

    Ps i know about the speed of light, thank you.

    This article really saved me from confusion regarding disorder, but two questions come to my mind:

    So, now I know that the classical meaning of entropy is "the relationship between the temperature of a body and its heat content". Hence for a fixed amount of thermal energy supplied, the rise in temperature will be proportional to the entropy (assuming entropy is constant in this case). In that sense entropy is just like heat capacity. Are the two notions actually interchangeable?

    Also, to help understand entropy as the "number of ways molecules could move", I imagine two boxes, A and B. Molecules in A can only move vertically, for whatever reason, while molecules in B can move vertically and horizontally. Since B has more ways to move, does it mean B has higher entropy? And how does the number of possible ways affect the average KE of the molecules, if the heat supply is the same in the two boxes?

    Entropy is a state function and heat is not. This means that when you integrate over some closed path the change in the entropy will be zero. This does not hold for heat. So you need to be careful with heat capacity: if you define it formally as C = dQ/dT then it is not a state function either. If, however, you consider for example C_p, then at constant pressure
    Del S = Integral from T1 to T2 (Cp/T) dT
    So the entropy is not the heat capacity anyway; it just has the same dimension. The second question concerns statistical thermodynamics. Indeed the heat capacity will be different, as you have 0.5R per degree of freedom. Hence the entropy will be different as well - see the integral above.
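    The integral Del S = Integral (Cp/T) dT can be checked numerically. A small sketch assuming a constant Cp (the value used is roughly that of N2 near room temperature, chosen only for illustration), comparing a midpoint-rule integration against the closed form Cp ln(T2/T1):

```python
import math

def delta_S(Cp, T1, T2, n=100_000):
    """Numerically integrate Cp/T dT from T1 to T2 (midpoint rule, constant Cp)."""
    h = (T2 - T1) / n
    return sum(Cp / (T1 + (i + 0.5) * h) * h for i in range(n))

Cp = 29.1            # J/(mol*K), roughly Cp of N2 near room temperature
T1, T2 = 300.0, 600.0

numeric = delta_S(Cp, T1, T2)
closed = Cp * math.log(T2 / T1)  # Cp * ln(T2/T1) for constant Cp

print(numeric, closed)  # the two agree to several decimal places
```

    The 1/T weighting inside the integral is precisely why entropy and heat capacity share a dimension but are not the same quantity.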
    Hello Lucas, glad you found the article helpful. Let me try to answer your two questions.

    First, entropy is not the same thing as heat capacity, but they are related. The heat capacity of an object or system tells you how much heat you have to add to change its temperature by one degree. In other words, heat capacity gives you the relationship between heat going in and temperature going up. Graphically speaking, heat capacity is the slope of the line relating heat and temperature for an object or system.

    Now, you should take note that the heat capacity of an object or system may not be constant over a given temperature range. That is, the curve relating heat and temperature may not be a straight line. So the heat capacity of an object or system at a given temperature is the slope of the curve at that point/temperature, i.e. the instantaneous rate of change, i.e. the 1st derivative at that point.

    Entropy, on the other hand, is a kind of average heat capacity - an average taken over the temperature range from absolute zero to its present temperature. If you could keep track of the heat needed to raise the temperature of a mass from absolute zero to a particular temperature and then divide by that temperature range, that would be its entropy.

    Multiply entropy by temperature (in Kelvin) and you get the so-called heat content (how much thermal energy is trapped in a system at that temperature). On the other hand, multiply heat capacity by a temperature change and you get the amount of heat energy (in Joules) needed to cause that change in temperature.

    Now let's consider your two boxes. In box A there is only vertical motion possible. Think of this as perhaps a set of oscillators or springs that can only oscillate up and down. In box B we have some other kind of spring that can oscillate up and down and left and right. If we have the same number of molecules/springs in each box, but those in box B have more ways to oscillate (modes of oscillation) then as we distribute kinetic energy throughout each box so that each mode of oscillation exhibits the same average kinetic energy, we will have to put more energy in box B than into box A.

    It's like snack time at preschool. If lemonade is to be distributed so every cup has the same average amount, and room A has 5 cups to fill and room B has 10 to fill, then room B will be holding more lemonade than room A. The more cups you have to fill (the more ways you can hold kinetic energy) the more lemonade/energy will be required. Room B may just have more kids, or there are the same number of kids but for some reason each got two cups. What matters is the total number of cups (modes of oscillation).

    So for example, a box filled with a monatomic (single atom) gas has less entropy than a box filled with the same number of molecules of a diatomic gas. This is because a diatomic molecule can move/oscillate in ways a monatomic gas cannot. The bond between the two atoms in the diatomic molecule acts as a spring that the monatomic gas does not have. So as you put energy in, the "places" the movement can be stored is greater in box B with the diatomic gas than in box A.
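    The "cups" picture can be put in numbers via the equipartition theorem: each quadratic mode stores (1/2)RT per mole, so for the same heat input a gas with more modes warms up less. A minimal sketch, assuming ideal gases at constant volume with the diatomic rotational modes active and vibration frozen out:

```python
R = 8.314  # J/(mol*K)

# Equipartition: each quadratic mode contributes (1/2) R per mole to Cv.
# Monatomic gas: 3 translational modes. Diatomic near room T: 3 + 2 rotational.
Cv_mono = 1.5 * R
Cv_di = 2.5 * R

Q, n = 1000.0, 1.0  # add 1000 J to 1 mol of each gas at constant volume
dT_mono = Q / (n * Cv_mono)
dT_di = Q / (n * Cv_di)

print(dT_mono, dT_di)  # the diatomic gas warms less: it has more "cups" to fill
```

    The same energy input raises the monatomic gas's temperature by roughly 80 K but the diatomic gas's by only about 48 K, which is the lemonade-cup argument in equation form.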

    Hope that helps.
    Citizen Philosopher / Science Tutor
    Hello Steve, now I have finally figured out what entropy and heat capacity are: heat capacity tells how much energy is required to make a certain change in something (temperature), while entropy tells how much energy is stored when a thing is at a certain temperature. The difference is "energy to change things" vs "energy stored". You know what, I feel like I just skipped months of inefficient intellectual wandering.

    But regarding the attempt to explain entropy as ways of movement, the lemonade cup metaphor has some issues to me. If the gas is molecular, I could imagine "having more ways to move" as energy stored inside inter-atomic spring of molecules instead of causing the whole molecule to fly faster.

    But for a monatomic gas this doesn't work; I can't explain where the supplied energy goes other than increasing the average K.E. Let me elaborate with my Box A / Box B example above. Say each box has 10 atoms and each box receives 10 J of thermal energy. After reaching equilibrium, won't both boxes gain an average K.E. of 10J/10 = 1J? To me, being able to move in more ways (i.e. moving horizontally) doesn't affect the energy they can receive. How do atoms gain energy without flying faster? Maybe one could say some energy is used to increase electrons' energy levels, but I doubt it is necessary to think this deep in the case of idealized point-like atoms.

    Lucas, you are in fact correct to notice that a monatomic gas has only one way to absorb kinetic energy, and that is because in fact it only has one "way to move". It can fly through the air. It can't tumble like a stick; it can't vibrate like a spring; it can't wobble like a diving board. A diatomic molecule, on the other hand, can move/oscillate in all these additional ways, due to there being a bond that can flex and stretch and due to the extended shape of the molecule.

    Restricting the direction of flight of the monatomic gas to only vertical is not changing its kind of motion. Flying vertically is not really a different kind of motion than flying horizontally. After all, these are relative terms - vertical relative to what? Vertical in one frame of reference is horizontal in another. So in fact the entropy of two boxes filled with two monatomic gases would be the same as long as they contained the same number of atoms.

    Perhaps the missing qualifier is "simultaneous", as in "simultaneous ways of moving". In box B, you allow an atom to move both horizontally and vertically, but of course the atom cannot travel perfectly vertically and perfectly horizontally at the same time. The atoms in box B can only fly in straight lines, just like the atoms in box A. (A diatomic molecule, on the other hand, can fly through the air and tumble and wobble and vibrate in and out, all simultaneously.)

    P.S. I hedged a little in my first response. I did not mean to imply your two boxes would have different entropies if filled with only monatomic gases. That is why I introduced the idea of vertical and multi-directional "springs" instead of talking about atoms. Like I said before, what counts is the total number of oscillators, which may or may not be the same as the total number of particles.
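    The "ways to move" idea above maps onto the equipartition theorem: each quadratic degree of freedom stores, on average, (1/2)kT per particle, so a gas with more kinds of motion can soak up more energy per degree of temperature. A minimal sketch of that bookkeeping (an illustration assuming ideal-gas behavior, with vibration frozen out at room temperature):

    ```python
    # Equipartition sketch: each quadratic degree of freedom stores (1/2) k*T
    # per particle, so the molar heat capacity at constant volume is Cv = (f/2) R.
    R = 8.314  # gas constant, J/(mol*K)

    def molar_cv(degrees_of_freedom):
        """Molar heat capacity at constant volume, Cv = (f/2) R."""
        return degrees_of_freedom / 2 * R

    # A monatomic gas can only fly: f = 3 translational directions (x, y, z).
    cv_monatomic = molar_cv(3)   # ~12.47 J/(mol*K)

    # A diatomic molecule can also tumble about two axes: f = 3 + 2 = 5
    # (the bond vibration is usually frozen out at room temperature).
    cv_diatomic = molar_cv(5)    # ~20.79 J/(mol*K)

    print(cv_monatomic, cv_diatomic)
    ```

    The diatomic gas needs roughly 5/3 as much energy per kelvin of warming, precisely because it has more simultaneous ways of moving.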

    Citizen Philosopher / Science Tutor
    If I get some theology in your science, will you promise to get some science in my theology? ;)

    If causality merely determines the arrow of time, and entropy is that increase, then an increase in entropy from the origin is an increase in time from the origin, no? The definition I picked up somewhere along the line is: entropy is a measure of the utility of energy to do work. The more entropy in a system, the less utility that system's energy has. Like a battery. A digital camera that takes a double-A requires one that is fresh. A hand-held poker game will work with the double-As from the bottom of the kitchen drawer. Thus the fresh battery has a lower entropy, as it can power a variety of devices, while the drawer battery sometimes needs a beating (from me, a source outside the system) just to spin up a pair of kings.

    Not a state of order, per se, so much as a state of ordering. This is why I disagree with the stacked deck having the same entropy as the scattered deck; its entropy must be greater. A system "outside" of the deck of cards is required to stack the deck, transferring useful energy (like drinking a Coke) into a more defined, less broadly useful, form (like dealing the cards).

    Or, if not time, hydrogen. The general consensus on origin is a quark/gluon plasma that had to get away from itself (those words, of course, are not the general consensus) in order to condense into useful matter like hydrogen. Last I heard, everything in the known universe that is not hydrogen derived from three percent of the original supply. If I remember correctly, this kind of thing was expressed more elegantly in an earlier post - the tendency of the universe is toward increasing complexity.

    If the sum totality of the early universe can be succinctly and completely described as "a bunch of hydrogen," but now we cannot reach common consensus on the meaning of a word to differentiate one measurement from another, is that not "greater disorder"? Of course, the math works fine; but how much hydrogen was used by the universe in order to formulate this: ΔS = ∫ from T1 to T2 of (Cp/T) dT? For mathematicians familiar with the language, that equation represents the utility of energy; to me it says I need to brush up on things; and for the majority it is a meaningless glyph.
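    For readers to whom that integral is just a glyph: when the heat capacity Cp is roughly constant over the temperature range, it evaluates to ΔS = Cp ln(T2/T1). A minimal numeric sketch (the water value is illustrative, not from the article):

    ```python
    import math

    def delta_S_constant_cp(cp, T1, T2):
        """Entropy change ΔS = ∫ (Cp/T) dT = Cp * ln(T2/T1) for constant Cp."""
        return cp * math.log(T2 / T1)

    def delta_S_numeric(cp, T1, T2, steps=100_000):
        """The same integral evaluated numerically (midpoint rule) as a check."""
        dT = (T2 - T1) / steps
        return sum(cp / (T1 + (i + 0.5) * dT) * dT for i in range(steps))

    # Illustrative: 1 mol of liquid water (Cp ≈ 75.3 J/(mol*K)) heated 300 K → 350 K.
    print(delta_S_constant_cp(75.3, 300, 350))  # ≈ 11.61 J/K
    print(delta_S_numeric(75.3, 300, 350))      # same, to numerical precision
    ```

    Dividing each increment of heat by the temperature at which it arrives is the whole content of the formula; no appeal to "disorder" is needed.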

    All of which is only to say I disagree with the philosophy that "entropy is not disorder." For myself, I believe in "original sin" - that entropy is not the messed-up theory we got from observing time; rather, time is the messed-up theory we got from observing entropy. ;)

    On a more relevant note, are you familiar with quantum decoherence? What does it mean to speak of an uncollapsable wavefunction?

    In the example of the deck of cards: Isn’t it correct to say that the Gibbs entropy is the same but the Shannon entropy is greater in the shuffled deck?
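    To put a number on the Shannon side of that question: if all 52! orderings of a shuffled deck are equally likely, its Shannon entropy is log2(52!) bits, while a deck whose order is known carries zero. A minimal sketch:

    ```python
    import math

    # Shannon entropy of a uniform distribution over n outcomes is log2(n) bits.
    # Shuffled deck: 52! equally likely orderings. Known, ordered deck: exactly 1.
    log2_factorial_52 = sum(math.log2(k) for k in range(1, 53))

    entropy_shuffled = log2_factorial_52   # ≈ 225.6 bits
    entropy_ordered = math.log2(1)         # 0 bits: only one possible arrangement

    print(entropy_shuffled, entropy_ordered)
    ```

    The thermodynamic entropy, by contrast, depends on the molecular microstates of the cardboard, which are unaffected by the macroscopic arrangement of the cards, which is the point of the article.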

    In the example of a living person vs. a dried-up corpse: isn’t it true that a living person has lower entropy than the same [entire] person dried up - including the 60% of his mass that is water now dispersed into the environment?

    Effectively, entropy has very little to do with disorder. In fact, the usual examples in textbooks (e.g., a stack of cards in perfect order vs. a stack of cards in random order) all have the same thermodynamic entropy.

    I would add that the expression dS = dQ/T is only valid for closed systems. For open systems it is dS = dQ/T + (dS)_matter. You can see different definitions for the term (dS)_matter in Non-Redundant and Natural Variables Definition of Heat Valid for Open Systems. I am working on an improved version (currently twice that size) with a better definition of heat for open systems.

    In general, the thermodynamic entropy S = -k_B Tr{rho ln rho} can be thought of as a ratio of energy per temperature. For a closed simple system at equilibrium, S = U/T.
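    When the density matrix rho is diagonal (the classical case), the trace formula above reduces to S = -k_B Σ p_i ln p_i over the microstate probabilities. A minimal sketch of that reduction:

    ```python
    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(probabilities):
        """S = -k_B * Σ p ln p: the diagonal case of S = -k_B Tr{rho ln rho}."""
        # The p = 0 terms are skipped, since p*ln(p) -> 0 as p -> 0.
        return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

    # Two microstates, equally likely: S = k_B ln 2 (≈ 9.57e-24 J/K).
    print(gibbs_entropy([0.5, 0.5]))

    # One certain microstate: S = 0.
    print(gibbs_entropy([1.0]))
    ```

    Spreading the same probability over more microstates raises S, which is the statistical counterpart of "more ways to move".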

    Dear Steve,

    Sure, entropy isn't disorder... But what is it, then?

    Well, there is indeed a nice answer to the question, please check this link:

    Respectfully yours,

    Evgeni B. Starikov