I am about to go to an informal workshop on naturalism and its implications, organized by cosmologist Sean Carroll. The list of participants is impressive, including Pat Churchland, Jerry Coyne, Richard Dawkins, Dan Dennett, Rebecca Goldstein, Alex Rosenberg, Don Ross and Steven Weinberg. You may have recognized at least four names of people with whom I often disagree, as well as two former guests of the Rationally Speaking podcast (not to mention Don Ross’ colleague, James Ladyman).

The list of topics to be covered during the discussions is also not for the faint of heart: free will, morality, meaning, purpose, epistemology, emergence, consciousness, evolution and determinism. Unholy crap! So I decided — in partial preparation for the workshop — to start a series of essays on emergence, a much misunderstood concept that will likely be at the center of debate during the gathering, particularly as it relates to its metaphysical quasi-opposite, determinism (with both having obvious implications for most of the other topics, including free will, morality, and consciousness).

It’s a huge topic, and the way I’m going to approach it is to present a series of commentaries on four interesting papers on emergence that have appeared over the course of the last several years in the philosophical (mostly philosophy of physics) literature. Keep in mind that — although I make no mystery of my sympathy for the idea of emergence as well as of my troubles with reductive physicalism — this is just as much a journey of intellectual discovery for me as it will be for most readers. (A good overview can be found, as usual, in the corresponding entry in the Stanford Encyclopedia of Philosophy.)

That said, let us begin with “Emergence, Singularities, and Symmetry Breaking,” by Robert W. Batterman of the University of Western Ontario and the University of Pittsburgh. The paper was published in Foundations of Physics (Vol. 41, No. 6, pp. 1031-1050, 2011), but you can find a free downloadable copy of an earlier version here.

Batterman begins by putting things into context and providing a definition of emergent properties: “The idea being that a phenomenon is emergent if its behavior is not reducible to some sort of sum of the behaviors of its parts, if its behavior is not predictable given full knowledge of the behaviors of its parts, and if it is somehow new — most typically this is taken to mean that emergent phenomenon displays causal powers not displayed by any of its parts.” If you think that this amounts to invoking magic, you are not seriously engaging in this discussion, and you may as well save your time and quit reading now.

Batterman sets up his paper on an interesting premise: instead of looking at how philosophers of various stripes have conceptualized emergence and then examining possible cases from the sciences, he goes about it precisely the other way around: “I think it is better to turn the process on its head. We should look to physics and to ‘emergent relations’ between physical theories to get a better idea about what the nature of emergence really is.” This, the attentive reader might have noticed, is just about the same idea of practicing naturalistic (i.e., science-informed) metaphysics proposed by Ladyman and Ross in their Every Thing Must Go, already discussed on this blog and its accompanying podcast.

The second interesting twist in the paper is that Batterman avoids the usual treatment of emergence in terms of mereology, i.e., of parts vs whole. Instead, again bringing Ladyman and Ross to mind, he suggests that the most promising approach to understanding emergence is to look at the mathematical features that play an explanatory role [1] when theories are compared at different energy scales. From then on things become pretty complicated, but I’ll do my best, using extensive quotations from the paper itself.

Batterman takes on an alleged (as it turns out) case of reduction of a phenomenological to a more “fundamental” theory: the relationship between classical thermodynamics (phenomenological) and statistical mechanics (fundamental). The fact is, “the quantities and properties of state in orthodox thermodynamic equations appear largely to be independent of any specific claims about the ultimate constitution of the systems described,” which would seem to cast some doubts on the simple version of the reduction story. As Batterman puts it, “Reduction in this context typically is taken to mean that the laws of thermodynamics (the reduced theory) are derivable from and hence explained by the laws of statistical mechanics (the reducing theory) ... [but] there are very good reasons to deny that all thermodynamic (and hydrodynamic) phenomena are reducible to ‘fundamental’ theory,” and these reasons have to do with phase transitions (solid and liquid, liquid and gas, etc.).
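For readers who want the standard reduction story spelled out (this is the textbook statistical-mechanical recipe, not anything specific to Batterman's paper): one starts from the microscopic energies E_i of the system's possible configurations, forms the partition function, and reads thermodynamic quantities off its logarithm:

\[ Z = \sum_i e^{-E_i / k_B T}, \qquad F = -k_B T \ln Z, \qquad S = -\frac{\partial F}{\partial T}, \qquad P = -\frac{\partial F}{\partial V}. \]

On the straightforward reductionist picture, every thermodynamic claim is supposed to fall out of relations of this kind; Batterman's point is that phase transitions resist being captured this way.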

The crucial claim is that phase transitions are qualitative changes that cannot be reduced to fit the more fundamental explanatory principles of statistical mechanics. Phase transitions, therefore, count as genuine emergent phenomena.

The question, of course, is what — then — might explain the qualitative / emergent phenomena. Batterman does not go for the popular and easy, but somewhat vacuous, solution of invoking “higher organizing principles.” Instead, he looks toward mathematical singularities in order to get the work done. Let me explain, as far as I understand this.

Batterman directs his reader to the role played by renormalization group theory within the context of condensed matter theory. This is because renormalization theory “provides an explanation for the remarkable similarity in behavior of ‘fluids’ of different molecular constitution when at their respective critical points.” It turns out, experimentally, that there is a universal pattern that describes the behavior of substances of very different micro-constitution, an observation that — suggests Batterman — would make it puzzling if the right explanation were to be found at the lower level of analysis, in terms of the specific micro-constitution of said fluids.
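To give a concrete sense of the universality being described (the figure below is a standard textbook value, not a number taken from Batterman's paper): near the critical temperature T_c the difference between the liquid and gas densities vanishes as a power law,

\[ \rho_{\text{liquid}} - \rho_{\text{gas}} \;\sim\; |T - T_c|^{\beta}, \qquad \beta \approx 0.33, \]

and the same exponent is measured for chemically very different fluids, and even for the magnetization of uniaxial ferromagnets near their Curie point. It is precisely this indifference to molecular detail that seems to cry out for an explanation that does not bottom out in molecular detail.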

Batterman examines a typical temperature-pressure phase diagram for a generic fluid, and concludes that “Thermodynamically, the qualitative distinction between different states of matter is represented by a singularity in a function (the free energy) characterizing the system’s state. Thus, mathematical singularities in the thermodynamic equations represent qualitative differences in the physical states of the fluid in the container.” The important part here is that the explanatory work is done by mathematical singularities, constructs that scientists are often wary of, indeed — according to Batterman — downright prejudiced against. But prejudice, it turns out, is a poor reason not to bite the bullet and embrace singularities.
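To see what “singularity” amounts to here (a standard illustration, not one lifted from the paper): across the liquid-gas coexistence line the Gibbs free energy G(T, P) remains continuous, but its first derivatives,

\[ S = -\left(\frac{\partial G}{\partial T}\right)_P, \qquad V = \left(\frac{\partial G}{\partial P}\right)_T, \]

jump discontinuously, which is just the familiar latent heat and density change of boiling; at the critical point the jumps close up and second derivatives such as the compressibility diverge instead. Strictly speaking, these non-analyticities only arise in the thermodynamic limit of infinitely many particles, which is why that limit figures so prominently in Batterman's argument.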

Batterman continues: “The renormalization group explanation provides principled physical reasons (reasons grounded in the physics and mathematics of systems in the thermodynamic limit) for ignoring details about the microstructure of the constituents of the fluids. It is, in effect, an argument for why those details are irrelevant for the behavior of interest.” [Italics in the original] In terms of the necessary presence of singularities, Batterman acknowledges that a number of physicists and philosophers consider the appearance of singularities to be a failure of the physical model, but goes on to say: “On the contrary, I’m suggesting that an important lesson from the renormalization group successes is that we rethink the use of models in physics. If we include mathematical features as essential parts of physical modeling then we will see that blowups or singularities are often sources of information.” Singularities are our best friend, insofar as a mathematical understanding of physical processes is concerned.
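Here is a schematic, textbook sketch of how the renormalization group argument runs (my gloss, not Batterman's own presentation): coarse-graining the system by a scale factor b maps its effective couplings to new values, and near a critical point the two relevant parameters, the reduced temperature t and the external field h, transform as

\[ t' = b^{\,y_t}\, t, \qquad h' = b^{\,y_h}\, h. \]

The myriad parameters encoding a particular fluid's microscopic constitution correspond to “irrelevant” directions that shrink under repeated coarse-graining, so very different microstructures flow to the same fixed point; and since the critical exponents depend only on y_t and y_h (for instance, the correlation-length exponent is ν = 1/y_t), they come out the same for every member of the universality class.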

At the end of the day, of course, both reduction and emergence are necessary, with the latter playing a particular role in the broader picture: “While one may be able to tell detailed (microstructurally dependent) stories about why individual fluids/magnets behave the way they do at criticality [i.e., at the point of phase transitions], such stories simply cannot account for the key property of the emergent protectorates [properties] — namely, their insensitivity to microscopics.”

For Batterman it is downright puzzling that one should always seek explanations at lower levels, even when it is clear that higher level phenomena of a particular class behave uniformly independently of their micro-scale makeup. He asks: “why should that individual derivation [for a particular fluid] have any bearing on a completely different individual derivation for a different fluid with a potentially radically different microstructural constitution?” Why indeed.

At one point Batterman turns the tables on the reductionist, essentially accusing him — as I often find myself doing — of ignoring the data in the service of an unjustified a priori preference for a particular ontology: “One crucial and obvious feature of the world is that different things — particles, organisms, etc.— exist or appear at different scales. The world is not scale invariant with respect to space and time.” So why expect that “fundamental” scale-invariant theories should account for all of the world’s features?

Batterman also briefly considers another example, concerning what is arguably the most powerful theory currently accepted by the physics community: quantum field theory. Despite its spectacular successes, including the incredibly high degree of experimental confirmation, “the theory has been deemed by many to be foundationally suspect. Those who hold that a successful theory should yield predictions from-first-principles, as it were, independent of experimental/phenomenological input believe that it cannot be the final theory. ... more upsetting to many is the fact that quantum field theory when actually used for calculations and predictions typically engenders all kinds of divergences [i.e., mathematical singularities and infinities]. With these monsters ever present, it is claimed that there must be something wrong with the foundations of the theory.” Again, as in the case of phase transitions above, however, it is possible to actually see these “monsters” as the theoretician’s friends, given the amount of explanatory work they actually make possible.
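For readers wondering what these divergences look like in practice, a stock example (not drawn from the paper) is a one-loop integral in a simple scalar field theory, which has the schematic form

\[ \int^{\Lambda} \frac{d^4 k}{(2\pi)^4} \, \frac{1}{(k^2 + m^2)^2} \;\sim\; \ln \Lambda, \]

and so grows without bound as the momentum cutoff Λ is pushed to infinity. Renormalization tames such expressions, and on the reading endorsed here the need to handle them is not an embarrassment but a source of information about how the theory connects physics at different scales.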

So, the bottom line is that physicists should take singularities and infinities as important — and informative — features of their theories, not as “monsters” indicating underlying flaws in the theoretical architecture. This has happened before in the history of mathematics itself, when Georg Cantor had a hard time explaining to his colleagues that infinities aren’t mathematically suspect; rather, they are a crucial part of the story. Similarly, the kind of singularity that Batterman is talking about may turn out to be one of the loci of explanation for large classes of emergent properties, as well as a much more solid basis for studying emergence than generic appeals to somewhat mysterious higher organizing principles.

_____

[1] The very idea of mathematics explaining rather than simply quantifying / describing things may sound weird, though it appears to be accepted by many mathematicians and physicists, as well as by philosophers of both disciplines. See, of course, the RS entry on mathematical Platonism and links therein.

Originally appeared on Rationally Speaking October 11th.