    Guest Post: Jacques Distler, Why I Lost $750 On New Physics At The LHC
By Tommaso Dorigo | June 27th 2013 05:01 PM | 10 comments
    Jacques Distler is a Professor of Physics at the University of Texas at Austin, and a distinguished theorist, as well as a physics blogger. Along with experimentalist Gordon Watts (who covered $250) he took my $1000 bet that the LHC would not discover new physics in its first 10/fb of proton-proton collision data. I discussed my take on the bet in a previous post; here Jacques explains his point of view, why he took the bet, and what he thinks of the present situation with new physics searches at the high-energy frontier.
The article below appeared today on Distler's blog, and I reproduce it here with his permission.


                                                                        * * *

    It’s been 20 years since I had the surreal experience of turning on C-Span late at night to see my future boss, Steve Weinberg, testify before Congress on behalf of the SSC.

    Steve, alas, was unsuccessful; the SSC was cancelled, and the High Energy Physics community threw our collective eggs in the basket of the LHC. The SSC, at sqrt(s)=40TeV, was designed as a discovery machine for TeV-scale physics. The LHC, with a design energy of sqrt(s)=14TeV, is the best one could do, using the existing LEP tunnel. It was guaranteed to discover the Higgs. But for new physics, one would have to be somewhat lucky.

14 TeV sounds like more than enough energy to hunt for new particles with masses of a few TeV. But that appearance is deceptive. The protons circulating in a hadron collider are like sacks of marbles, and each marble (“parton”, if you want to sound sophisticated) carries only a fraction of the total kinetic energy of the proton. At the energies we are talking about, the collisions are actually parton-parton collisions. So it’s the energy of the pair of partons undergoing the actual collision that matters. And that energy is typically far less than the nominal sqrt(s). In fact, things are slightly worse than the metaphor implies. Each sack contains a variable number of marbles, and the mean number of marbles (sharing, between them, the total kinetic energy of the proton) increases with increasing sqrt(s).

The upshot is that, at a hadron collider, the “interesting” collisions — the ones where, by chance, the colliding partons happen to carry a large-enough fraction of the proton’s total energy — are few and far between. To some extent, you can compensate for their rarity by increasing the total number of collisions (running the machine at higher luminosity). That introduces its own difficulties, but it’s the tradeoff that the designers of the LHC needed to make.
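To make the marble metaphor concrete, a toy Monte Carlo helps. The sketch below invents a steeply falling shape, (1-x)^5/x, as a stand-in for real parton distribution functions (it is not one), and simply counts how often the parton-parton energy sqrt(x1*x2*s) clears a 2 TeV threshold:

```python
import math
import random

def sample_x(rng, xmin=1e-3):
    """Sample a parton momentum fraction x from a toy ~(1-x)^5/x shape
    (a stand-in for a real PDF, chosen only for its steep fall-off)."""
    while True:
        x = xmin ** rng.random()           # draws x with density ~ 1/x
        if rng.random() < (1.0 - x) ** 5:  # accept with weight (1-x)^5
            return x

def fraction_above(sqrt_s_tev, threshold_tev, n=200_000, seed=1):
    """Fraction of collisions whose parton-parton energy sqrt(x1*x2*s)
    exceeds the given threshold."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        shat = sample_x(rng) * sample_x(rng) * sqrt_s_tev ** 2
        if math.sqrt(shat) > threshold_tev:
            hits += 1
    return hits / n

for sqrt_s in (8.0, 14.0):
    f = fraction_above(sqrt_s, threshold_tev=2.0)
    print(f"sqrt(s) = {sqrt_s:4.1f} TeV: fraction reaching 2 TeV = {f:.2e}")
```

The absolute numbers are meaningless, but the qualitative lesson is not: the “interesting” fraction is tiny, and it grows steeply with sqrt(s), which is why the machine's energy matters far more than a naive reading of “14 TeV” suggests.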

Still, there are (or were) lots of scenarios with new physics accessible to the LHC. And theorists, being perennial optimists, put a lot of effort into exploring those scenarios. Moreover, I think we’d have to go back to ISABELLE to find an example of an accelerator which opened up a new range of energies and didn’t find anything new. So, back in 2006, when Tommaso Dorigo proposed a bet, I was willing to take the position that the LHC would discover new physics.

    I didn’t, however, like Tommaso’s original terms (a new particle discovery, announced before the end of 2010).

Experience with previous machines, like the Tevatron, is that startup dates tend to slip, and that it can often take years to ramp up to the full design luminosity. As it turns out, the LHC had barely begun to collect data by then, and the very first trickle of physics results started coming out in October of 2010. So I had wisely insisted that, rather than fixing a date, we agree on a fixed amount of data collected (10 fb^−1), plus a suitable period (12 months) for the analyses to be done.

    Moreover (for reasons that I will recall, below), I thought the “new particle” criterion too narrow, and substituted “a 5σ discrepancy with the Standard Model.”

    Those terms seemed pretty solid to me, and I agreed to put $750 behind them.

One thing which I didn’t count on was the 2008 quench incident, which led to the aforementioned delay in starting up the LHC and (more important for the bet at hand) to its operation at about half of the design energy (sqrt(s)=7–8 TeV) up through 2013.

Historically, the ramp-up in energy tends to be much easier and (since it drastically improves the “reach” for new physics) tends to be accomplished much more quickly than the ramp-up in luminosity. So I fully expected that most of that first 10 fb^−1 would be collected at sqrt(s)=14TeV. Alas, none of it was (and foolish me, for not insisting on a provision about the sqrt(s) of the data).

    What about the “new particle” criterion?

    There are lots of scenarios where you would see a stark deviation from SM expectations at the LHC, but still be unable to ascribe that deviation to a new particle of a particular mass, etc. For example, much excitement was generated by the initial measurements of the H→γγ branching ratio, which were higher than the SM prediction by 2–3σ. With more data, that discrepancy seems to have gone away, but imagine if it had persisted. We would now find ourselves with a 5σ deviation from the SM — clear indication of the existence of new heavy charged particle(s) which couple strongly to the Higgs. But, since they only contribute to H→γγ via a loop, we would have almost no handle on their mass or other quantum numbers.
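To see why “imagine if it had persisted” matters, recall that the statistical error on a measured rate shrinks like 1/sqrt(N) with the number of events. A quick sketch (the 30% excess and the event counts are invented for illustration, not the actual ATLAS/CMS numbers):

```python
import math

def naive_significance(excess_ratio, n_expected):
    """Naive Poisson significance of a rate excess: (obs - exp) / sqrt(exp)."""
    return (excess_ratio - 1.0) * n_expected / math.sqrt(n_expected)

for n in (50, 100, 200, 400):
    print(f"N = {n:3d} expected events -> {naive_significance(1.3, n):.1f} sigma")
```

A genuine 30% excess marches from ~2σ past 5σ as the dataset grows eightfold; a statistical fluctuation instead regresses toward zero, which is what appears to have happened here.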

Well, it’s been a little over a year since we reached the 10 fb^−1 mark. The Lepton-Photon Conference seemed like a natural end-point for the wager. If there had been a discovery to announce, that would have been the natural venue.

    Needless to say, there were no big announcements at the Lepton-Photon Conference. And, since the LHC is shut down for an upgrade until 2015, there won’t be any forthcoming. So Tommaso is $750 richer.

    Would the outcome (aside from being delayed for another ~3 years) have been any different had I been smart enough to add a stipulation about sqrt(s)? Put differently, would I be willing to bet on the 2015 LHC run uncovering new BSM physics?

    The answer, I think, is: not unless you were willing to give me some substantial odds (at least 5–1; if I think about it, maybe even higher).

Knowing the mass of the Higgs (∼125 GeV) rules out huge swaths of BSM ideas. Seeing absolutely nothing in the 7 and 8 TeV data (not even the sort of 2–3σ deviations that, while not sufficient to claim a “discovery,” might at least serve as tantalizing hints of things to come) disfavours even more.

    The probability (in my Bayesian estimation) that the LHC will discover BSM physics has gone from fairly likely (as witnessed by my previous willingness to take even-odds) to rather unlikely. N.B.: that’s not quite the same thing as saying that there’s no BSM physics at these energies; rather that, if it’s there, the LHC won’t be able to see it (at least, not without accumulating many years worth of data).
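For readers who want the Bayesian arithmetic spelled out, here is the update in miniature. The prior and likelihoods below are invented for illustration; only the structure of the calculation is the point:

```python
# Toy Bayes update with invented numbers.
p_bsm = 0.5               # prior: even odds, as in the original bet
p_null_given_bsm = 0.15   # most accessible BSM scenarios should have left hints
p_null_given_sm = 1.0     # a null result is exactly what the SM predicts

posterior = (p_null_given_bsm * p_bsm) / (
    p_null_given_bsm * p_bsm + p_null_given_sm * (1 - p_bsm)
)
print(f"P(LHC-accessible BSM | null 7-8 TeV results) = {posterior:.2f}")  # ~0.13
```

With these made-up inputs, even odds collapse to roughly 7–1 against, the same ballpark as the 5–1 odds demanded above.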

    Ironically, a better bet for discovering new physics in this energy range might be on an ILC, running as a precision Higgs factory. I’ll leave it to you to calculate the odds that such a machine gets built.

Rereading the comments on Tommaso’s post (and other things he’s written), you might well think this discussion is a proxy for a narrower one, about the status of supersymmetry. The 7- and 8-TeV runs at the LHC have, indeed, been very unkind to the MSSM. But they have been even more unkind to other BSM ideas. So:


    • While the probability that the LHC will see any BSM physics (supersymmetric or not) has plunged dramatically,
    • the conditional probability that if the LHC were to see BSM physics, then said new physics would turn out to be supersymmetry, has gone up.


    That may be of little immediate consolation (and not an obviously-exploitable vehicle for making back some of the money I lost), but it is motivation for my experimental colleagues to spend the next couple of years thinking about how to optimize their searches to tease the maximum amount of information out of the post-upgrade LHC data.
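There is no tension between those two bullets, because they concern different probabilities: the joint probability of seeing anything can plunge while the conditional probability that the something is SUSY rises. Schematically, with invented numbers:

```python
# Invented numbers: the 7-8 TeV nulls hit non-SUSY scenarios even harder
# than they hit the MSSM.
scenarios = {"before": (0.25, 0.25),   # (P(see SUSY), P(see other BSM))
             "after":  (0.05, 0.02)}

for label, (p_susy, p_other) in scenarios.items():
    p_any = p_susy + p_other           # P(LHC sees any BSM physics)
    print(f"{label:6s}: P(any BSM) = {p_any:.2f}, "
          f"P(SUSY | something seen) = {p_susy / p_any:.2f}")
```

P(any BSM) falls from 0.50 to 0.07, while P(SUSY | something seen) rises from 0.50 to about 0.71 — exactly the pattern described in the two bullets.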


    Comments

“Each sack contains a variable number of marbles, and the mean number of marbles (sharing, between them, the total kinetic energy of the proton) increases with increasing sqrt(s).”

Doesn't that mean that an observer in an accelerated reference frame travelling with the proton and an observer at rest will disagree on the number of partons? That surprises me a little; I would have naively guessed it to be invariant.

Analogous to the Unruh effect, perhaps? Since they cancel out in the end, there shouldn't be any contradictions. I wonder what happens with Hawking radiation observed to be stronger by one observer than another. Presumably some effect causes the black hole to evaporate at the same time for both anyway.

The Unruh effect seems to apply to accelerating frames of reference, whereas the comment reads to me like this is a property of different unaccelerated frames of reference, and that just the velocity matters -- although clearly a proton going around a ring in the LHC is accelerated as well.

The partons are in principle there; what actually happens is that the particle interacting with the proton can probe deeper the more momentum it has. If you think of a photon interacting with a proton at low momentum, it won't even be able to see inside the proton. Crank up the photon's momentum and it can probe a little further, enough to see the three quarks. More still, and it can resolve the virtual "sea" quarks created by gluons splitting to quark/anti-quark pairs. If you don't mind a little hand waving, the photon can resolve things as small as size = hbar/momentum (from the Heisenberg uncertainty principle), so the higher the momentum, the more the probe will see.
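Putting numbers on that hand-waving estimate is easy with hbar*c ≈ 197 MeV·fm (the probe energies below are arbitrary illustrative choices):

```python
# Resolvable size ~ hbar*c / E for a probe of energy E
# (the hand-waving uncertainty-principle estimate described above).
HBAR_C_MEV_FM = 197.327  # MeV * fm

for e_mev in (1.0, 100.0, 1000.0, 10000.0):
    print(f"E = {e_mev:8.1f} MeV -> resolves ~ {HBAR_C_MEV_FM / e_mev:7.3f} fm")
```

Since the proton's charge radius is about 0.84 fm, an MeV-scale photon cannot resolve its interior at all, while a multi-GeV probe sees structure far below the quark spacing, just as described.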

Your opinion, this is your opinion.

Supersymmetry is an excellent idea for making the running couplings equal (the current definition of force unification) as the Planck scale energy is approached. Likewise, it's clear that "particles" must be spatially extended somehow, like a Planck-length sized extradimensional "string", to explain the UV cutoff at that energy, which is required to avoid vacuum-polarized pairs of virtual fermions gaining infinite momentum; that is the problem for singularities like the classical idea of a point particle with no UV cutoff.

What's problematic is that for supersymmetry we are first doubling the number of particles in the universe, then breaking the "symmetry" by making all the unobserved "supersymmetric" particles (sparticles) heavier than the observed particles by some speculative mechanism which can't even predict precisely what the sparticle masses will be. Then we call this speculative symmetry-breaking mechanism, sold with a speculative, unobserved symmetry, "supersymmetry", which sounds like an enthusiastic sales pitch for a frankly unproved idea. The result for the MSSM is an increase from the SM's 19 parameters to 125 (mainly unknown) parameters. This is a high price to pay (analogous maybe to epicycles) for the "beauty" of getting three lines to converge on a plot of the fundamental force running couplings, which doesn't include gravity. While supersymmetry imposes 6 extra spatial dimensions, supergravity imposes 7, so then we need M-theory, with 10-d supersymmetry as a brane on 11-d supergravity.

It's a neat idea to add compactified extra dimensions to a metric to get extra fields, extending the basic Kaluza-Klein idea for unifying Maxwell's equations with general relativity. What worries crazy people like yours truly is whether the foundations are mathematically "correct". Even Jacques's boss Steven Weinberg expressed doubts about general relativity (during the "bad period" for string theory circa 1973, when hadronic strings were dead, having been beaten up by Weinberg's SM QFT):

    “At one time it was even hoped that the rest of physics could be brought into a geometric formulation, but this hope has met with disappointment, and the geometric interpretation of the theory of gravitation has dwindled to a mere analogy, which lingers in our language in terms like ‘metric’, ‘affine connection’, and ‘curvature’, but is not otherwise very useful. The important thing is to be able to make predictions about the images on the astronomer’s photographic plates, frequencies of spectral lines, and so on, and it simply doesn’t matter whether we ascribe these predictions to the physical effect of a gravitational field on the motion of planets and photons or to a curvature of space and time.”

- S. Weinberg, Gravitation and Cosmology, Wiley, 1972, p. 147.

Weinberg was expressing doubts about the deep mathematical validity of general relativity. The stress-energy tensor T_ab is a classical differential representation which doesn't deal with particle singularities, so continuous (not discrete) sources for mass-energy, pressure, etc. must be used so that a smooth spacetime curvature in the Ricci tensor R_ab results. There's no doubt that the energy conservation in the Einstein field equation is correct, and it is well tested on large scales (the classical tests of general relativity), but it does not model dark energy predictively (the cosmological constant is put in ad hoc, to fit observations), and curved spacetime is just one mathematical description. I.e., Maxwell's equations describe fields as spatial field lines which can curl or diverge in space, but are not spacetime curvature. So it could be argued that some of the difficulty in unifying SM forces with gravity lies in the two different descriptions of force fields: curved spacetime vs. field lines, i.e. Maxwell's equations extended to Yang-Mills field equations by adding the Lie product.

The point is, should physicists be trying to go beyond the SM at the present time? It's convenient to set up loads of scaffolding and start building on the foundations if we want to cover up any cracks in those foundations. Eventually everyone left in the subject builds on the same foundations, and if there is a fundamental problem, there's no simple way to correct it cheaply. Apart from Weinberg's comments on spacetime geometry, there are very critical statements by Feynman on the axiomatic basis of QM in his 1985 book QED (which are not included in his earlier 1965 book with Albert Hibbs, which I've also read). According to Peat's biography, Feynman changed views in the 70s after long discussions with Bohm.

Bohm in 1952 (Phys. Rev. 85, 166) put Dirac's amplitude Psi = R exp(iS) into Schrödinger's equation, obtaining his "hidden variable" pilot-wave potential, which didn't go anywhere. Feynman pointed out that there isn't a single Dirac wavefunction; there's actually one for every potential path, so Bohm's idea is plain wrong, but so is the Schrödinger equation, which falsely assumes an electron has a single wavefunction! Feynman claims in QED (1985) that we don't need to axiomatically assume an uncertainty principle, because it's just a 1st-quantization artifact and is replaceable entirely by multipath interference if you dump Schrödinger's single-wavefunction amplitude equation and instead use path integrals: i.e., a separate Psi wavefunction amplitude for each virtual particle in the vacuum to deflect an orbital electron.

As Jacques kindly pointed out to me on his blog, because most particles (apart from a spin-0 Higgs boson) have polarizations, you can't simply replace the wavefunction amplitude exp(iS) with its real component cos(S) and thereby eliminate complex Hilbert space (which has a problem with Haag's theorem for proving renormalization self-consistent). Still, looking at the diagrams in Feynman's book QED, it's clear that this can be done for simple scalar wavefunction amplitude calculations where spin polarization is unimportant, like the double-slit experiment or a single electron in orbit. Spin polarization IS important, of course, when two particles with similar or opposite spins collide, because that affects the S-matrix and hence the cross-section. However, Feynman's idea is appealing, because you could potentially get rid of all the mathematical problems by using simple physical modelling of particle exchange processes as a duality to the integration of Lagrangians based on field equations.
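That cos(S) claim can be checked numerically in the simplest scalar case. The toy double slit below (geometry invented purely for illustration) sums two path amplitudes, once as complex exp(iS) and once keeping only the real parts cos(S), averaged over an overall phase; the two interference patterns agree up to a factor of 2:

```python
import cmath
import math

k, d, L = 50.0, 1.0, 20.0   # wavenumber, slit separation, screen distance

def path_lengths(x):
    """Lengths of the two paths (via upper/lower slit) to screen point x."""
    return (math.hypot(L, x - d / 2), math.hypot(L, x + d / 2))

n_phi = 200  # number of global-phase samples for the real-part average
for x in [i * 0.5 for i in range(-6, 7)]:
    s1, s2 = (k * length for length in path_lengths(x))

    # Standard complex sum over the two paths.
    i_complex = abs(cmath.exp(1j * s1) + cmath.exp(1j * s2)) ** 2

    # Real-part version: sum of cos(S), squared, averaged over a global phase.
    i_real = sum((math.cos(s1 + p) + math.cos(s2 + p)) ** 2
                 for p in (2 * math.pi * j / n_phi for j in range(n_phi))) / n_phi

    print(f"x = {x:5.2f}: |sum exp(iS)|^2 = {i_complex:4.2f}, "
          f"2*<(sum cos S)^2> = {2 * i_real:4.2f}")
```

Both columns print 2 + 2cos(S1 - S2), confirming that for a scalar two-path problem the real parts alone reproduce the interference pattern; with polarizations in play, as noted above, this shortcut fails.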

    John Duffield
Re: "The point is, should physicists be trying to go beyond the SM at the present time? It's convenient to set up loads of scaffolding and start building on the foundations if we want to cover up any cracks in those foundations." No. They should be addressing the foundations, and looking at things like Geometry of Electromagnetic Systems by Baldomir and Hammond. In a nutshell: electromagnetism is curved space, virtual photons are virtual, the electron is a harmonic standing wave, mass is the flip side of momentum, the Higgs field is a "relativistic aether", curved spacetime is inhomogeneous space, that kind of thing. IMHO getting all this right is "within the standard model" work that will demonstrate why much "beyond the standard model" work has been castles in the air.
Supersymmetry remains an attractive idea regardless of LHC data, because it contains a lot of interesting ideas and interesting mathematics, and it's still the only way to make the SM running couplings "unify" to similar values as the Planck scale energy is approached. Likewise for supergravity, if the gravity coupling is to equal that of the other interactions at the Planck energy.

But how credible is the alternative, that the couplings simply don't become equal at the Planck scale? A purely "SM + gravity" final theory, using some new way to relate the Higgs mechanism to a gauge theory of gravity, possibly modifying the electromagnetic long-range force law to include the similarly long-range gravity force and dark energy, would have half the particles and just 15% of the parameters of the MSSM.

The credibility of such ugly ideas may increase steadily if people manage to find ways to develop them. The asymmetry in Maxwell's equations due to the lack of magnetic monopoles observed in nature (div B = 0, in contrast to Gauss's law for div E) does set a precedent for mathematical ugliness in real-world physics. Such ugliness may also exist in the sense of the running couplings not converging at the Planck scale in the SM, no?

I'm betting that the real progress lies in relating the asymmetry in Maxwell's equations (i.e. no magnetic monopoles) to the left-handedness of the weak interaction. The handedness with which magnetic fields curl around the direction of propagation of a moving charge may be analogous to the left-handedness of weak interactions, providing a new way to make some progress in better understanding the mixing angles (the Weinberg angle and CKM angles) of the electroweak theory and the way that gravitational "charge" (mass/energy) is coupled to the Higgs field to explain particle masses.

    John Duffield
IMHO when you get a handle on the foundations, some of those ugly ideas start to look even uglier. For example, the electron has an electromagnetic field. It doesn't actually have an electric field, so it isn't actually an "electric monopole". E and B denote the linear and rotational forces that result in electromagnetic field interactions; they aren't fields in their own right. The field is F_uv, and it has a "screw" nature, rather like the frame-dragging in gravitomagnetism, but chiral. A magnetic monopole would be a region of space rotating freely like a roller bearing, and space just isn't like that. I'm not sure there's any link between magnetic field handedness and the left-handedness of the weak interaction, but I do think the real progress will come out of electromagnetism. And wave harmonics, wherein the Higgs field is responsible for photon momentum as well as electron mass.
     
I'd better back that up, I suppose. Have a read of the watt balance section of the Wikipedia kilogram article, wherein the plan is to define the kilogram using h and c and not much else. You know about the Coulomb constant and how the Planck length is l=√(ћG/c³). Replace √(ћG) with 4πn where n is a suitable value. You've still got the Planck length. Now set n to 1, and work out 4πn/√(c³). There’s a bit of a binding-energy adjustment to make, related to the g-factor, but not much. Skip from wavelength to frequency to energy to mass by the usual route. Now work out √(c)/3πm where m is a dimensionality conversion factor with a value of 1 that leaves you with a straight ratio. A mass ratio. Get your calculator out.
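For the curious, the recipe transcribes into a few lines. This is a literal transcription of the comment's arithmetic, treating every quantity as a bare SI number, as the comment itself does (it is not a dimensionally consistent derivation):

```python
import math

# Literal transcription of the comment's recipe, all quantities treated as
# bare SI numbers. Reference values are included only for comparison.
c = 2.998e8          # speed of light, m/s
h = 6.626e-34        # Planck constant, J*s
m_e = 9.109e-31      # electron mass, kg
m_p = 1.6726e-27     # proton mass, kg

lam = 4 * math.pi / math.sqrt(c ** 3)   # "4*pi*n / sqrt(c^3)" with n = 1
mass = h / (lam * c)                    # wavelength -> mass via m = h/(lam*c)
print(f"4*pi/sqrt(c^3) = {lam:.4e}   (electron Compton wavelength ~ 2.4263e-12 m)")
print(f"implied mass   = {mass:.4e} kg (electron mass = {m_e:.4e} kg)")

ratio = math.sqrt(c) / (3 * math.pi)    # "sqrt(c)/(3*pi*m)" with m = 1
print(f"sqrt(c)/(3*pi) = {ratio:.1f}     (proton/electron mass ratio ~ {m_p / m_e:.1f})")
```

Both numbers land within a few tenths of a percent of the electron's Compton wavelength/mass and of the proton-to-electron mass ratio, which is evidently the coincidence being pointed at.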
Thanks John: I'm 100% behind your statement that the electron's field is electromagnetic, and I think I was the first to demonstrate in a published paper (Electronics World, vol. 108, August 2002, pp. 46-49) how electromagnetic energy trapped in an oscillating small packet, like a string theory particle core, gives an apparent electric monopole and a magnetic dipole, because of the way that the magnetic field partly self-cancels: precisely the magnetic dipole observed for an electron, and predicted by Dirac's spinor. Vacuum polarization makes the electric field run in the observed way around this stringy "core".

I'm glad you agree that progress will come from electromagnetism. The Abelian U(1) Maxwell equations seem only an approximation to an underlying deeper SU(2) symmetry of electromagnetism, because you have the SU(2) Pauli spin matrices, and Maxwell himself in 1861 suggested a mechanism for magnetic fields which is chiral left-handed helicity or spin of field quanta (his "idler wheels"), which shares the same Yang-Mills symmetry group as the weak interaction. This is covered up in nature by the fact that charged photons can't propagate along a one-way path due to infinite self-inductance (a magnetic field effect). Convention assumes that electromagnetism "must be" an Abelian (Maxwell-type) interaction, where the 2 extra polarizations of the 4-polarization field quanta needed to cause attraction of unlike poles/charges and repulsion of like poles/charges are not electric field. But charged field quanta will have a magnetic field curling around the direction they go, which:

(a) correctly predicts why an SU(2) Yang-Mills electromagnetic theory reduces to look like U(1) Abelian electrodynamics mathematically. The charge exchange-equilibrium needed to overcome the self-inductance barrier (e.g. electron A sends the same flux of negative charge to electron B per second as electron B sends back to electron A) effectively cancels out the Lie product (the net charge transfer term) of the Yang-Mills equations, reducing the Yang-Mills equations for electromagnetism back to simply the Maxwell field tensor! You can't have any asymmetry in the exchange of charged field quanta, because unless equal charge flows in each direction, the magnetic field curls won't cancel. This sounds complex, but it is simply the mechanism by which each conductor in a transmission line carries equal currents in opposite directions, so that the magnetic field curls oppose and infinite self-inductance is eliminated, in order to have a propagating logic signal. (The magnetic field loop from the current in one conductor must cancel that of the current in the other conductor, so you can't simply discharge a battery by connecting a wire to just one terminal!)

(b) The Yang-Mills SU(2) electromagnetic theory has 3 massless photons: the real (uncharged) one, one positive and one negative. This explains attraction and repulsion simply. E.g., two electrons exchange negatively charged field quanta under SU(2) Yang-Mills electrodynamics, causing them to be pushed apart by recoil and impact! This is permitted since the magnetic fields of similar charges moving in opposite directions oppose each other, cancelling infinite self-inductance (see my 2002 paper, or my more recent Nov 2011 viXra paper). But if you take an electron and a proton, the charges are opposite, so they cannot exchange field quanta with opposite electric charges, since the magnetic field curls loop in the same direction and don't cancel; i.e., infinite self-inductance bans the exchange of charged SU(2) field quanta between opposite charges, so opposite charges are simply pushed together by vacuum field quanta (e.g. the Casimir force mechanism). Thus this SU(2) electrodynamics explains physically why similar charges repel and unlike charges attract, in addition to simplifying the electroweak theory. Nobody has ever seen the core of any electron or other particle, because you'd need >10^16 GeV, so the fact that electrons have a negative electric field around them doesn't prove their core is a rotating electric monopole, which is an outdated classical model anyhow.