    ATLAS Higgs Results: One Or Two Higgs ?
    By Tommaso Dorigo | December 14th 2012 03:54 AM | 21 comments

    UPDATE: for more on this, I only now realize that my friend at Resonaances wrote about it yesterday... It is nice to see that he agrees with my conclusions, anyway. Also, Peter has news on it, and as usual additional links...

    ---

    My ATLAS colleagues will have to pardon me for the slightly sensationalistic title of this article, but the question is indeed one that many people inside and outside CERN are asking themselves upon looking at the new public material on the ATLAS Higgs boson searches, which use the 2011 dataset in conjunction with the first part of the 2012 data.

    My two pence: relax. This is normal business - if we had to get excited at every slight disagreement between our measurements and our expectations, we'd be sick with priapism (sorry, ladies, for this gender-specific pun).

    The fact is, we love to speculate. We'd love it if we ended up discovering that what we announced last July as THE Higgs boson were in fact two, or more, distinct states. It would be like having discovered a vein of diamonds while digging in a gold mine. So observing that the Higgs peak in the diphoton final state of ATLAS sits over two GeV away from the peak of the H->ZZ signal is music to our ears. Yet we need to keep our feet on the ground.

    So first of all let me give you the basic facts.

    ATLAS has a very nice signal of H->γγ in a total of 17.8/fb of collision data. You can check it out in the figure on the right, which shows the combined spectrum of all their diphoton signal categories in the top panel, and the background-shape-subtracted data in the bottom panel, with a very clear bump at 126.5 GeV. You can make no mistake here: that is a new resonance for sure.

    They use the signal for a mass and cross-section measurement, and obtain the result shown in the figure below: the one- and two-sigma contours show that the mass is in the vicinity of 126.5 GeV. The cross section of these final states measured by ATLAS is slightly higher than the standard model calculation, sitting at μ=1.8 (1.0 is the standard model prediction). That the ATLAS signal is stronger than expected is no news, of course; the compatibility with the standard model rate is at the level of 5% (roughly a two-sigma effect), so I am not impressed by this.



    (If you need more detail on what the graph shows: the full black line shows the full measurement and the one-sigma contour around the best-fit values of mass and signal strength; the black dashes show the 95% CL contour. The blue and red companion ovals show what happens if one does not include in the fit the systematic uncertainties due to the energy scale and other sources.)

    BONUS TRACK: an added side note on the greedy bump bias

    I should note that there is an interesting effect to consider when fitting a small signal on top of a large background, if you do not know the true signal mass. Any fitter will try to maximize the signal, in order to make the best use of the signal shape parameters, by "searching" for fluctuations to the right or left of the main peak and adapting the mass parameter accordingly. By doing that, it potentially biases the rate measurement upward; a toy illustration is sketched below.
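    Here is a minimal pseudoexperiment sketch of the effect (my own toy, with made-up numbers, not the ATLAS or CDF code): a Gaussian bump on a flat background is fitted once with the peak position fixed to its true value and once left floating. The floating-mass fit returns, on average, a larger signal amplitude.

```python
# Toy illustration of the "greedy bump bias": when the signal mass is left
# free in the fit, the fitter latches onto upward fluctuations near the peak
# and the fitted signal yield comes out biased high.  Illustrative numbers.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(42)

edges = np.linspace(100.0, 150.0, 51)            # 1 GeV bins
centers = 0.5 * (edges[:-1] + edges[1:])
b_per_bin, s_true, m_true, width = 400.0, 30.0, 125.0, 2.0   # peak s/sqrt(b) ~ 1.5

def model(x, s, m):
    """Flat background plus a Gaussian signal of amplitude s at mass m."""
    return b_per_bin + s * np.exp(-0.5 * ((x - m) / width) ** 2)

ratio_fixed, ratio_float = [], []
for _ in range(500):
    data = rng.poisson(model(centers, s_true, m_true))
    # Mass fixed to its true value: an (almost) unbiased strength estimate.
    (s_fix,), _ = curve_fit(lambda x, s: model(x, s, m_true),
                            centers, data, p0=[s_true])
    # Mass floating, as in a blind bump hunt: the fit chases fluctuations.
    (s_flo, _), _ = curve_fit(model, centers, data, p0=[s_true, 122.0],
                              bounds=([-1e4, 115.0], [1e4, 135.0]))
    ratio_fixed.append(s_fix / s_true)
    ratio_float.append(s_flo / s_true)

print(f"mean fitted/true strength, mass fixed   : {np.mean(ratio_fixed):.2f}")
print(f"mean fitted/true strength, mass floating: {np.mean(ratio_float):.2f}")
```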

    I studied this topic a long time ago, in the context of an investigation of a weird bump at 7.2 GeV in the dimuon mass distribution that CDF used to have, and produced an internal CDF note on the subject. Later, three years ago, I wrote two detailed posts on this site about the matter. What I found is that, in general, the bias in signal strength is a universal function of the ratio between the signal and the square root of the background under it. This could be guessed from first principles, but what is interesting is that one can look back at the plots I produced with pseudoexperiments a dozen years ago and guesstimate the size of the potential rate bias in the H->gamma gamma fit.

    The matter is complicated by the fact that these H->γγ results are actually the combination of eight different channels, each with a different signal-to-noise value. Nevertheless I tried some eyeballing of the relevant distributions. It appears that the s/sqrt(b) values in the eight ATLAS categories are not too different from each other (as far as is relevant to our purpose here): in a five-bin window spanning the signal region I observe s/sqrt(b) values ranging from 0.7 to 1.5, give or take a cow or two [No, the last sentence cannot possibly make sense to you if you are not Italian].

    Anyway, for similar situations my toy studies indicated biases of the order of 10% in the signal strength extracted by the fits. I wonder whether ATLAS corrects for this bias or not - I know that the bias has been recognized (although they now call it by a different name from the one I originally dubbed it with), so they might well be doing so already.

    Back to business

    Now let us instead turn to the H->ZZ->four-lepton decay mode. Here the signal-to-noise ratio is much larger (indeed the four-lepton final state is dubbed the "golden" decay mode for Higgs searches), but the number of Higgs boson candidates you can find in any dataset is smaller: although the branching fraction of a 125-GeV Higgs boson to ZZ* pairs is one order of magnitude larger than the branching fraction to photon pairs, in the former case you have to multiply it by the branching fraction of each Z boson to electron or muon pairs, a mere 3.3% per lepton flavour! So the golden mode is really rare, and this is reflected in the paucity of signal events in the graph on the right; the arithmetic is sketched below. A clear signal is visible here too, however, evidenced by the blue histogram (the expected Higgs contribution for a Higgs mass of 125 GeV).
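    To see how small the effective branching fraction of the golden mode is, here is the back-of-the-envelope arithmetic, using rounded, approximate standard model values for a 125-GeV Higgs (illustrative numbers, not the ones used in the analysis):

```python
# Rough branching-fraction arithmetic for a 125 GeV SM Higgs (rounded,
# illustrative values only).
br_h_gg = 2.3e-3            # H -> gamma gamma
br_h_zz = 2.6e-2            # H -> ZZ*, about an order of magnitude larger
br_z_ll = 2 * 3.3e-2        # Z -> ee or mumu, roughly 3.3% per lepton flavour

br_h_4l = br_h_zz * br_z_ll ** 2    # both Z bosons must decay to e or mu pairs
print(f"BR(H -> ZZ* -> 4l)              ~ {br_h_4l:.1e}")
print(f"relative to BR(H -> gamma gamma): {br_h_4l / br_h_gg:.2f}")
```

    In other words, before any selection efficiency the four-lepton final state is roughly twenty times rarer than the diphoton one, despite the larger H->ZZ* branching fraction.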

    And now let us see (left) what the one- and two-sigma contours of the profile likelihood fit to the above signal look like, for mass M and signal strength μ (the latter, on the vertical axis, is as usual the signal rate in units of the expected standard model signal). Here the signal strength is more in agreement with the standard model prediction, but the best-fit mass (123.5 GeV) is significantly lower than the fitted 126.5 GeV of the gamma-gamma mode! What is going on?

    Indeed, what is going on is the question everybody is asking. However, consider this: the effect is seen only by ATLAS - the latest published CMS results show consistency between the mass measurements in the diphoton and di-Z final states, and CMS has quite similar sensitivity to Higgs bosons. CMS sees signals in the 125.5 GeV ballpark, which is right in the middle of the "two" ATLAS signals. Indeed, have a look at the two ATLAS mass measurements compared to each other in the figure below.


    And maybe most interesting is the figure below, which shows the two mass measurements plotted one against the other, with a diagonal line acting as the "null hypothesis" (that the two objects are indeed one and the same, such that the mass is the same and thus x=y): you can see that the two mass measurements are discrepant at just a bit over the two-standard-deviation level; a back-of-the-envelope version of that comparison is sketched below.
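    For the curious, the kind of naive compatibility estimate behind that statement goes as follows. The central values are the ones quoted above, while the uncertainties are illustrative placeholders of mine, not the official ATLAS errors:

```python
# Naive compatibility check between the two ATLAS mass measurements.
# Central values as quoted in the text; the uncertainties below are
# illustrative placeholders, NOT the official ATLAS numbers.
from math import erf, sqrt

m_gg, sigma_gg = 126.5, 0.9     # H -> gamma gamma (assumed total error)
m_4l, sigma_4l = 123.5, 1.0     # H -> ZZ* -> 4 leptons (assumed total error)

z = abs(m_gg - m_4l) / sqrt(sigma_gg ** 2 + sigma_4l ** 2)
p_two_sided = 1.0 - erf(z / sqrt(2.0))
print(f"separation: {z:.1f} sigma (two-sided p-value ~ {p_two_sided:.3f})")
```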



    So, to answer the question: one idea is of course that some miscalibration systematics affect one or both of the mass measurements in ATLAS. However, I am sure this possibility was beaten to death by the experimenters before the present results were made public.

    Another idea is that the gamma-gamma signal contains some unexpected background which somehow shifts the best-fit mass to higher values, also contributing to the anomalously high signal rate. However, this does not hold much water either - if you look at the various mass histograms produced by ATLAS (there is a bunch here) you do not see anything strikingly suspicious in the background distributions.

    Then there is the possibility of a statistical fluctuation. I think this is the most likely explanation, and I am willing to bet $100 with as many as five takers that the two measurements will be reconciled with each other once more statistics is added, and that no observation of a double state will be made. This, however, might take three years to sort out, given the impending shutdown of the LHC.

    Finally, you might instead want to believe that we are indeed looking at the first hint of new physics - Supersymmetry or some other model producing multiple Higgs-like particles. Very exciting, but I just do not buy it.

    Time will tell! So if you have some extra cash to throw away consider taking the bet...







    Comments

    Dear Tommaso,

    Thanks for this post and the graph detailing the background distributions, something I found somewhat missing in the original analysis in yesterday's posting by Jester. However, his main topic of interest for now seems to be more the signal strength rather than the "twin peaks" as such.
    This, however, got me thinking again that it is very strange for the ATLAS collaboration to consistently produce results that are either puzzling or surprisingly lucky, while the CMS collaboration always seems to produce reliable data, having already done its homework. Is it a matter of competition - do they try to be the first to publish, even if the result is as inconclusive as this one? After all, we saw what happened to the OPERA experiment once the "flaw" in the experiment was found.
    However, as you are clearly not allowed to tell us what CMS is seeing in the 2p channel, and you have written nothing about it here, I would simply assume you have some interesting effect to analyse, otherwise why the wait?
    But certainly, I would not take you on that bet! It's just common sense...

    Ahem, you are either misinformed or cherry-picking. Have a look at the CMS Higgs to tau tau search, which between summer (ICHEP 2012) and fall (HCP 2012) went from finding less than nothing to finding a gentle excess compatible both with a SM Higgs and with nothing, and this after retracting a sophisticated multivariate analysis and going back to a more traditional cut-based one. If you do not find this puzzling....

    dorigo
    Hi Angel,

    I believe ATLAS sat on these twin peaks for quite a while before deciding to publish them. The policy appears to be one which has been the subject of discussions, but which is ultimately considered good science by the majority of the scientists involved in the analyses. You do your best to produce a physics result. If, once you produce it, you find it puzzling in some way, you analyze everything a bit further, to verify that there are no bugs or unforeseen effects. Eventually you publish it, with or without any additional bug fixes you had to apply.

    Note that withholding 2-sigma flukes a little is considered by some a biasing technique: you end up publishing more quickly (and without an overload of checks) the results that do agree with your expectation, while you go slower with the ones that are in the tails of the distributions. However, what is most important in our measurements is the mean squared error, which is the sum of the squared bias and the variance. If, by giving further scrutiny to discrepant results, you end up introducing a tiny bias in your ensemble of measurements, on the other hand you are investing your time most effectively on those results that are more likely to be in need of some fix. This reduces the overall variance, and you are doing the right thing in the long run - minimizing, as you are, the mean squared error.
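    In formula form, for an estimator \(\hat\theta\) of a true value \(\theta\), the decomposition referred to here is

    $$ \mathrm{MSE}(\hat\theta) \;=\; \mathbb{E}\!\left[(\hat\theta - \theta)^2\right] \;=\; \mathrm{Bias}(\hat\theta)^2 + \mathrm{Var}(\hat\theta), $$

    so a procedure that introduces a small bias while reducing the variance by more still lowers the total error.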

    I have no insider knowledge of how much ATLAS discussed their policy in such cases; however, what I have described is probably a good approximation of their thinking. I have myself been pondering the issue, and I believe that this is indeed a sound way to decide on one's publications.

    Cheers,
    T.
    Hank
    I don't often comment, because I always feel like I get a little smarter but not that I have anything meaningful to contribute. But I can't contain it; this is probably the most fun article I will read this month.
    Regarding the "greedy bump bias":

    First, this bias is a property of the point estimator of the Higgs signal strength, which should be kept logically separate from the 1-sigma or 2-sigma intervals on the signal strength. The maximum likelihood estimator is "asymptotically unbiased", which means that with enough data there is no problem. What is the measure of "enough data"? Well, it's basically the significance of your observation - your observed s/sqrt(b) scaling of the bias says the same thing. So if you have an insignificant peak the estimate is strongly biased, but if you have a 5-sigma peak the effect is pretty small.

    We do not correct for this bias because we give a confidence interval on the signal strength. That interval covers. So when we give the 68% or 95% interval on the signal strength, it doesn't need any sort of correction.

    Deviating from maximum likelihood estimators would be a can of worms. For instance, creating a good confidence interval for some sort of bias-corrected estimator would probably be ugly business.
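    A quick numerical check of that scaling, along the lines of the toy sketch earlier in the post (again with made-up, illustrative numbers): repeat the floating-mass pseudoexperiments at a few signal amplitudes and watch the average fitted strength approach the true one as the peak becomes more significant.

```python
# Toy check that the upward bias of the floating-mass maximum-likelihood fit
# shrinks as the peak becomes more significant.  Illustrative numbers only.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
edges = np.linspace(100.0, 150.0, 51)
centers = 0.5 * (edges[:-1] + edges[1:])
b_per_bin, width, m_true = 400.0, 2.0, 125.0

def model(x, s, m):
    """Flat background plus a Gaussian signal of amplitude s at mass m."""
    return b_per_bin + s * np.exp(-0.5 * ((x - m) / width) ** 2)

for s_true in (10.0, 30.0, 100.0):              # weak, marginal, strong peak
    fitted = []
    for _ in range(300):
        data = rng.poisson(model(centers, s_true, m_true))
        (s_hat, _), _ = curve_fit(model, centers, data, p0=[s_true, 122.0],
                                  bounds=([-1e4, 115.0], [1e4, 135.0]))
        fitted.append(s_hat / s_true)
    signif = s_true / np.sqrt(b_per_bin)        # rough peak-bin s/sqrt(b)
    print(f"peak s/sqrt(b) ~ {signif:.2f}: mean fitted/true = {np.mean(fitted):.2f}")
```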

    There's no need to apologize. Priapus notwithstanding, priapism is a gender-neutral term for delayed return to flaccidity. The penis and clitoris are analogous organs (the visible part of the clitoris being the analogue of the glans penis), both composed largely of erectile tissue, and both subject to this painful condition. I believe the threshold for medical emergency is four hours in either case.

    One certainly *HOPES* the ATLAS group, and other groups at CERN, have checked for loose fiber optic cables and such. After all, we don't want any more superluminal Higgs bosons showing up.

    ATLAS has 21.3/fb of 8 TeV data (assuming a 92% recording ratio), so we'll know by springtime whether this is getting 'interesting' or not.

    I'm not going to bet, but just to clarify: Are you taking bets on whether it's an error vs statistical fluctuation, not just new physics vs SM only?

    Tommaso,

    Have you got your 5 takers? If not, please sign me up!

    3 GeV is a big spread, and I'm pulling for the surprise and mystery. Exciting stuff!

    :) RSVP!

    dorigo
    Dear Jeff,

    if you are interested in a serious bet, please send me your real name by email. If you are not in HEP, I need somebody known in HEP who can guarantee that you'll pay the bet if you lose it! Then we can discuss the details of the stipulation.

    Best,
    T.
    T.,

    I'm not in HEP, but am truly excited to put $100 down vs yours that 2 Higgs have been found, and that the data is NOT a statistical fluctuation! :)

    Could I impose and ask you to drop me an email at: jeff@thecoleas.com to work out logistics?

    RSVP, Jeff

    Tommaso,

    As far as I know there is nothing explaining the presence of the Higgs field. There was no need for it since there has been no proof of its existence until recently. What if it is the result of yet another broken symmetry? Does it not imply that there would be multiple Higgs particles?

    I am a lay-person. So if you would care to answer my interrogations I would be very happy, no matter what.

    Andre

    What you say is not true. There are many theoretical arguments in favour of the Higgs field. This does not mean that it MUST exist, but there are many arguments suggesting it. First, it is a simple way to give mass to the gauge bosons (or "force carriers" if you like), which are otherwise predicted to be massless, like the photon in QED. A second example is that without the Higgs, elementary processes like WW scattering would diverge with increasing energy; the Higgs regulates this process and makes it finite. Again, these are theoretical arguments and not proofs of existence, which only an experiment can give. WW scattering in particular is slightly away from the standard model prediction, and we are still not sure whether something else is hiding in the LHC data... We have to wait and see, but the possibility that the standard model is correct, or that new physics hides at very high energies beyond the LHC's reach, is still there....

    Hello Anonymous2,

    I am not arguing against the Higgs field. I am simply puzzled by the possibility of two Higgs! So let me rephrase: Is it possible that the Higgs field is the product of a broken symmetry? And if so, would it not imply multiple Higgs particles?

    Andre

    Energy has 2 forms expressing time and velocity, which are additive and multiplicative or scalar and proper respectively; which is basically ratios of the same motion energy manipulated in various ways:

    Scalar with time + velocity = D resulting in quadratic and parabolic forms. Here the correct value of curvature of pi finds it form in relation to phi, phi being velocity; where Phi = 16 / pi^2, so velocity is coming from Euler’s equations where E^(i*pi) = -1 and -1 is time x velocity, pi acting from outside the universe is generating 2 waves, or one form of consciousness, appearing as part of a group of constants is producing the universe. The parabolic forms of scalar products define different inertial fields or systems considered multiple. Proper or multiplicative are single, as in E=mc^2 or D^5=mc^2. In the universe, then everything can be described with phi or velocity and its complement or conjugate time.

    Proper dimensions are in products of v and t and also in powers, such as D^2. Scalar evolves in 2d, 3d, etc. And D is middle C, the resonance of the universe representing those 2 wave forms. Double middle C and you get the 528 frequency that the vedic philosophers call the sound that heals. In Vedic philosophy, 2 wave forms also produce all the chakras, which are a relation of the scalar wave forms from d to 2D, 3D, etc.

    In physics, the 2 pi begins to refer to geometric dimensions, not to be mistaken with dimensions of powers, so 4 pi is 2 dimensions, 8 pi is three, and 16 pi is 4 dimensional, not to be mistaken with a property product of D^4 or force.

    Presently I am working on a gravity equation that relates proper Newtonian with the quadratic elastic form of scalar systems; gravity has both, as a single system, and also as an elastic system of the aether and its superposition. So we can speak of the total energy of a system as proper, acting instantaneously, and also as an elastic vibration at a quantum level, being equivalent.

    Mass is induced by velocity and time. The association with a particle does not reveal the mechanism by which properties come to be. The induction of mass is from multiple causes. When you want to understand the real nature of something, you need to ascribe the "functions" that bring it into being. Particle physics is merely looking at the effects of these functions without determining the underlying functions. The particle is merely an associated effect of multiple functions, a node within the process of wave function interactions. Mass is a function product of smaller functions, hence the particle does not really represent this as it reveals the transition between various functions. Looking for higgs is like hoping for a function or set of functions that produce a particle, but particle and function are not the same. Since the functions that produce products are time and velocity, then the higgs or particle associated with producing anything would have to be at the smallest scale. At a scale of D^2, we can see the field matrix of 2 dimensions associated with perhaps a photon. Since T^2 = density, then even in 2 dimensions there is a relationship to mass and motion.

    When you consider density = t^2, anything with time is related to mass and so with regards to the higgs, anything in time is related to mass, and mass is therefore induced by velocity and time functions. particles are residuals or interferences of wave functions, so the idea of discovering a function that causes mass is velocity and time, function resulting in function, and the particle related to such would be functions without mass or less than mass, since mass is merely a product of functions time and velocity, and those being at the smallest level, hence the smallest particle contributes by association to smallest functions product, mass. So even if you associate a particle with the function mass, what causes or induces mass are functions smaller where mass is merely a product of such. And at such a scale our instruments could not define. We are looking for mass, but need to look for the decay of mass into velocity and time, which a representing node would be a photon or less.

    It’s not correct to say that a wave function mass is a particle; however a particle may exhibit such properties as a product of wave interactions by association.
    If we consider force to be D^4 = ma, then a force carrier by association would be related to a particle at D^4. If we consider the higgs boson to be a force carrier, then its decay into two photons would be square root of D^4 or D^2. Hence the particle higgs by association to the wave field is 4 dimensional while the photon is 2 dimensional. So then the ether is a wave field of 2 dimensional photons at when excited by a wave front act as a three dimension excitation.

    The question of decay would determine whether there is more than a single mechanism for mass. Mass, in TDUFT would have multiple origins and the functions that create mass are V and T. When those functions V and T become products, at greater complexity states of mass are presented or induced, so rather than saying a certain configuration achieves mass, there may be multiple, and so the higgs may be an "a" rather than "the".

    Thomas N Lockyer has written an article of the vector components of a photon. Translated into TDUFT, they are D^2 (charge) , V or ampere per meter (which is DV/D), and E/H or DT (component of time). Notice in the creation of every field is a component dominant D, a V, and a T, and these representing a photon have a 2 dimensional vector or wave construct at D^2 associated with the particle.

    [www.vectorparticlephysics.com]

    so then any situation that resolves in mass= D^3T^2 or greater such as D^4 with 2 photons can be attributed to resulting in mass.

    The Universe is a product of wave functions velocity and time. Together there are three variables D, T and V are wave functions whose products result in all properties. Particles are merely the interference nodes of these wave matrices.

    It is interesting that the higgs boson has a channel that leads to 2 photons. If you take tduft and analyse Lockyer's work, those photons have a vector which is D^2, which when 2 products become force or D^4, are equivalent to a model for force carrier. So the standard model does not focus on function. Mass is a function, and so is time and velocity. So then to say the standard model will resolve issues of function when it relates particles is not an exact approach. And also note that something equivalent to force or D^4 is greater than mass which is D^3T^2, and so the higgs is not exactly mass, it is mass with acceleration, and so the experiment is flawed. And the decay channels never lead us to D^3T^2. What is more interesting is that even at T^2, there is a relationship to mass, since T^2 = density; so anything in time has mass. Do not be misled, we know now that a neutrino has mass, and so do all particles if they exist in time.

    And in an event, there are 2 systems at work. The proper multiplicative, which would refer to Newtonian, such as force = Gmm/D^2; as a single system, and the equivalent elastic, additive, scalar system, which is f = mv^2 + mv^2/D.

    All physical equations can be derived from three units and two forms which are T + v = D and T x V = -D, which are represented in phihat x phi = -1 and phihat + phi = 1. phi is representing velocity as all numbers are. 2 is represented by v + v/v; relating proper and scalar functions in whole numbers. If you take Einstein’s equations and convert to three units the cosmological constant is 2 and fine structure is 2 pi. Interesting! Everything is a ratio of motion. And particles are the interference pattern in the fields, which quantum physicists refer to as excitation. There is aether. On the short end is Schwarzschild radius; constants merely set the scale, and further out is D^5=mc^2. Note the -1 in Euler’s equation represents time x velocity, the 2 waves that generate everything. actually it is one energy transformed.

    fundamentally
    I am not astonished by the fact that they find new particles. I am astonished by the fact that they so easily call them Higgs particles.

    There is reason to believe that the diversity of elementary particles has something to do with the discrete symmetry set of the quaternions. Quaternion number systems are 1+3D and therefore they exist in 16 discrete symmetry forms. If you combine them into pairs, then there are 256 possibilities. If you ignore the real parts, 64 possibilities are still left. Generations have nothing to do with discrete symmetry sets. So, even when anti-particles are included and if you accept color charge as a dimension-related characteristic, the standard model has filled only 36 of the 64 possibilities.

    (For a more detailed vision on discrete symmetry sets, see http://www.scitech.nl/English/Science/OnTheHierarchyOfObjects.pdf)

    If you think, think twice
    It could be a sign of a fourth particle generation, which would manifest itself with an excess of events in the diphoton channel too.

    Tommaso, you said:
    "… ATLAS has a very nice signal of Higgs to digamma in a total of 17.8/fb …
    showing a very clear bump at 126.5 GeV
    …[ and]…
    best-fit mass … 123.5 GeV …[for]… the H to Z to four lepton decay mode

    CMS sees signals in the 125.5 GeV ballpark …".

    Andrea Albert at The Fermi Symposium 11/2/2012 said:
    "… gamma rays detectable by Fermi Large Area Telescope [Fermi LAT} …
    line-like feature localized in the galactic center at 130 GeV …
    Reprocessing shifts feature from 130 GeV to 135 GeV

    Earth limb is a bright gamma-ray source … From cosmic-ray interactions in the atmosphere …
    Line-like feature … at 135 GeV … Appears when LAT is pointing at the Limb …".

    In all these observations, there is only ONE PEAK between 120 GeV and 140 GeV:

    ATLAS Z4l - 123.5
    CMS - 125.56 GeV
    ATLAS digamma - 126.5
    FermiLAT galactic center - 130 GeV
    FermiLAT reprocessed galactic center - 135 GeV
    FermiLAT EarthLimb - 135 GeV

    Since all 6 peaks are SINGLE PEAKS in the 120 to 140 GeV range
    and
    since
    reprocessing shifts in the FermiLAT peaks are at least on the order of 5 GeV
    and
    since
    within ATLAS there is at least a spread between peaks on the order of at least 2 GeV
    and
    since there has not yet been any extensive calibration work to reconcile
    experimental setup and analysis process differences
    between LHC and FermiLAT

    I think that it may be likely that all of these observations
    are of the same SM Higgs state between 120 and 140 GeV.

    Some more details are at
    vixra.org/pdf/1212.0083v2.pdf

    Tommaso,
    would you be open to a bet that all 6 peaks, both at LHC and FermiLAT,
    turn out to be the same state of the SM Higgs between 120 and 140 GeV,
    or
    do you agree with me that they are probably all the same SM Higgs state
    and that further data and analysis will show that to be the case ?

    Tony

    dorigo
    Dear all,

    sorry for leaving comments unattended here for long. The fact is I am playing a chess tournament and have no internet connection at home because of a move, so I am a bit disconnected for a while. Will be back at full speed next week.

    Cheers,
    Tommaso