At 125 GeV, the Higgs boson is a very heavy particle; yet its natural width is predicted by the standard model to be just 4.15 MeV, a value much smaller than that of particles of similar mass. The top quark, for instance, has a width of 1.5 GeV, and the Z boson one of 2.5 GeV: three orders of magnitude larger.

The natural width (the width of the resonance shape peaking at the rest mass of the particle) is a fundamental attribute of elementary particles, and arguably an even more important one than the mass itself. In fact the width determines the lifetime of the particle: particles that live longer have a smaller width, because they have more time to "settle" to their nominal rest mass. In the case of the Higgs, if we found out that its width is substantially larger than what the standard model predicts, we would immediately know that there are decay modes we know nothing about yet: it would be a clear indication of new physics. Extra ways for the particle to decay in fact ensure that the lifetime is shorter, and the width larger.
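The width-lifetime connection mentioned above is just the energy-time uncertainty relation, tau = hbar / Gamma. A quick back-of-the-envelope check (a Python sketch, using hbar = 6.582e-22 MeV s):

```python
# Mean lifetime of a resonance from its natural width: tau = hbar / Gamma.
HBAR_MEV_S = 6.582e-22  # reduced Planck constant, in MeV * s

def lifetime_s(width_mev):
    """Mean lifetime in seconds for a resonance of the given natural width (MeV)."""
    return HBAR_MEV_S / width_mev

print(lifetime_s(4.15))    # SM Higgs (Gamma = 4.15 MeV): ~1.6e-22 s
print(lifetime_s(1500.0))  # top quark (Gamma ~ 1.5 GeV): ~4.4e-25 s
```

The three-orders-of-magnitude hierarchy in widths translates directly into a three-orders-of-magnitude hierarchy in lifetimes.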

4.15 MeV is a really small number compared to the experimental resolution on the particle mass which we can achieve with the CMS or ATLAS detectors. Hence there is no chance of measuring that parameter directly: any Higgs mass distribution experimentally determined at the LHC will have an observed width far larger than the natural one. However, we can measure the Higgs natural width indirectly by looking at very off-shell Higgs bosons.

It has in fact been noted by several theorists that the peculiar production processes yielding Higgs bosons at the LHC significantly modify the observable Higgs boson lineshape. In other words, what we can detect at the LHC in a histogram of the Higgs mass is the convolution of the Lorentzian shape (a peak with a width equal to the natural width of the Higgs, centered at the Higgs mass) with the production cross section, which receives enhancements at high mass due to the large coupling of the Higgs boson to the heavy top quark. The result is that instead of quickly dying out, the expected signal mass distribution has a significant tail at very high masses.

It looks strange to think that a 125 GeV Higgs boson may yield a significant signal at masses of 300 GeV and more, but that is exactly what happens. What is most interesting, however, is that the strength of the signal there depends strongly on the natural Higgs width. Check out for instance the picture below, which shows the mass of ZZ pairs reconstructed by CMS. A Higgs boson with a width of 25 times the SM prediction (i.e. just a bit above 100 MeV) would produce a very significant enhancement!

Above, the four-lepton mass distribution of CMS data collected in 2012 8-TeV pp collisions is compared to the ZZ background (blue) and the SM prediction (beige), as well as to a Higgs with a natural width of 25 times the SM prediction (dashed histogram).
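The strong dependence of the far-off-shell tail on the width can already be seen in a toy model: for a unit-normalized (non-relativistic) Breit-Wigner, the density on the peak goes like 1/Gamma, while far from the peak it grows linearly with Gamma. A minimal sketch, with toy numbers only (the real CMS analysis of course folds in the production cross section and detector effects):

```python
import math

def breit_wigner(m, mass, width):
    """Unit-normalized non-relativistic Breit-Wigner (Lorentzian) density."""
    half = width / 2.0
    return (half / math.pi) / ((m - mass) ** 2 + half ** 2)

M_H = 125.0          # GeV, Higgs mass
GAMMA_SM = 0.00415   # GeV, SM natural width (4.15 MeV)

# Far off the peak the density is ~ Gamma / (2*pi*(m - M)^2), i.e. linear in
# Gamma: at fixed total yield, a 25-times-larger width gives a ~25x larger tail.
tail_sm = breit_wigner(300.0, M_H, GAMMA_SM)
tail_wide = breit_wigner(300.0, M_H, 25 * GAMMA_SM)
print(tail_wide / tail_sm)  # ~25
```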

By studying both the 4-lepton final state of ZZ events and the final state with two charged leptons plus two neutrinos, CMS has managed to determine an upper limit on the Higgs width of 4.2 times the standard model value, at 95% confidence level. That is already cutting into several models which predict much larger widths by hypothesizing the existence of unknown decays. Note that results are extracted in two cases: under the assumption that the global production rate is what the SM predicts, and under no assumption on the rate. A larger global production would enhance the tails, but production rates much larger than the SM prediction are excluded by looking at the 125 GeV peak. The 4.2-times limit is extracted under no rate assumption.

A Precise Bound On The Higgs Boson Width

## Comments

If the possible width is 4.2 or 25 times the standard model width, then there are a lot of non-standard-model particle decays remaining to be seen. Is that right? Are people looking into such decay modes now?

kashyap vasavada (not verified) | 03/22/14 | 11:35 AM

- Link

Hello Kashyap,

the CMS result is an upper limit, so it says that the width is smaller than 4.2 times the SM prediction. It could be 3 times, or twice, or just exactly as the SM predicts. In the future we will be able to constrain that number further down.

In the meantime we are indeed looking at possible non-standard decay modes, but none has been found so far.

cheers,

T.


Tommaso Dorigo | 03/22/14 | 16:10 PM

Hi T,

The error bars appear to be overestimated in the graph. Shouldn't one out of every three points be more than one sigma away from the expected standard model curve? All 8 points are within a sigma.


Anonym (not verified) | 03/22/14 | 22:29 PM

Hello,

those error bars are central intervals at 68% CL obtained according to the Garwood prescription, which is the one you get by inverting the Poisson probability. They are correct. The only objectionable thing is the omission of error bars on the zero-entry bins, which should extend up to 1.8 events.

What you say is correct - on average 4 of those 13 data points should fail to cover the SM histogram. But that is only true on average; the fact that they all cover is not such a striking observation.

Cheers,

T.


Tommaso Dorigo | 03/23/14 | 05:26 AM
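For what it is worth, the 1.8-event figure for empty bins quoted above can be reproduced from the Garwood prescription. For n = 0 observed events the chi-square quantile involved has two degrees of freedom and a closed form, so plain Python suffices (a sketch; the general-n case would need e.g. scipy.stats.chi2.ppf):

```python
import math

def garwood_upper_zero(cl=0.6827):
    """Upper edge of the Garwood central interval for a Poisson with 0 observed.
    Solves P(N <= 0 | mu) = alpha/2, i.e. exp(-mu) = alpha/2; this equals
    0.5 * chi2.ppf(1 - alpha/2, 2), since chi2.ppf(p, 2) = -2*ln(1 - p)."""
    alpha = 1.0 - cl
    return -math.log(alpha / 2.0)

print(garwood_upper_zero())  # ~1.84 events, the "1.8" quoted above
```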

Thanks T. for your response. Not marking the zero-entry bins seems like a lapse. Shouldn't they routinely be marked? The probability of all 13 observations being within one sigma is approximately 0.68^13 ~ 0.6%. I guess the LHC produces hundreds of such graphs, so this occurs once in a while!

Anonym (not verified) | 03/24/14 | 02:47 AM
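The commenter's arithmetic is easy to check, treating the 13 bins as independent and the error bars as exact 68.27% intervals (the Garwood intervals actually over-cover a bit, since the Poisson distribution is discrete):

```python
coverage = 0.6827   # nominal per-bin coverage of a 1-sigma central interval
n_bins = 13

# Chance that every one of the 13 independent bins covers the prediction:
p_all_cover = coverage ** n_bins
print(p_all_cover)  # ~0.007, i.e. roughly the ~0.6% quoted above
```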

In fact, there are too many such plots...

Yes, the lack of bars on zero-entry bins is something I constantly remind people about, but sometimes they just don't listen; some of my collaborators have aesthetic criteria that appear to get in the way.

By the way, one bin is indeed out: there is a prediction of 1.85 events with 0 observed, which is just above the 68.3% coverage bar (if it were plotted). So the probability of such an (a-posteriori) observation is 0.0495.

Cheers,

T.


Tommaso Dorigo | 03/24/14 | 04:06 AM

Some stupid, ignorant people will have difficulty in understanding how a "fundamental" particle, like a spin-0, 125 GeV Higgs boson, can "decay" at all. Then they will quibble about the meaning of the resonance width. They will say:

"The neutron decays because it is a composite particle, composed of three on-shell quarks, so that when beta decay occurs, one downquark changes into an upquark. In other words, there is a Feynman diagram for a mechanism of decay in radioactivity.

"When a muon or tauon decays, it is because it is a heavy "state" of an electro, which couples more strongly to the Higgs field, than the electron does. The standard model says that all fermions are inherently massless and only acquire their mass from spontaneous symmetry breaking, the Higgs field. Therefore, the muon and tauon both have a larger coupling to the Higgs field, than the electron does. This quantized coupling changes when the muon or tauon "decay", so again you have a simple mechanism for understanding the basis for decay.

"But the Higgs boson is an exception, like no other! What is the mechanism for the decay of a spin-0, 125 GeV Higgs boson? Is is not a composite particle like a neutron, instead it is a fundamental particle. And, unlike the muon or the tauon (or the quark flavors heavier than the upquark) Higgs bosons can't decay by changing their coupling to the Higgs field (i.e. mass). There is only one mass of Higgs boson, it is 125 GeV. The weak force-controlled decay of a Higgs boson into bosons with a total spin of 0, seems therefore to indicate that the Higgs boson is a composite, not a fundamental boson. It must be a Bose-Einstein condensate of spin-0, formed from a pair of particles.

"Surely the 4.15 MeV resonate width on the 125 GeV mass proves the Higgs is a composite particle. If it were not composite, its decay lifetime would be simply dependent on its mass by the simple Heisenberg uncertainty principle in its energy-time version: lifetime = {h-bar} / {energy, i.e. 125 GeV}. This very simple Heisenberg law governs the decay of muons and tauons, and heavy quark flavors. ("Half-life" = lifetime divided by the natural log of 2.)

"It is only when you have composite particles decaying, that you get resonate width influencing the decay rate."

"The neutron decays because it is a composite particle, composed of three on-shell quarks, so that when beta decay occurs, one downquark changes into an upquark. In other words, there is a Feynman diagram for a mechanism of decay in radioactivity.

"When a muon or tauon decays, it is because it is a heavy "state" of an electro, which couples more strongly to the Higgs field, than the electron does. The standard model says that all fermions are inherently massless and only acquire their mass from spontaneous symmetry breaking, the Higgs field. Therefore, the muon and tauon both have a larger coupling to the Higgs field, than the electron does. This quantized coupling changes when the muon or tauon "decay", so again you have a simple mechanism for understanding the basis for decay.

"But the Higgs boson is an exception, like no other! What is the mechanism for the decay of a spin-0, 125 GeV Higgs boson? Is is not a composite particle like a neutron, instead it is a fundamental particle. And, unlike the muon or the tauon (or the quark flavors heavier than the upquark) Higgs bosons can't decay by changing their coupling to the Higgs field (i.e. mass). There is only one mass of Higgs boson, it is 125 GeV. The weak force-controlled decay of a Higgs boson into bosons with a total spin of 0, seems therefore to indicate that the Higgs boson is a composite, not a fundamental boson. It must be a Bose-Einstein condensate of spin-0, formed from a pair of particles.

"Surely the 4.15 MeV resonate width on the 125 GeV mass proves the Higgs is a composite particle. If it were not composite, its decay lifetime would be simply dependent on its mass by the simple Heisenberg uncertainty principle in its energy-time version: lifetime = {h-bar} / {energy, i.e. 125 GeV}. This very simple Heisenberg law governs the decay of muons and tauons, and heavy quark flavors. ("Half-life" = lifetime divided by the natural log of 2.)

"It is only when you have composite particles decaying, that you get resonate width influencing the decay rate."

Nigel B. Cook (not verified) | 03/25/14 | 05:01 AM