The only thing I can discuss with you here now is the predictions on the Higgs boson significance produced by CMS in October 2010 - a couple of geological eras ago, that is. Those predictions can be trusted because the 2011 data turned out to be perfectly in line with them, both for the 95% CL limits and for the significance. Of course the former are valid in the full mass range and provide more verification power than the single significance number, which is only meaningful if the Higgs boson exists and has a particular mass.

So the 2010 predictions were produced with 10 inverse femtobarns in mind and 7 TeV running. Now let me remind you that 10/fb is exactly what CMS has available to analyze for the Summer 2012 conferences, and that in 2012 we have actually run at 8 TeV, where the Higgs production rate is higher and signal-to-noise ratios are slightly better. So a sensitivity plot made with 10/fb at 7 TeV is a slightly pessimistic view. Still, let us see what the prediction is for a 125 GeV Higgs boson, below.

You should be looking at the black curve, at the point where the horizontal axis corresponds to 125 GeV: that is between the second and third vertical line after the 100 GeV one (this is a logarithmic x-axis plot, so it is not completely straightforward to read). If you look carefully, you will read off the following predictions:

- a >3-sigma excess in the gamma-gamma mode

- a 2.5 sigma excess in the ZZ -> 4 leptons mode

- smaller excesses in the WW decay mode and other channels

- a combined significance just short of 5 sigma.
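As a rough sanity check, independent channel significances add approximately in quadrature. This is not how CMS actually combines channels (that uses a full likelihood, accounting for correlations), but it lets a reader see how the per-channel numbers above could stack up. A minimal sketch, where the numbers are illustrative and the WW/other entry is a guessed placeholder:

```python
import math

# Illustrative per-channel median significances (in sigma); the last entry
# is a hypothetical stand-in for "smaller excesses in WW and other channels"
channels = {
    "gamma-gamma": 3.0,
    "ZZ -> 4 leptons": 2.5,
    "WW + others": 1.5,  # assumed placeholder, not a quoted value
}

# Naive quadrature sum of independent significances -- a crude proxy
# for the proper likelihood-based combination
combined = math.sqrt(sum(s**2 for s in channels.values()))
print(f"naive combined significance ~ {combined:.1f} sigma")  # under 5 sigma
```

With these toy inputs the naive combination lands in the 4-to-5 sigma range, consistent with a combined significance "just short of 5 sigma"; the real combination depends on the detailed likelihoods of each channel.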

Now, the above are median values of wide distributions. To explain what that means, imagine the histogram of heights of sixth graders in a school. The distribution will be wide, with a peak at about five feet: that is the mode of the distribution. The median, instead, is the value above which lie half of the kids (and below which lies the other half). In the plot above, each curve was computed by looking, for each mass hypothesis m, at a distribution of possible experimental outcomes (given the presence of a Higgs boson at that mass value); the median is computed, and plotted, for each mass and for each curve. So while the black curve could be telling you "CMS will have a 4.7 sigma significance with 10/fb at 7 TeV", what you should read is "the median significance will be 4.7 sigma". You will not be able to tell, from the figure, how wide the distribution is - no more than you could tell how many kids are 5'6" or taller in the sixth-graders example. The distribution of possible significances is wide, and if the experiment is unlucky the significance may be much lower than the median; if it is lucky, it may well exceed five sigma.
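To make the "median of a distribution of outcomes" idea concrete, here is a toy Monte Carlo of a simple counting experiment. The background and signal counts are invented for illustration and have nothing to do with the real CMS analyses; the point is only that pseudo-experiments yield a whole distribution of significances, of which the plot reports the median:

```python
import math
import random

random.seed(1)

b, s = 100.0, 50.0  # hypothetical expected background and signal counts

def significance(n_obs, b):
    # Simple-minded significance of an excess over background: (n - b) / sqrt(b)
    return (n_obs - b) / math.sqrt(b)

def poisson(mu):
    # Knuth's algorithm for Poisson-distributed counts (fine for moderate mu)
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Each pseudo-experiment fluctuates the total s+b expectation and
# records the significance it would observe
sigmas = sorted(significance(poisson(s + b), b) for _ in range(10000))
median = sigmas[len(sigmas) // 2]
width_68 = sigmas[int(0.84 * len(sigmas))] - sigmas[int(0.16 * len(sigmas))]
print(f"median significance ~ {median:.1f} sigma, central 68% width ~ {width_68:.1f} sigma")
```

The median here sits around s/sqrt(b) = 5 sigma, but individual pseudo-experiments scatter well above and below it - exactly the point about lucky and unlucky outcomes.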

I hope this helps you put in the right context the information circulating on the web these days, together with the predictions above.

## Comments


Ravi, it would translate into a taller sixth grader.

Could that sixth grader be taller because he is a mutant alien from outer space? Sure, but it is not enough to rule out pure random chance. The difference between a 3.5 sigma prediction and a 4 sigma result for gamma-gamma alone is certainly too small, don't you think?

Thanks Peter. Yeah, I guess we should just subtract the two, right? I was also asking for general information purposes - for example, if 4 sigma is observed and 3 sigma expected, then the difference is 1 sigma for one detector with both 2011 and 2012 data. It could be a 1.4 sigma excess for new physics if both detectors show similar excesses at the same mass in the same channel, and these are combined in quadrature. Is this the way to look at it?
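For what it's worth, the back-of-the-envelope arithmetic in the comment above can be written out explicitly. This treats each experiment's deviation (observed minus expected significance) as an independent Gaussian number, which is only a crude approximation - subtracting significances is itself a heuristic, not a proper statistical comparison:

```python
import math

# Hypothetical deviations of observed from expected significance,
# using the numbers quoted in the comment (not official results)
excess_detector_1 = 4.0 - 3.0  # one experiment: 1 sigma above expectation
excess_detector_2 = 4.0 - 3.0  # assume the other experiment sees the same

# Naive quadrature combination of two independent 1-sigma deviations
combined_excess = math.sqrt(excess_detector_1**2 + excess_detector_2**2)
print(f"combined excess ~ {combined_excess:.1f} sigma")  # ~1.4 sigma
```

That reproduces the quoted 1.4 sigma, but a real statement about a deviation from the Standard Model would require the full likelihoods of both experiments.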

Ha! There's the tri-modal probability distribution in view at last. Now we face an interminable argument about particle metaphysics and dimensionality. The hypothesis operationalizes as a tree, and the branches distribute tri-modally. Arguably that is evidence for a variable dynamic at loop level, a vortex of sorts, which rather undermines the original hypothesis...

Hi Tommaso,

The ZZ --> 4l prediction has a centroid at a different mass than gamma-gamma?

This is the only line that is different?!

What could be the reason for that?

Best regards,

Nick

The plot above shows projected sensitivities as a function of Higgs mass. Each point on the horizontal axis should be understood as a different, alternative hypothesis - namely, that the Higgs has a mass corresponding to that point. The "shape" of those sensitivity curves depends on several things, but it is meaningless to compare the shapes of different channels. The only thing you should compare is the significance level (the vertical height of the curves) at a given mass point, i.e. along a vertical line.

If that is understood, the reason for the different sensitivities at different masses is the varying probability that the Higgs decays to each different final state. The H->γγ decay, for instance, dies out for M>130 GeV; the H->ZZ branching fraction has a funny shape with two local maxima, at 140 and 200 GeV. Acceptances and backgrounds also influence these curves non-trivially.

Cheers,

T.

The combined projected significance peaks at around 160 GeV, at a value of 9.5 sigma. What could be causing this large event count, other than the Higgs? What again is the reason why this value for the Higgs mass is being ruled out? I'm sorry if these questions have been asked before.

The combined projected significance is not an actual observation. It is what CMS would see if the Higgs boson had a particular mass value. For MH=160 GeV, indeed CMS would see a very large event count, which nothing else could produce. But nothing like that was observed, because in fact the Higgs does not have that mass.

The plot, in other words, is only saying what would be seen IF there were a Higgs boson; and it is a combination of many different mutually exclusive hypotheses - if the Higgs has a mass of 150 GeV, of course it does not have a mass of 160; and so on.

Cheers,

T.

Tommaso,

just for curiosity: could the 125 GeV signal be due to a Regge trajectory state of a neutral meson? (at least on some channels?)

I began by writing the following email to America's Discover magazine (it was about an article of theirs concerning Julian Barbour). Unintentionally, my email started talking about a subject which fascinates me - the Higgs boson/field (I've been thinking about this for years, and I spent hours deciding on the best words to use in a short email).

I used Albert Einstein's theories to come to the conclusion that what we call the Higgs is our name for ALL particles (not simply this one or that one) being composed of quantum mechanical "wave packets" formed by the union of gravitons and photons - the notion of the Higgs actually being all particles implies that its possible discovery by the Large Hadron Collider would be another experimental verification of the existence of quantum entanglement in time and space and on Earth. In turn, gravitons and photons - along with all time and space - are composed of electronic binary digits (this may be termed the Higgs field).*

I suspect this idea of binary digits composing space-time is highly unfashionable in the present worldview of quantum fluctuation. Also, people believe in strictly linear time where effects do not influence causes, but the "binary digits" idea requires a looping subroutine where electronics from the future is transmitted nearly 15 billion years into the past in order to create the subuniverse we currently inhabit (on a separate note, I believe we live in an infinite universe made up of subuniverses shaped like figure-8 Klein bottles that are made flexible enough to seamlessly - except for wormholes - fit into each other by their construction from binary digits).

Dark matter could be explained as matter travelling from future to past, or past to future, which is invisible but still has gravitational effects. Dark energy could be explained as gravity or space-time (i.e. the product of binary digits) being programmed to accelerate and expand (I prefer to regard acceleration/expansion as the result of more space-time continually being created, which is what the Big Bang's rival - Steady State theory - proposes). Anyway, the unfashionableness of my ideas does not automatically make them wrong.

* The University of Edinburgh scientist Peter Higgs pointed out that the Higgs field would produce its own quantum particle (the Higgs boson) if hit hard enough, by the right amount of energy. The Higgs field is the name given to the unification of space-time by the binary digits creating it. Therefore, the Higgs boson would necessarily indicate this unification and “…its possible discovery by the Large Hadron Collider would be another experimental verification of the existence of quantum entanglement in time and space and on Earth.” Why does data from the LHC “… see tantalising hints consistent with making Higgs bosons with a mass of around 125 times as heavy as the proton?” (http://www.ph.ed.ac.uk/higgs/laypersons-guide) I don’t know why there are hints at this specific mass. I can only suggest that we use quantum physics’ wave-particle duality and think of all the subatomic particles in the universe – and throughout all time – as a beam of light from a torch. If the circle of light cast by the torch represents all subatomic particles, then the centre of that circle (which is its brightest part) represents the masses’ energy of 125 billion electron volts (125 times as heavy as a proton).

Here’s the email I sent to Discover –

I'd like to comment on the article "Is Einstein's Greatest Work All Wrong—Because He Didn't Go Far Enough?" by Zeeya Merali (March 2012 issue).

"Long before Einstein, (Austrian physicist and philosopher Ernst) Mach had advocated a ‘truly relative’ theory, in which objects were positioned only in relation to other tangible objects—Earth relative to sun, pub relative to farmhouse—and not against any abstract background grid." (“Is Einstein’s Greatest Work …”)

This makes sense as long as we assume that space-time is an unverifiable abstract grid and matter, such as objects, is the only component of reality.

"When forced to summarize the general theory of relativity in one sentence, Einstein said: time and space and gravitation have no separate existence from matter." - PHYSICS: ALBERT EINSTEIN’S THEORY OF RELATIVITY at http://www.spaceandmotion.com

Einstein's thinking claims that space-time is as much a part of reality as matter is, and his thinking can potentially be verified by the Large Hadron Collider. This is because the Higgs boson/field sought by the LHC could turn out to be a non-Standard-Model Higgs where subatomic particles are composed of quantum mechanical "wave packets" formed by the union of gravitation's gravitons. To give matter a different appearance from gravity, this union could include electromagnetism's photons. The amplitude of gravity waves might taper from a central point to the sides while the amplitude of electromagnetic waves remains constant - in which case electromagnetism would be modified gravitation and Einstein would have been correct when he said gravitation and electromagnetism may be related.

Since the great physicist claimed gravitation is the warping of space-time, time and space would have no separate existence from matter and would be the ultimate composition of the non-Standard-Model Higgs particle. Continuing from Einstein's deductions, space-time cannot simply be an abstract background but must be composed of something, or else it could not give rise to the matter we see, touch, and probe with instruments. But that something also gives rise to immaterial space, time, and gravity. What could be the source of things we see, and also of things we do not see? Why not the electronic binary digits of 1 and 0? After all, we can view a webpage but can never view its ultimate composition.

So Julian Barbour’s approach is only good for people who only believe in what they can see. Albert Einstein’s approach is the one to follow if we ever hope to achieve a Unified Field Theory or Theory of Everything which has meaning in physics, as opposed to purely in mathematics. A mathematically defined unified field could be accurate and detailed, but it would only be relevant to mathematicians and would therefore be somewhat abstract. A physical unified field would be relevant to everybody, enabling us to understand and manipulate both what we can and can’t see in the universe.


If we assume a Standard Model Higgs at 125 GeV and make the plots, we should expect no deviation from them if the Standard Model with a 125 GeV Higgs is correct. This way we could also find whether there is any deviation in the gamma-gamma and other modes that might point to new physics beyond the Standard Model. For example, in Peter's blog a 4 sigma excess is rumored in the gamma-gamma mode (assuming no Higgs at 125 GeV). Would this translate into some deviation from the Standard Model assuming a Higgs at 125 GeV? What would be the sigma of that? Can we infer it from the kind of plots you posted, or is that information lost because they refer to the medians?