After decades of theoretical studies and experimental measurements, particle physicists managed, some forty years ago, to construct a very successful theory, one which describes with great accuracy the dynamics of subnuclear particles. This theory is now universally known as the Standard Model of particle physics. Since then, physicists have invested enormous effort in the attempt to break it down.

It is not a contradiction: our understanding of the physical world progresses as we construct a progressively more refined mathematical representation of reality. Often this is done by adding more detail to an existing framework, but in some cases a complete overhaul is needed. And we appear to be in that situation with the Standard Model. 

Why the Standard Model cannot be the final theory

The reasons why an overhaul of our theory of subnuclear physics is needed are manifold. First of all, the Standard Model does not incorporate the gravitational interaction in its framework. This could be thought to be just a detail - after all, gravitational forces acting on elementary particles are teeny-tiny in comparison with all the others. However, quite apart from the fact that we do not yet know how to model gravitation at the quantum level, an extension of the Standard Model to include that force does not seem possible: one must rewrite things from scratch.

Apart from the neglect of one of the four fundamental interactions, the Standard Model presents us with a surprising paradox if we accept it as a fundamental theory: the mass of the Higgs boson, the particle we found at the LHC in 2012, appears unnaturally small. A dozen unrelated quantum effects contribute to determine the value of that parameter of the theory, and they are all orders of magnitude larger than the final result. The paradox can be explained with an analogy; there are several possibilities, so let me pick the one below today.

Suppose that the sum of the profits and losses of ten companies turns out to total $10. If you only know this fact, and that a priori profits and losses in any given year appear to be more or less equally likely in the highly competitive market where these companies operate, what kind of gross revenues do you expect the average company to have? Knowing only these facts, you might expect something like a string of lemonade stands in a busy residential neighborhood. But you would be shocked to learn that the companies each had gross revenues of tens of billions of dollars a year, and that the total profit nets out to $10 simply by chance. You would suspect, rather, that someone had carefully combed through tens of thousands of corporate reports to come up with a combination that balanced so well on purpose. Physicists, similarly, suspect that the quantum contributions to the Higgs boson mass balance so precisely because of some hidden structure that we have not yet discovered: such a near-perfect cancellation looks unnatural in the absence of some non-random principle that enforces it.
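To put a rough number on that intuition, here is a minimal sketch in Python, with entirely made-up scales: ten "companies" whose yearly results are each drawn uniformly between minus and plus fifty billion dollars, and an estimate of how often the ten results would cancel to within ten dollars of zero by pure chance.

```python
# A toy version of the naturalness argument, with made-up scales:
# ten "companies" whose yearly results are each drawn uniformly
# between -$50B and +$50B. How often would they cancel to within $10?
import math
import random

N_COMPANIES = 10
SCALE = 50e9        # each yearly result is uniform in [-SCALE, +SCALE]
TOLERANCE = 10.0    # the observed near-perfect cancellation, in dollars

# By the central limit theorem, the total is roughly Gaussian with mean 0:
sigma_one = 2 * SCALE / math.sqrt(12)           # std of one uniform result
sigma_sum = sigma_one * math.sqrt(N_COMPANIES)  # std of the total

# Chance that the total lands within +-TOLERANCE of zero:
# about 2 * TOLERANCE * (Gaussian density at zero).
p_cancel = 2 * TOLERANCE / (math.sqrt(2 * math.pi) * sigma_sum)
print(f"typical spread of the total: ~${sigma_sum:.2g}")
print(f"chance of a $10 cancellation: ~{p_cancel:.1e}")

# Sanity check of the typical total by direct simulation (the $10
# cancellation itself is far too rare to hit with a feasible sample):
totals = [sum(random.uniform(-SCALE, SCALE) for _ in range(N_COMPANIES))
          for _ in range(100_000)]
mean_abs = sum(abs(t) for t in totals) / len(totals)
print(f"mean |total| from simulation: ~${mean_abs:.2g}")
```

With these arbitrary numbers, the $10 cancellation comes out as roughly a one-in-ten-billion coincidence - which is why physicists reach for a structural explanation rather than invoking luck.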


The paradox above, and the failure of the Standard Model to be "gravitation-ready", make us believe that that very beautiful theoretical construction is not the final word. Plus, we tend to accuse the model of a sin it need not be guilty of: we have a feeling that a complete theory of subnuclear interactions should also incorporate an explanation of the dark matter in the Universe.

We know dark matter exists, and we believe that an as-yet-undiscovered neutral massive particle could explain it beautifully - but this implies that the model requires a modification.

How to prove it is not it

There are two ways to break the Standard Model: a direct and an indirect one. The direct way consists in finding some new particle or new force that is unknown to the model. The Large Hadron Collider experiments are trying very hard in that direction, so far with no luck. And then there is a more subtle, indirect way: compare the Standard Model prediction of some observable quantity with its actual measurement, looking for discrepancies. This is being pursued both at the LHC and at other particle physics facilities around the world: to find a discrepancy you do not need to look at the highest possible collision energy, as the anomalous effect could be one driven by low-energy reactions.

There are theoretical reasons to believe that it is in the physics of B hadrons - particles containing a bottom quark - that subtle deviations from the Standard Model predictions can best be spotted. That is one reason why several experiments are studying those particles in great detail. LHCb is one such experiment; another is Belle. While LHCb detects B hadrons in the same proton-proton collisions that are studied by CMS and ATLAS, Belle looks at B hadrons produced in much softer electron-positron collisions. The advantage of electron-positron collisions is that they are extremely clean; the disadvantage is that the production rate of B hadrons is orders of magnitude smaller there.

Measurements of P5'

Some time ago, the LHCb collaborators produced an intriguing result. They studied decays of B hadrons that produced a kaon plus a pair of charged leptons - electrons or muons - and found that some of the angular variables which describe the configuration of the decay products looked different from the Standard Model predictions. In particular, one of them, called P5', was seriously at odds with the model, at the level of over three standard deviations.
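(For the record, and assuming the conventions of the standard angular analysis of B → K* l+ l- decays: P5' is one of the so-called "optimized" observables, built from the angular coefficient S5 and the fraction FL of longitudinally polarized K* mesons as P5' = S5 / sqrt[ FL (1 - FL) ]. The ratio is constructed so that hadronic form-factor uncertainties largely cancel, which is what makes it a good place to look for deviations.)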

Belle has now repeated the measurement performed by LHCb, and it confirms the previous result. You can see below the Belle measurement points (in black) compared to the LHCb ones, in a graph which shows the value of P5' as a function of the invariant mass squared of the lepton pair. The Standard Model predictions are shown by brown bars, and are only available in three of the four investigated bins. Also note that there are two regions of the horizontal axis where the measurement cannot be done: there the two leptons "resonate" at the J/Psi and Psi(2S) mass, making the observation of the non-resonant decay mode impossible.
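As an aside, it is easy to see where those two forbidden regions sit: the vetoes cover the squared masses of the two charmonium resonances. A quick back-of-the-envelope sketch in Python, using the known charmonium masses:

```python
# The vetoed q^2 regions sit at the squared masses of the two charmonium
# states that the lepton pair can "resonate" to (masses in GeV).
resonances = {"J/psi": 3.097, "psi(2S)": 3.686}

for name, mass in resonances.items():
    q2 = mass ** 2  # invariant mass squared of the lepton pair, in GeV^2
    print(f"{name}: q^2 ~ {q2:.2f} GeV^2")
# prints: J/psi: q^2 ~ 9.59 GeV^2, psi(2S): q^2 ~ 13.59 GeV^2
```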

All in all, I am not too impressed by this result: it confirms a previous measurement, and the discrepancy with the theoretical calculation; however, I do not think the situation has changed from before. There is a partial mismatch (in one bin) with a theoretical calculation which is difficult to carry out, and which might well be improved. If the Standard Model is going to crumble and fall, I think it will not do so by being unable to properly describe this very detailed property of the decay of B hadrons.

Or, at least, I would not like this to happen: I much prefer the option of finding unexpected new particles in the high-energy tail of mass distributions produced by ATLAS or CMS... So let's wait for the new measurements of the two-photon mass distribution in the 2016 LHC data!

The bump observed at 750 GeV might be real... If it were so, the next few years could be incredibly entertaining!