Muons are very special particles. They are charged particles that obey the same physical laws and interaction phenomenology as electrons, but their mass, about 207 times larger (105.7 MeV, versus the half MeV of electrons), makes them behave in an entirely different fashion.

For one thing, muons are not stable. As they are more massive than electrons, they can convert the excess mass into energy, undergoing a disintegration (muon decay) which produces an electron and two neutrinos. And since everything that is not prohibited is compulsory in the subnuclear world, this process happens with a mean lifetime of 2.2 microseconds.

The quoted time interval is an eternity for elementary particles, but still, muons do not exist in ordinary matter - if you want one, you need to create it with a particle collision of sufficient energy, or you may wait for one to rain down as a cosmic ray from the upper atmosphere, where protons and light nuclei hit nitrogen or oxygen atoms and create showers of particles, from which mainly muons emerge at sea level. But as often happens, I am divagating.

Muons are important in studies of fundamental physics for two distinct reasons. The first is that they emerge as a signature in a number of interesting processes, including ones which could signal the existence of new physics beyond our currently accepted "standard model" of fundamental particles and interactions. Hence their identification and measurement in a particle collider experiment is a priority.

The second reason why muons are important is that they are very easy to identify: as leptons, they do not interact strongly with nuclear matter, and because their mass is much larger than that of electrons they radiate very little, so muons may traverse thick layers of material without being absorbed. This makes their experimental signature unequivocal, and all modern particle detectors for studies of collisions are indeed instrumented with dedicated muon detectors.

In order to work, a muon detector just needs to detect a charged particle and measure its trajectory: this is because these instruments are placed downstream of other instruments that absorb all other particles, measuring their energy from the cascade of interactions and radiation they undergo in dense materials. These latter instruments are called calorimeters.



The figure above shows how muons produce a very distinctive signature in the CMS detector, shown in a cut-away view with the interaction point to the left. It is by tracking the muon path in a magnetic field that muon energies can be measured in instruments like CMS: the path of all charged particles is bent when they traverse magnetic fields orthogonally to the field lines. But this effect becomes weaker and weaker as the particle energy increases: in fact, the curvature of the trajectory is proportional to the inverse of the momentum, and this has nefarious consequences for the precise measurement of momentum. A simple calculation (sketched below) will show you that, for a given precision of the position measurements along the curved track, the absolute resolution on the estimated momentum scales quadratically with momentum - or, equivalently, the relative resolution grows linearly with it.
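For the record, here is that simple calculation in its most basic form: write the measured curvature as k = qB/p, assume a fixed uncertainty on it coming from the position measurements, and propagate it to the momentum. (I am sketching away all the detector-specific factors - track length, number of measurement points - that enter the full formula.)

$$ k = \frac{qB}{p} \;\;\Rightarrow\;\; \sigma_p = \left|\frac{dp}{dk}\right|\,\sigma_k = \frac{p^2}{qB}\,\sigma_k \;\;\Rightarrow\;\; \frac{\sigma_p}{p} = \frac{\sigma_k}{qB}\,p \;\propto\; p. $$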

The above fact means that if you have a pretty good relative resolution (of the order of one or a few percent) on the momentum (or energy, which is the same thing when we discuss muons of GeV energies and above) of 100-GeV muons, this may become 20% at 1 TeV, 40% at 2 TeV, and so on. In fact, ATLAS, e.g., quotes relative resolutions of 8 to 20% for 1-TeV muons.
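If you want to play with the numbers yourself, here is a toy calculation (mine, not taken from the paper) that anchors the linear scaling to an assumed 2% relative resolution at 100 GeV:

```python
# Toy illustration: relative momentum resolution growing linearly with momentum,
# anchored to an assumed 2% at a reference momentum of 100 GeV (hypothetical values).
def relative_resolution(p_gev, ref_p_gev=100.0, ref_res=0.02):
    return ref_res * p_gev / ref_p_gev

for p in (100, 1000, 2000):
    print(f"p = {p:4d} GeV  ->  sigma_p/p ~ {relative_resolution(p):.0%}")
# prints ~2%, ~20%, ~40%: the numbers quoted above.
```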

As the recently updated European Strategy for Particle Physics directs us to investigate new ambitious projects, such as colliders of higher energy, we must reckon with the fact that we cannot build giant detectors with magnetic fields much larger than those of ATLAS and CMS. CMS stands for "Compact Muon Solenoid", and it was designed to be compact precisely to allow a strong magnetic field in its interior. A collider larger than the LHC may not allow that luxury anymore, while producing particles of higher energy than those CMS and ATLAS measure. What to do?

Enter our study. With Jan Kieseler, Lukas Layer and Giles Strong, I considered an idealized design of a granular calorimeter made of lead tungstate cells, and simulated large numbers of muons with energies from 100 GeV to 2 TeV impinging on its front face. The result is something like the picture shown below.



Muons of such high energy traverse the detector easily, but they do leave behind a trail of soft photon radiation. This is by no means similar to the electromagnetic shower of energetic electrons: the latter deposit _all_ their energy in the calorimeter, allowing a direct measurement of their energy; muons do not. And yet, is it possible to infer, from the precise pattern of those soft energy deposits, how much energy the incoming muon had?

An ancillary but important question is the following: how much added value does a high granularity of the calorimeter cells offer over an integrated measurement of the energy deposited by the muons? This is very important to understand, as the answer determines whether a single fat block of lead tungstate may do the job or not. But is there really informative value in the random pattern of those soft clusters of energy?

Yes, there is. And the article we published yesterday proves it. What we did was to use an algorithm I designed a long time ago (in 2003) for the regression of the mass of Higgs bosons from the measured energy deposits left by hadronic jets in the CDF detector. Back then the algorithm (called Hyperball, and based on a boosting of the nearest-neighbor method) allowed me to prove that CDF could in principle reach a 10% relative resolution on the Higgs boson mass, giving a lot of hope to the Tevatron for the just-started Run 2. Unfortunately, CDF was not endowed with as performant a silicon tracker as had been designed, and this took a toll on the performance of b-tagging, reducing the potential of the detection of Higgs decays to b-quark jets and making my study less relevant than it could have been. But that's another story...

In the past weeks we used the Hyperball algorithm to perform a regression of the muon energy from a set of 16 variables that summarize the pattern of deposited energy read out by the cells of the simulated granular calorimeter. I will spare you the details of the algorithm here, and just mention that it leverages gradient boosting and an ensemble of weak learners... If you need to know more, of course you are invited to read our article.
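To give a concrete flavor of what such a regression looks like in practice, here is a minimal sketch. This is of course not the actual Hyperball code: I am using scikit-learn's GradientBoostingRegressor as a generic stand-in, and the arrays X (the 16 summary variables per muon) and y (the true muon energies) are placeholders, not our simulated dataset.

```python
# A minimal sketch (not the actual Hyperball code): gradient-boosted regression
# of the muon energy from 16 summary variables of the calorimeter energy pattern.
# X and y below are toy placeholders standing in for the simulated dataset.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_muons = 10000
X = rng.normal(size=(n_muons, 16))            # 16 energy-pattern summaries per muon
y = rng.uniform(100.0, 2000.0, size=n_muons)  # true muon energies in GeV (toy values)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

reg = GradientBoostingRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
reg.fit(X_train, y_train)

E_pred = reg.predict(X_test)                  # regressed muon energies, in GeV
```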

Here let me jump to the results. The "money plot" is the graph below, which shows the regressed muon energy (on the vertical axis) as a function of the true muon energy (on the horizontal axis) for the kNN regressor (in blue) and for a regressor that only uses the total deposited energy as the basis of the prediction (in red). The lines represent percentiles of the distribution of predicted energy along the vertical axis: the lowest of the three blue curves is the 16th percentile, the middle one is the 50th percentile (the median), and the highest is the 84th percentile. The resulting band describes the resolution on the muon energy achieved as a function of the true energy.
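For the curious, here is a sketch of how such percentile bands - and the relative resolution they encode - can be extracted from pairs of true and regressed energies; the arrays below are toy placeholders, not our simulated data or our actual plotting code.

```python
# A sketch of how percentile bands like those in the "money plot" can be built
# from pairs of (true, predicted) muon energies; toy placeholder data throughout.
import numpy as np

def percentile_bands(y_true, y_pred, bin_edges, percentiles=(16, 50, 84)):
    """For each bin of true energy, return the requested percentiles of the
    distribution of predicted energy (16th, median, and 84th by default)."""
    bands = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (y_true >= lo) & (y_true < hi)
        bands.append(np.percentile(y_pred[in_bin], percentiles))
    return np.array(bands)                      # shape: (n_bins, len(percentiles))

rng = np.random.default_rng(1)
y_true = rng.uniform(100.0, 2000.0, size=50000)            # toy true energies in GeV
y_pred = y_true * rng.normal(1.0, 0.3, size=y_true.size)   # toy prediction, 30% smearing

edges = np.linspace(100.0, 2000.0, 11)
p16, p50, p84 = percentile_bands(y_true, y_pred, edges).T
rel_resolution = (p84 - p16) / (2.0 * p50)   # half-width of the band over the median
```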



It is clear from the graph above that the simple "energy sum" regression is incapable of measuring the true muon energy well. However, by adding spatial information to the mix, and proper statistical summaries of the pattern of deposited radiation, it is possible to achieve relative resolutions of 30% at the top of the investigated energy range.

The above observation has important consequences for the design of future instruments for a new hadron collider, or even for a high-energy lepton collider, as a calorimetric measurement of muon energies had not been proven to be feasible until now, at least in collider physics applications. For muons above 2 TeV, even a 4-tesla solenoid such as the powerful magnet of CMS starts to be challenged, and calorimetric determinations of the muon energy become an important, independent ingredient that may preserve the physics potential of the mentioned future machines.