What is a photon jet? Despite their exotic name, photon jets are a well-studied object nowadays. The original studies were performed by experimentalists aiming to test quantum chromodynamics, who spent their time discriminating prompt photon production in hadron collisions from backgrounds. I remember many such studies being performed in the '80s and '90s by my CDF colleagues, especially within the "QCD working group".
The importance of the detection of single, isolated photons of high energy has risen enormously since then, given their role in the discovery of the Higgs boson. Photon jets are in fact the background to beat down if you want a neat peak of H --> γγ decays to pop out of a mass histogram constructed from events featuring two photon candidates.

The fact is, the photon is not the most straightforward thing to detect in our apparata, as it leaves no trace of its passage through our tracking devices, being electrically neutral; it manifests itself as a shower of secondary electron-positron pairs once it hits heavy material in the calorimeters.

The shower process, if you are curious, is caused by the photon converting into an electron-positron pair in the Coulomb field of a heavy nucleus; the electron and positron, each carrying roughly half of the original photon's energy, then produce other photons of still lower energy by bremsstrahlung in the vicinity of other nuclei, and the process continues until there is no more energy to spend on further splittings.
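The cascade described above can be caricatured with the classic Heitler toy model: at each step every particle splits in two, each child carrying half the energy, until the energy per particle drops below some critical value. The function name and the numbers here are my own illustrative choices, not anything from the ATLAS analysis:

```python
# Toy Heitler model of an electromagnetic shower (illustrative sketch only).
# Assumption: at each radiation length every particle splits into two,
# each carrying half the parent's energy; the cascade stops when the
# energy per particle would fall below a critical energy e_crit_gev.

def heitler_shower(e0_gev, e_crit_gev=0.01):
    """Return (depth of shower maximum in radiation lengths,
    number of particles at that depth)."""
    depth = 0
    energy = e0_gev
    while energy / 2 >= e_crit_gev:
        energy /= 2
        depth += 1
    return depth, 2 ** depth

depth, n_max = heitler_shower(100.0)  # a 100 GeV photon, toy critical energy 10 MeV
print(depth, n_max)  # -> 13 8192
```

Crude as it is, the model captures the two features that matter here: the shower depth grows only logarithmically with the photon energy, and the detector ultimately sees a large collective population of secondaries rather than the photon itself.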

Because of the lack of a neat charged track, and because the detection process basically looks at the collective properties of the electromagnetic shower of secondaries in the calorimeter, distinguishing a single energetic photon - a "prompt photon" - from a bunch of collimated photons, which may be produced when a quark or gluon fragments into only neutral pions or eta mesons that then each decay to two collimated photons, is no easy feat. One is bound to consider the "fine structure" of the shower profiles, trying to see whether these are more similar to what one expects from a single incident photon or from two or more photons travelling almost in parallel.

The news is that ATLAS recently inverted the usual paradigm "yesterday's signal is today's background", going after what was yesterday's background as a tool to find a possible signal of new physics in today's data. They took their 2015 and 2016 collisions and selected pairs of "photon jets" which could originate from several collimated photons - a perfect signature of neutral pion- or η-dominated hadronic jets - and proceeded to study the invariant mass spectrum of those pairs of objects.

The idea is that some new physics scenarios include new heavy particles X that might decay into a pair of light mesons, call them "a". The a mesons could then decay into two photons, or into three neutral pions. In the first case one would end up with two jets, each made up of two collimated photons; in the second case one would have six photons in each jet (as the neutral pion itself decays to a photon pair).
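The photon bookkeeping for the two hypotheses above is simple enough to spell out explicitly. The mode labels below are just stand-in names for the two decay chains discussed in the text:

```python
# Counting final-state photons for X -> a a, under the two hypothetical
# 'a' decay modes discussed above ('a' is the generic light meson of the text).

PHOTONS_PER_A = {
    "a->2gamma": 2,      # 'a' decays directly to a photon pair
    "a->3pi0": 3 * 2,    # three neutral pions, each pi0 -> 2 photons, so 6
}

def photons_per_event(a_mode):
    """X produces two 'a' particles, i.e. two photon jets per event."""
    return 2 * PHOTONS_PER_A[a_mode]

print(photons_per_event("a->2gamma"))  # -> 4  (two jets of 2 photons)
print(photons_per_event("a->3pi0"))    # -> 12 (two jets of 6 photons)
```

Either way, each "jet" is nothing but a narrow bundle of photons, which is exactly why the shower-shape discrimination discussed earlier becomes the crux of the analysis.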

The result of the ATLAS search is null: they do not find any anomaly in the mass distributions of these events. They thus use the data to set upper limits on the rate at which the hypothetical X particle could be produced in LHC proton-proton collisions. The analysis of the data is complex and I do not think I need to describe it here, but the search for a bump in a smooth background spectrum is a topic that we have discussed in this blog several times. ATLAS in fact uses the same technology that allowed the Higgs discovery in the diphoton decay mode, as well as the evidence for the infamous 750 GeV diphoton peak in 2015, which later proved to be a mere fluctuation. The relevant mass spectra of the present analysis are shown below (the data are divided into two different categories depending on a variable characterizing the shape of the calorimeter deposits, enhancing the sensitivity to different signatures).

As you know, I like to be picky sometimes with the way data are displayed in scientific plots. In this case, if I had to level a criticism it would be at the quoting of a "significance" for the difference between data and the chosen background model in each bin. I doubt one can properly speak of a significance here - at most I suppose it could be called a "pseudo-significance", as it is likely computed without accounting for all the nuisance parameters affecting the agreement of data and model; and calling the departures by that name is a fallacious characterization for several other reasons, too.

For example, look at all those empty bins in these histograms, on the right of the distributions. Do they really represent something meaningful once you subtract the background from the data, as is done in the lower panels? E.g., I read positive "significances" in the left panel at masses of roughly 1700, 1800, and 1880 GeV; what about all the intermediate bins - do they have "zero significance" because they observed zero events? This is contradictory: the predicted rate there is positive, and yet the significance has not gone below zero in those bins (as it does, e.g., for the bins around 1500 GeV).

Also, the very notion that you can assign a "significance" to a single bin's departure from a model is deceiving in my opinion, although it is perfectly well defined mathematically (significance is a one-to-one function of the tail probability). That is, the Poisson tail probability of the counts in a bin, given a Poisson mean, surely does not qualify as a meaningful metric of a signal being present, since a real signal would spread over many different bins. In that sense, plotting the "significance" only for bins with non-zero counts is deceiving.
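To make the "one-to-one function of the tail probability" concrete, here is a minimal sketch of the per-bin computation I suspect lies behind such plots: a Poisson tail probability converted into a Gaussian z-score. This is my own reconstruction of the generic recipe, not the actual ATLAS procedure (which, as noted, would need nuisance parameters to deserve the name "significance"):

```python
# Sketch: per-bin "significance" as the Gaussian z-score equivalent of a
# one-sided Poisson tail probability. Standard library only; illustrative.
from math import exp
from statistics import NormalDist

def poisson_tail(n_obs, mu):
    """P(N >= n_obs) for N ~ Poisson(mu), summing the CDF term by term."""
    term = exp(-mu)
    cdf = 0.0
    for k in range(n_obs):
        cdf += term
        term *= mu / (k + 1)
    return max(1.0 - cdf, 0.0)

def z_score(p_value):
    """Convert a one-sided tail probability into a Gaussian significance."""
    return NormalDist().inv_cdf(1.0 - p_value)

# e.g. observing 9 events in a bin where the model predicts 3:
p = poisson_tail(9, 3.0)
print(round(p, 4), round(z_score(p), 2))  # a bit under 3 "sigma"
```

Note that for a bin with zero observed events and a positive predicted rate, this recipe would naturally yield a negative z-score - which is precisely why quietly dropping the empty bins from the panel strikes me as inconsistent.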

Anyway, diatribe mode off. One very cool graph that ATLAS produced for this analysis is the one below. It is a "temperature plot" where the significance of possible upward fluctuations of the data over the background (and this time I am more inclined to accept the definition as a sound one, given that the plotted number comes from a full-fledged fit) is described by the colour of points in the plane of the unknown masses of the X and a particles. The a particle mass is not reconstructed directly, but its value has an impact on the selection efficiency of photon jets and on their discrimination from the single direct photon background (yesterday's signal!).

So, in the end we acquire some further knowledge about the lack of new physics in this particular manifestation. But, interestingly, we also acquire a new tool for the investigation of the unknown - photon jets as a signature are indeed something to explore more closely in the future!


Tommaso Dorigo is an experimental particle physicist who works for the INFN at the University of Padova, and collaborates with the CMS experiment at the CERN LHC. He coordinates the European network AMVA4NewPhysics as well as research in accelerator-based physics for INFN-Padova, and is an editor of the journal Reviews in Physics. In 2016 Dorigo published the book “Anomaly! Collider physics and the quest for new phenomena at Fermilab”. You can get a copy of the book on Amazon.