A new paper by Davison Soper and Michael Spannowsky was posted to the Cornell preprint arXiv last week. It proposes a new technique to reconstruct the decay of heavy particles within hadronic jets, and shows how this can improve the sensitivity to heavy new particles, studying in particular the case of a heavy Z' boson decaying to boosted top quark pairs. I believe the technique is very interesting, and I will try to give a few impressions of it here; before I do, let me introduce the topic for outsiders.
The search for heavy particle decays into hadronic final states is a fascinating topic, and it is now almost 30 years old. Hadronic jets were first seen in the seventies, when the energy of electron-positron collisions was pushed up to values which made it possible, for the first time, to observe that the final-state particles were distributed unevenly, in structures elongated along a preferential axis. This was direct evidence for the production of quark-antiquark pairs in electron-positron annihilation.
It took ten more years to start using those jets as a direct measurement of the quark or gluon emitted in the hard collision. The UA1 and UA2 experiments at the CERN SppS, the 546-630 GeV proton-antiproton collider where the W and Z bosons were discovered, used jets in the search for the top quark and to study quantum chromodynamics.
It was the UA2 collaboration that first showed how the invariant mass of jet pairs could be used to find a signal of the hadronic decay of W and Z bosons. The field of jet spectroscopy developed quickly afterwards, and was taken up by CDF and DZERO at the Fermilab Tevatron, where jets were used not only to search for new heavy resonances, but also to find a signal of the top quark in 1994/95. The fully hadronic decay signal of top quark pairs was seen by CDF in 1996 (by my group in Padova, and the subject of my laurea thesis), and later a signal of Z decays to b-quark pairs was observed by CDF in 1998 (my Ph.D. thesis).
More recently we have seen many searches for hadronically-decaying resonances flourish, both at the Tevatron and at the LHC. The rush to discover the Higgs boson at these facilities included attempts at reconstructing the decay of the Higgs to pairs of b-quark jets; these signals are still at the level of three standard deviations, but the new LHC data will push them to the observation level of five standard deviations and above.
In parallel with Higgs searches, the large energy of LHC collisions has opened a new paradigm as far as jet spectroscopy is concerned. Boosted heavy objects, such as top quarks, can decay yielding "fat jets": the three quarks emitted in the hadronic decay of the top coalesce into a single stream of hadrons, which however retains the information of its origin by displaying a large total invariant mass and some substructure. It is exactly by exploiting that substructure that "top-tagging" techniques have been developed.
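To make the "large total invariant mass" statement concrete, here is a minimal sketch of how one computes the invariant mass of a fat jet from the four-momenta of its constituents; the numbers in the usage note are hypothetical toy values, not real data.

```python
import math

def invariant_mass(particles):
    """Invariant mass of a set of (E, px, py, pz) four-vectors (GeV).

    The constituents of a fat jet from a boosted top decay sum up to
    a mass near the top mass, while an ordinary quark/gluon jet
    typically yields a much smaller value.
    """
    E, px, py, pz = (sum(component) for component in zip(*particles))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))
```

For instance, two massless constituents of 50 GeV each, emitted back-to-back within the jet, combine to an invariant mass of 100 GeV, even though each constituent is individually massless.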
Of course, one does not study boosted top quarks for the sake of studying the top by itself: the interest lies in the possibility of detecting heavy objects producing, e.g., pairs of boosted top quarks. A new Z' boson, for instance, could do that. Of course a Z' boson could also be spotted quite simply by looking for its decay to muon pairs; however, there are several models suggesting that a Z' boson could be "leptophobic", and thus prevented from decaying into lepton pairs. Such an object would most easily be observable by employing top-tagging techniques.
At the LHC, the technique to spot heavy particle decays within fat jets consists in cleaning up the detected energy deposits and identifying sub-jets whose combined invariant mass agrees with the hypothesized origin: for instance, pairs of sub-jets may be the result of a W or Z boson decay, while triplets may yield the mass of the top quark. These techniques are now quite advanced and powerful, but there is evidently still room for improvement.
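The triplet-matching idea above can be sketched in a few lines. This is an illustrative toy criterion (pick the sub-jet triplet whose combined mass lands closest to the top mass), not the selection actually used by any experiment:

```python
import math
from itertools import combinations

M_TOP = 173.0  # GeV, approximate top quark mass

def mass(subjets):
    """Invariant mass of summed (E, px, py, pz) four-vectors."""
    E, px, py, pz = (sum(c) for c in zip(*subjets))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

def best_top_triplet(subjets):
    """Among all sub-jet triplets, return the one whose combined
    invariant mass is closest to the top mass -- a toy version of
    the hypothesis test described in the text."""
    return min(combinations(subjets, 3),
               key=lambda triplet: abs(mass(triplet) - M_TOP))
```

A real top tagger would add further requirements, e.g. that a pair within the chosen triplet be compatible with the W boson mass.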
Enter Soper and Spannowsky's idea. Their approach is similar to what is generically called a "matrix element" method. They consider two hypotheses for what originated the pair of fat jets: a signal one and a background one. The signal hypothesis could be that the jets originate from the decay of a pair of top quarks, the background one that the jets are just the result of the hadronization of a pair of quarks or gluons. Soper and Spannowsky construct a likelihood ratio of the two hypotheses, employing the probability of detecting the observed configuration under each of the considered causes. In constructing these probabilities one needs to compute suitable "transfer functions" connecting a given final-state object to an observed "micro-jet".
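The likelihood-ratio logic can be caricatured with a one-variable toy. Here I assume a Gaussian fat-jet mass distribution for the signal and a falling exponential for the QCD background; these shapes and their parameters are my own illustrative assumptions, not the authors' actual construction, which sums over full shower histories:

```python
import math

# Toy per-jet likelihoods (assumed shapes, NOT the paper's probabilities):
# signal: fat-jet mass distributed as a Gaussian around the top mass;
# background: falling exponential, typical of QCD light-jet masses.

def p_signal(m, m_top=173.0, sigma=20.0):
    return math.exp(-0.5 * ((m - m_top) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_background(m, slope=60.0):
    return math.exp(-m / slope) / slope

def log_likelihood_ratio(m):
    """Discriminant: large values favour the top-jet hypothesis,
    small (negative) values favour the QCD-jet hypothesis."""
    return math.log(p_signal(m) / p_background(m))
```

A fat jet with mass near 173 GeV then scores much higher than one with mass near 40 GeV, which is the whole point of the hypothesis test.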
The details are probably not suited to this short summary, so I invite whoever is interested to have a look at the paper. Here I limit myself to showing how the proposed technique can improve the sensitivity of a search for a 1500 GeV Z' decaying to top quark pairs. The authors compare the discriminating power of their reconstruction method to the "HTT" top-tagging algorithm, which has been used in similar searches by the ATLAS collaboration: the comparison is shown in a graph of background versus signal efficiency.
In the graph you can see the inverse background efficiency as a function of the signal efficiency of the Soper-Spannowsky method (in blue), compared to the HTT method (the points at a Z' efficiency of 0.1), for a 1500 GeV Z' boson. The higher the inverse background efficiency, the more the background is rejected, and so the better the discriminating power. The new method is a clear improvement in this particular case. [Don't be confused by the many curves, which refer to imagining the background as composed only of SM top pair production (in red) or of QCD light jets (in black).]
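For readers unfamiliar with such plots, each point on a curve like this comes from cutting on a discriminant and counting survivors. A minimal sketch, with hypothetical score lists standing in for real tagged events:

```python
def roc_point(signal_scores, background_scores, threshold):
    """Signal efficiency and inverse background efficiency
    ("background rejection") for a given cut on the discriminant.

    Each score is the tagger output for one event; an event passes
    if its score exceeds the threshold.
    """
    eff_s = sum(s > threshold for s in signal_scores) / len(signal_scores)
    eff_b = sum(b > threshold for b in background_scores) / len(background_scores)
    rejection = 1.0 / eff_b if eff_b > 0 else float("inf")
    return eff_s, rejection
```

Scanning the threshold over its full range traces out the whole efficiency-versus-rejection curve; a better tagger sits higher (more rejection) at the same signal efficiency.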
Next year the Large Hadron Collider is scheduled to start colliding proton beams at a centre-of-mass energy of 13 TeV. This 62.5% increase in total collision energy will make us sensitive to heavy new particles in various ways, depending on their characteristics and production mechanisms. For sure, a new Z' boson has a chance to show up with just a few months' worth of data. Hence I believe that Soper and Spannowsky's idea should be developed by ATLAS and CMS soon, in order to be ready for the new Run 2 data.