Ben Allanach: Impact Of The CMS Supersymmetry Search On Global Supersymmetric Fits
    By Tommaso Dorigo | February 23rd 2011 03:53 PM | 27 comments
    Ben Allanach is a professor of theoretical physics at the University of Cambridge. Before that he was a post-doc at LAPP (Annecy, France), CERN (Geneva, Switzerland), Cambridge (UK) and the Rutherford Appleton Laboratory (UK). I noticed a recent article of his in the arxiv, and asked him to report on it here, given the interest that the recent LHC results have stirred in the community. He graciously agreed.... So let us hear it from him! 

    Blimey, I'm tired. I'm also elated and excited, and grateful to my lovely girlfriend, who's not only putting up with my long hours, distracted head and general ensuing grumpiness, but is even looking after me. After twelve years of preparation, developing skills, mathematical algorithms and computer programs, papers are coming from the LHC experiments that strongly constrain models of particle physics beyond the Standard Model. So I've been pushing my limits, working feverishly hard on interpreting the data coming from the experiments while still performing the usual university duties: lecturing, examining and sitting on committees.

    Twelve years ago I came to Cambridge as a post-doc at the Uni, and bumped into Prof. Bryan Webber having a glass of wine after a physics society talk in the physics department. We got chatting about scuba diving (Bryan's a keen diver and I'd wanted to give it a go) when he suddenly said "Oh yes, you've worked on supersymmetry before haven't you? Why don't you come along to a weekly meeting we're setting up with experimentalists here in the physics department?" At the time (and now), I was in the Department of Applied Mathematics and Theoretical Physics (DAMTP). I'd worked on supersymmetric models beforehand, but not in terms of actually searching for supersymmetric particles at colliders. This was what the weekly meeting in physics was aimed at: the simulation and analysis strategies for finding new particles, with a particular eye on the LHC. The group became known as The Cambridge SUSY Working Group, and it's brilliant! The local theorists and experimentalists get together every week and discuss everyone's project of the moment: we give each other help and advice. People who are away at CERN still phone in to the meeting, and our results are shown on private web pages. I've learned so much from this group, and feel very lucky to be a part of it. This term I can't go to the meetings, because they clash with some lectures on supersymmetry (SUSY) I'm giving to master's students (curses). Anyway, attending this meeting and working with colleagues there has taught me an awful lot about interpreting data, experimental analyses and searches. There's no way I'd have been able to even contemplate interpreting recent data from the CMS detector at the LHC without the Cambridge SUSY Working Group.

    The CMS results came out some six weeks ago: they were looking for any LHC proton collisions with high energy jets of particles accompanied by missing momentum. This can be interpreted as a search for supersymmetric particles (squarks and gluinos), because these decay into high energy jets of particles plus neutralinos, which carry off momentum because they are invisible to the detector (they are the particle that many people think might constitute the dark matter in the universe). CMS didn't find any significant excess of events with the right properties, so squarks and gluinos cannot have been produced in large numbers. The experimental paper produced exclusion limits in the plane of m_0 (related to the squark mass) and m_1/2 (related to the gluino mass).

    The models with masses below the red line are ruled out by the search, whereas heavier squarks and gluinos are allowed. That's because the LHC collisions haven't had enough energy to produce them yet (since E=mc^2, you need enough E to make more m). This is a simplification of course: actual collisions in the LHC are between constituents of the protons, and these carry some random fraction of the proton's energy in each collision. So, to some extent, you can probe further by just colliding more and more protons - that is what's going to happen this year. But the big effect comes when the energy of the collisions increases a lot: this is why the LHC results are going further than previous experiments (shown by the different coloured regions near the bottom of the plot).
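To get a feel for why raising the beam energy matters so much more than simply colliding more protons, here is a toy Monte Carlo. Everything in it is invented for illustration (the falling power-law stands in for real parton distribution functions, and the numbers are not from any experiment): it samples the momentum fractions of the two colliding constituents and counts how often their combined energy clears the threshold for pair-producing a heavy particle.

```python
import math
import random

def reach_fraction(sqrt_s_tev, mass_tev, n=100_000, seed=1):
    """Fraction of toy parton-parton collisions energetic enough to
    pair-produce particles of mass mass_tev (i.e. sqrt(s_hat) > 2*m)."""
    rng = random.Random(seed)

    def draw_x():
        # Toy falling distribution for a parton momentum fraction:
        # p(x) ~ x^(a-1) on [1e-4, 1] with a = 0.2, sampled by inversion.
        a, lo = 0.2, 1e-4
        u = rng.random()
        return (lo**a + u * (1.0 - lo**a)) ** (1.0 / a)

    hits = sum(1 for _ in range(n)
               if math.sqrt(draw_x() * draw_x()) * sqrt_s_tev > 2 * mass_tev)
    return hits / n

f7 = reach_fraction(7.0, 1.0)    # 7 TeV collisions, 1 TeV particles
f14 = reach_fraction(14.0, 1.0)  # 14 TeV collisions, same particles
print(f7, f14)
```

Because the momentum fractions fall steeply, doubling the collision energy multiplies the useful fraction of collisions by far more than any realistic increase in the total number of collisions would.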

    Anyway, in the intervening six weeks, I've been working out what the results mean for simple supersymmetric models globally, once we take into account other constraints too. For example: if neutralinos really are the dark matter, then there are only some parts of the m_0-m_1/2 plane that give the density that's observed in the universe today. Also, there's a funny effect in the anomalous magnetic moment of the muon that (slightly) prefers quantum corrections coming from non-Standard Model particles. This effect would like supersymmetric particles to be light. So we have a tug of war: under the hypothesis that supersymmetry is correct, some measurements would like the supersymmetric particles to be light, whereas the LHC is saying "they ain't that light". In this situation, you can do what's referred to as "a global fit". I wish that meant that everyone around the world suddenly dropped to the ground having a screaming temper tantrum. But a global fit is really a way of carefully balancing the competing demands of the data on supersymmetric particles. We expect that the CMS results will push the supersymmetric particles to be heavier in the global fit, but the question is: how much? And are there any other less obvious effects? Knowing what the masses of the supersymmetric particles are likely to be allows us to make "weather forecasts" for the next year of data taking: is it going to be raining supersymmetric particles, or will it be a supersymmetric desert? Actually, there is a very close analogy with what is done in weather forecasting, which also operates in an uncertain, somewhat random world. Also, we want to know how constraining the new searches are: are they reaching the supersymmetric masses that really fit the rest of the data (like dark matter etc.) well?
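As a cartoon of this tug of war, here is a toy chi-squared combination. The functional forms and every number in it are invented purely for illustration (the real fit combines full likelihoods for many observables): one term stands in for the indirect data preferring light masses, the other for a search excluding the light end, and the minimum of the sum is the compromise.

```python
import numpy as np

# Hypothetical numbers throughout - a cartoon, not the real likelihoods.
masses = np.linspace(200, 2000, 500)           # candidate sparticle mass, GeV

chi2_indirect = ((masses - 400) / 300) ** 2    # (g-2)-like term: prefers light
chi2_search = np.where(masses < 600,           # search-like term: punishes light
                       ((600 - masses) / 80) ** 2, 0.0)

chi2_total = chi2_indirect + chi2_search
best = float(masses[np.argmin(chi2_total)])
print(round(best))  # the compromise sits between the two preferences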

    What you see here in the plot from my paper is the m_0-m_1/2 plane (sorry, it's flipped by a right angle compared to the one above), with the most likely region shown in the lighter colours, taking into account all of the data together. If the simple version of supersymmetry that I'm considering is right, there's a 95% chance that it lies within the outer turquoise curves. You can turn this into something more intuitive: a prediction of the squark mass probability, for example. That's shown here:

    It's a "probability distribution": the higher the histogram, the higher the probability of the squark mass shown on the horizontal axis. This plot shows the difference made by the recent CMS search: before the search, I got the blue histogram, whereas including the search I get the red one. We see that small squark masses around 500-600 GeV become much less likely, but there is an interesting effect: squark masses in the range 800-100 GeV actually become more likely. That's because CMS saw a slight (not significant) excess in the number of collisions it was looking for, which prefers these intermediate masses. Anyway, the conclusion that, with 95% probability, the masses remain below 2000 GeV bodes well: if weak-scale supersymmetry is the correct theory, it seems that the supersymmetric particles are likely to be light enough for the LHC to discover them (although they might need the energy upgrade to 14 TeV total energy in a couple of years).

    The frenetic work continues, with new papers being released every week, all based on last year's data. I worked on the CMS results alone, because my Cambridge SUSY Working Group friends were too busy working on important experimental analyses to help. You may ask how I've now got time to write this blog? Well, luckily, some of my SUSY working group colleagues (Teng Khoo, Chris Lester and Sarah Williams) have now got some time to play with me on the next paper, on ATLAS results, which is a huge relief. My girlfriend is still being lovely to me, despite the strains I'm placing on myself/us. Imagine how feverish we will get if/when there's actually a strong new physics signal: I imagine there'll be a lot of Star Wars in physicists' marriages (May Divorce Be With You).
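The way a search with a slight excess reshapes a mass probability distribution can be mimicked with a toy Bayesian reweighting. Every number below is invented for illustration (the real analysis uses the full CMS likelihood and a proper prior): a broad hypothetical mass distribution is reweighted by a toy counting-experiment likelihood in which light squarks would have produced far too many events, very heavy ones almost none, and intermediate masses fit a mild excess best.

```python
import math
import random

rng = random.Random(0)

# Toy pre-search probability distribution for the squark mass:
# samples from a broad hypothetical prior (numbers invented).
masses = [m for m in (rng.gauss(900, 350) for _ in range(100_000))
          if 200 < m < 2500]

def search_likelihood(m):
    """Toy counting-experiment likelihood with a slight excess."""
    predicted_signal = 3040.0 * math.exp(-m / 130.0)  # falls steeply with mass
    observed, background = 13, 10                     # mild excess over background
    lam = background + predicted_signal
    return math.exp(-lam) * lam**observed             # Poisson, constants dropped

weights = [search_likelihood(m) for m in masses]
mean_before = sum(masses) / len(masses)
mean_after = sum(m * w for m, w in zip(masses, weights)) / sum(weights)
print(round(mean_before), round(mean_after))  # the search pushes the mass up
```

The reweighted distribution shows the same qualitative behaviour as the plot: the light-mass region is strongly suppressed, while intermediate masses (where the toy excess is best fit) gain probability.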


    Bonny Bonobo alias Brat
    Thanks for this great article Ben. In it you mention 'missing momentum', and other articles on this subject often also mention 'missing energy', but don't usually elaborate much more. You say that:
    they were looking for any LHC proton collisions with high energy jets of particles accompanied by missing momentum. This can be interpreted as a search for supersymmetric particles (squarks and gluinos) because they decay into high energy jets of particles and neutralinos, which carry off momentum because they are invisible to the detector (they are the particle that many people think might constitute dark matter in the universe).
    Is it possible to elaborate a bit more on either what this missing momentum or missing energy is, why it is invisible to the detector, how it is calculated, how accurate those calculations are or where the missing energy is going? For example, how much missing momentum or energy is generated in a day or a week at CMS or ATLAS doing these experiments? Why is missing energy commonly used to infer the presence of non-detectable particles such as the standard model neutrino and why is it expected to be a signature of many new physics events? What evidence is there for this, if it is missing momentum or energy that can't be detected? Sorry if these are stupid questions but I'm having difficulty finding the answers. Maybe you could just point me to an informative link, as I realize that you are very busy?
    My article about researchers identifying a potential blue green algae cause & L-Serine treatment for Lou Gehrig's ALS, MND, Parkinsons & Alzheimers is at
    Hi Helen,

    On the subject of the missing momentum or energy (for our purposes here, they are the same thing - there is a technical difference which sometimes causes physicists confusion, so I chose my words carefully):
    In school physics we learn that momentum can't be created or destroyed, usually by doing experiments with objects bumping together on low-friction surfaces. It's the same for the LHC collisions: beforehand, the momentum of each proton is 3500 GeV, but in opposite directions, so the total is zero. The total momentum must remain zero after each collision. If we see a significant total momentum in one direction, we interpret that as an invisible particle carrying off momentum in the opposite direction, balancing everything out. This is how the existence of neutrinos was first inferred.
    The particles are "invisible" because they don't feel the strong nuclear or electric forces: they are weakly interacting. These are precisely the properties that cold dark matter needs to have, and they are properties of the neutralino, the supersymmetric particle that we are proposing carries off the missing momentum.
    These things are so weakly interacting that the chance of them interacting with some nucleus or electron in the detector is negligible, and so they pass through unnoticed. To detect weakly interacting particles directly, you firstly need a hell of a lot of them, and then you need a huge detector. That's why neutrino detectors involve massive tanks of water (50,000 tons) surrounded by kit that detects light (look up Super-Kamiokande): you have to wait until one of the billions of neutrinos going through the water has a weak interaction and produces some other particles that do feel electric forces etc. These then produce light, which the kit sees. But with LHC detectors you can forget it: they are just too teeny to get any weak interactions in them.
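The bookkeeping behind "missing momentum" is itself very simple. Here is a minimal sketch with a made-up two-jet event (real analyses sum over all reconstructed objects in the detector); one detail worth noting is that only the transverse components are required to balance, since the colliding proton constituents carry unknown momentum along the beam.

```python
import math

# Each visible object in a collision: (pT in GeV, azimuthal angle phi).
# A made-up example event: two hard jets, roughly but not exactly back-to-back.
visible = [(450.0, 0.1), (320.0, 3.0)]

# Momentum conservation in the transverse plane: the vector sum of all
# transverse momenta must be zero, so any visible imbalance is attributed
# to invisible particles recoiling the other way.
px = sum(pt * math.cos(phi) for pt, phi in visible)
py = sum(pt * math.sin(phi) for pt, phi in visible)
met = math.hypot(px, py)        # magnitude of the missing transverse momentum
met_phi = math.atan2(-py, -px)  # it points opposite the visible imbalance
print(round(met, 1))
```

For this made-up event the two jets don't quite balance, so a sizeable missing transverse momentum is inferred pointing opposite their vector sum.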

    Bonny Bonobo alias Brat
    These things are so weakly interacting, that the chance of them interacting with some nucleus or electron in the detector is negligible and so they pass through unnoticed.
    Thanks for your reply Ben. I wondered: after the neutrinos have passed through unnoticed, could the Mikheyev-Smirnov-Wolfenstein (MSW) or 'matter effect' modify these neutrinos' oscillations, making them less weakly or more strongly interacting, perhaps even allowing them to become weakly attracted to gravity because they have been created underground in 'matter' within a collider and have a tiny mass? Could they then be travelling through many kilometres of the Earth's matter towards the magnetic core, oscillating into new flavours or forms along the way, and somehow even affecting the Earth's magnetic core?

    In other words, is it possible that these collider experiments, which are searching for things like the Higgs boson and SUSY sparticles, are inadvertently creating new types of undetectable, oscillating neutrinos in new ways and at higher energy levels and/or quantities than have been created here on Earth before, which might be very different to the usual ones arriving mainly from the sun through our atmosphere? A sort of reverse solar neutrino problem - and if this were happening, how would we know? Again, I'm sorry if these are stupid questions. I hope to study physics one day but can't at present, and I know you are very busy, so a brief reply like 'no' would be much appreciated.
    Well, it would be possible, but so far there is no evidence that this is happening. The LEP collider produced Z bosons, which sometimes decay into neutrinos. The Z's total decay "width" can be measured and used to infer the number of neutrino species: from this, it was deduced that there are only 3 types lighter than 45 GeV that couple to the Z with weak strength. But there's always room for sterile ones which don't interact with Z's...
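The neutrino-counting arithmetic behind that LEP deduction is short enough to write down. The widths below are rounded published values in MeV; treat them as illustrative inputs rather than a precision analysis.

```python
# Inferring the number of light, weakly coupled neutrino species from the
# Z lineshape, as at LEP.  Rounded measured/Standard Model widths in MeV.
gamma_total = 2495.2     # total Z decay width
gamma_hadrons = 1744.4   # width to quarks (hadrons)
gamma_lepton = 83.98     # width to one charged-lepton pair
gamma_neutrino = 167.2   # Standard Model width to one neutrino pair

# Whatever is not seen as hadrons or charged leptons is "invisible":
gamma_invisible = gamma_total - gamma_hadrons - 3 * gamma_lepton
n_nu = gamma_invisible / gamma_neutrino
print(round(n_nu, 2))    # close to 3: three light active neutrino species
```

The result comes out very close to 3, which is the basis for the statement that only three light neutrino types couple to the Z.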

    If there are additional invisible particles being produced, we should see signals in searches like the SUSY ones: the missing momentum signature is pretty universal, but the question is: what other particles accompany the missing momentum? That varies somewhat from case to case. You need some other particle to "tag" the fact that a collision has happened.

    Bonny Bonobo alias Brat
    So in other words, am I right in thinking that you are saying that no one really knows?
    Hi Helen,
    if I may intervene: from your saying "even affecting the earth's magnetic core" I have the impression that you don't have a good qualitative picture of what's going on with neutrinos... Nothing special will happen to those produced in accelerators that doesn't already happen to cosmic ones - and the ones from accelerators are far too few in number to matter, in comparison.
    I believe that what you're thinking about is more relevant to what's happening at Gran Sasso, where you have a real "factory" of neutrinos.

    In general, if I got the questions right, the hot question to ask is: "How can we make sure we detect unforeseen particles?". Ben answers this above. After all, the accelerators are built *in* *order* to find unforeseen particles ;)

    Bonny Bonobo alias Brat
    Thanks for your intervention Tulpoeid, yes it does look as though Gran Sasso may be more relevant to what I'm asking. You say that :-
    Nothing special will happen to those produced in accelerators that doesn't already happen to cosmic ones - and the ones from accelerators are very small in number to be useful, in comparison.
    However, from what I've been reading at Wikipedia, the MSW or 'matter effect' means that neutrinos in matter have a different effective mass than neutrinos in vacuum (like the cosmic ones you mention, which we are also partly shielded from by the Earth's magnetosphere). Since neutrino oscillations depend upon the squared mass difference of the neutrinos, oscillations may be different in matter than in vacuum, so when neutrinos go through the MSW resonance here on Earth they have the maximal probability to change their nature.

    So if the MSW effect can modify neutrino oscillations in the Earth, and future searches for new oscillations and/or leptonic CP violation may make use of this property, then why wouldn't this apply to the smaller numbers generated by accelerators as well? Couldn't these neutrinos hypothetically also oscillate eventually into sterile neutrinos, which are attracted by the Earth's gravity?
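For reference, the standard two-flavour vacuum oscillation probability that this discussion keeps returning to is easy to evaluate. The sketch below omits the MSW matter effects entirely (they add a density-dependent correction to the mixing), and the inputs are rough typical values for the atmospheric sector and a CERN-to-Gran Sasso-like beam, not a precise prediction.

```python
import math

def osc_probability(theta, delta_m2_ev2, length_km, energy_gev):
    """Two-flavour vacuum oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2 [eV^2] * L [km] / E [GeV])."""
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * delta_m2_ev2 * length_km / energy_gev) ** 2)

# Rough atmospheric-sector numbers: dm^2 ~ 2.4e-3 eV^2, near-maximal mixing.
p = osc_probability(theta=math.pi / 4,
                    delta_m2_ev2=2.4e-3,
                    length_km=732,       # roughly the CERN-Gran Sasso baseline
                    energy_gev=17.0)     # a typical beam energy
print(round(p, 3))
```

Even over 732 km, at these energies the oscillation phase is small, so only a small fraction of the beam changes flavour in vacuum.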

    Wikipedia partly supports this by claiming that neutrino oscillation is a quantum mechanical phenomenon 'whereby a neutrino created with a specific lepton flavor (electron, muon or tau) can later be measured to have a different flavor', and that observation of the phenomenon implies that the neutrino has a non-zero mass, which is not part of the original Standard Model of particle physics. It also says that a sterile neutrino is a hypothetical neutrino that does not interact via any of the fundamental interactions of the Standard Model except gravity, although it mixes with the other types of neutrinos.

    Wiki claims that 'Sterile neutrinos would still interact via gravity, so if they are heavy enough, they could explain cold dark matter or warm dark matter. Sterile neutrinos may also mix with ordinary neutrinos via a Dirac mass. The sterile neutrinos and ordinary neutrinos may also have Majorana masses. In certain models, both Dirac and Majorana masses are used in a seesaw mechanism, which drives ordinary neutrino masses down and makes the sterile neutrinos much heavier than the Standard Model interacting neutrinos. In some models the heavy neutrinos can be as heavy as the GUT scale (~10^15 GeV)'.

    This paper discusses the reactor antineutrino anomaly and the possible existence of a fourth neutrino, which they claim is slightly favored by some recent cosmological data analysis of WMAP combined with the Atacama Cosmology Telescope, and they say that a clear experimental proof of the presence of this fourth non-standard neutrino has become mandatory. They suggest that this can be given by 'the imprint on the energy spectrum in a very short baseline reactor neutrino experiment or by a new neutrino source experiment in a detector with energy and spatial resolution'. They also say that if the neutrino mixing hypothesis is the correct explanation, this implies the existence of a fourth neutrino, beyond the standard model.

    If fourth-generation, heavy, massive, sterile neutrinos are being generated by the matter effect acting upon oscillating neutrinos from accelerators or other particle collider or nuclear reactor experiments as they inadvertently pass through the Earth, couldn't they then be attracted towards the Earth's magnetic core? I read somewhere that the north and south poles have been moving approximately 50 kilometres a year since the 80s, when these sorts of experiments began on a larger scale, and that they were much more static prior to that. How do we know that sterile neutrinos aren't somehow affecting the workings of the Earth's magnetic core and the location of its poles when they arrive there? Again, I'm sorry if these seem like stupid questions.
    Bonny Bonobo alias Brat
    Ben, I should have also asked: 'and is it possible that this could make these man-made neutrinos oscillate and travel faster than light?'

    I wonder what your opinion is about these recently reported, hypothetical, superluminal, tachyonic Gran Sasso neutrinos really existing, and if they do, what effects do you think they could be having here on Earth and elsewhere in the universe as they theoretically keep accelerating towards infinite velocity via Cherenkov radiation? What would happen if they eventually slowed down, via another unknown or reverse anti-'matter effect' process, and became slower than the speed of light somewhere in the infinite universe - could that release infinite energy and create another 'Big Bang'? Again, I'm sorry if these are stupid questions.
    Nice article.

    "squark masses in the range 800-100 GeV actually become more likely"

    You meant 1,000 GeV, not 100 GeV. Well, my readers have known that 900 GeV is the preferred value for gluino and squark masses based on the evaluation of all the previous data. Antisupersymmetric jihadists should better start to save the money for their bets.

    Thanks, Lubos. D'Oh! Yes, a slip of the keyboard: I did indeed mean 1000 GeV...

    Good nickname. But do you know why we don't live in Matrix? The food would be better. ;-)

    More importantly than the blog entry, the paper is really good. You can do pretty much the same things that a dozen authors - whom I respect - can only do as a large group. :-)

    If those colored superpartners are at 900 GeV, will it take half a year to see them at 3 sigma, or longer?

    For others, if you want to know whom I mean by Ben's competitors, see

    Dear Ben, sorry for not having read your paper yet, but still, a question about this post alone might be useful to all of its readers. Now, I'm not asking this as an opponent of SUSY (which I am :), but supposing that you were talking about anything in the world:

    You find that the probabilities for the best mass estimate change, after taking into account that some of the lowest ones are already excluded (yes, always talking up to a certain confidence level). Why is this more important than your previous result?
    My concerns in more detail are: (i) the total probability still remains 1, so I wouldn't fret over the distribution becoming steeper somewhere after one more constraint, while I'd be interested in learning if the probability for the squark's existence took a blow or not; (ii) your new data gave you more constraints and your fitting function changed shape accordingly, which is their job to do ... is there a look-elsewhere effect here as well? Is there a reason for the evolution in shape between every update of the data to be something to care about?

    I repeat that I'm not trying to make fun or anything, but I feel that I'm seriously missing the point of the post - maybe because I was expecting a different subject after all this fuss, but still I'll never know if I don't ask!


    Hi Tulpoeid,

    In answer to your concerns:
    (i) You're asking for a hypothesis test. This is basically an unresolved problem: there are problems with whichever statistical technique one uses to try to attach a probability to "the squark's existence". Parameter estimation, which is what I am doing, is on the other hand well understood.
    (ii) There is the question of really how constraining the new CMS search data are. I didn't go into this in the blog, but the answer is: CMS data are really only just beginning to nibble away at the edge of the high probability region. The recent ATLAS data though, will actually take a reasonable bite.

    I guess one thing you can take away is that despite many of the claims being made elsewhere, there is still plenty of room for supersymmetry, even SUSY which fits the other indirect data. There's also plenty of room for it to be found later this year (this year's searches will reach up to around 1000 GeV), or in years hence when the energy is upgraded.

    The point of my post was more for non-specialists to get a feel for what doing this kind of work is like and why we do it... Oh, and no worries - SUSY opponents are my friends! I'd only give weak scale SUSY a 50/50 chance (although that's much higher than other alternatives except maybe the good ole Standard Model up to the Planck scale). In a few years, or maybe sooner, I'll be glad we can stop all of this betting and just listen to what the data tell us...

    Thanks - I think this post does indeed succeed in letting people peek into how your work is done. My confusion was probably brought about by the recent hype and the hope for a post with a different focus -- and, looking back, I think it might in particular have been brought about by a comment by Tommaso yesterday: "Still, the common attitude is that the exclusion actually increases the global likelihood of the global fits (...)". It was something along these lines that I was expecting.

    After your explanation I decided to see if Tommaso was indeed talking about your article, and it seems he was. With a quick look I see that you use the improved cross-section estimates to support that "the prospects for SUSY discovery next year marginally improve after the inclusion of the alpha_T search results".
    So my initial objection still holds a bit: how can we say that the prospects improve?! The prospects wouldn't change a bit whether or not the LHC had produced the recent results! It's the calculation of the prospects that improved.

    But allow me to conclude by saying that I understand you meant they improve with respect to our previous knowledge, although it sounded like hype to me - because, yesterday, I thought that people who made a fuss over the hype of a mere article title were exaggerating :)

    Hi Ben,

    on the fact that ATLAS takes a bigger bite out of the parameter space: please be aware of a subtle but crucial point, which I am sure most readers of the experimental papers will overlook.

    ATLAS in this paper used a method which is quite unconventional (in the sense that it does not belong to the wide list of known methods discussed by Highland in 1986, augmented by CLs and Feldman-Cousins in the late nineties). They basically power-constrain the limit when the data fluctuate more than 1 sigma lower than the expected backgrounds, but the rest of their exclusion region for mu versus X (if you know what these parameters are in a confidence belt plot) excludes mu for any given X much more aggressively than all the other "respectable" methods on the market. They call their method a 16% power-constrained limit, if I do not err: 16% because it is the one-sigma tail of a Gaussian. They used this same method in another paper (the WW and Higgs search one), but there they explained well what they were doing, and compared their results to one of the "common" methods, i.e. CLs. Here they omitted this crucial step, which is important when one compares results of different experiments.
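The kind of problem these prescriptions address shows up already in a toy counting experiment. The sketch below is a deliberately simplified illustration (real analyses use profile likelihoods, and the power-constrained method clamps the limit rather than dividing by CLb): when the observed count fluctuates well below the expected background, a naive CLs+b limit can exclude even zero signal, which is the pathology that CLs (and, in a different way, power-constraining) guards against.

```python
import math

def poisson_cdf(n, lam):
    """P(N <= n) for a Poisson distribution with mean lam."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(n + 1))

def upper_limit(n_obs, background, use_cls, cl=0.95, step=0.01):
    """Toy 95% CL upper limit on the signal mean s in a counting experiment.
    use_cls=False: exclude s when P(N <= n_obs | b + s) < 1 - cl (plain CLs+b)
    use_cls=True:  divide by P(N <= n_obs | b) first              (CLs)"""
    clb = poisson_cdf(n_obs, background)
    s = 0.0
    while s < 100.0:
        clsb = poisson_cdf(n_obs, background + s)
        if (clsb / clb if use_cls else clsb) < 1 - cl:
            return s
        s += step
    return s

# A downward fluctuation: 3 events observed on an expected background of 10.
lim_clsb = upper_limit(3, 10.0, use_cls=False)
lim_cls = upper_limit(3, 10.0, use_cls=True)
print(lim_clsb, lim_cls)  # plain CLs+b excludes even s = 0; CLs does not
```

Dividing by CLb penalises the limit when the experiment had little sensitivity, preventing the lucky downward fluctuation from excluding signal strengths the experiment could never actually have seen.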

    ATLAS and CMS have work ahead in order to converge to an agreed-upon way to compute limits.

    Thanks for the update about the statistics, Tommaso. Can you comment on whether CMS will be extending their reach with more conventional jets + missing momentum cuts? Is it that alpha_T was chosen because it was felt to be more robust with respect to QCD backgrounds, but is actually less sensitive than the more conventional searches? (Of course, the slight excess in the number of events makes quite a difference to CMS's inferred exclusion too, but I guess you can't legitimately do much about that....)

    Re. alpha_T: As someone who was involved in the development of this observable for CMS, I can confirm that it was indeed chosen for its robustness against detector mismeasurement in QCD events, and that, yes, this robustness comes at the cost of signal sensitivity (this is actually the subject of my thesis, which I'm writing up now!). A little more information can be found in a talk I gave at the UK IoP HEPP meeting held at UCL last year.
    Thanks for the post, by the way - nice to see our plot being put to good use :-)

    To clarify a bit, the paper by Virgil Highland I mention above is here.

    Great post, thanks for the update!

    A pleasure to read, thanks...and for the SUSY folks.

    It's not over till...

    The weakness of these global fits is that they indirectly fix sparticle masses from the g-2 measurement.
    But nobody knows if it is a real anomaly.

    I think it is a real anomaly (ie not just a chance fluctuation, since there are several different measurements that point in the same direction) - but we agree that it could be due to the Standard Model calculation having additional subtle uncertainties in it, and not due to the quantum fluctuations of additional particles. If you take it out as an observable, the data say SUSY could be somewhat heavier...but then it doesn't solve the technical hierarchy problem so well.

    Two points:
    1) (g-2)_mu has been analyzed by many^2 physicists already. The question of whether it could be fixed by discovering a new hadronic contribution was analyzed (with negative outcome), as was the issue of e+e- data vs. tau data (with an interesting recent paper by Jegerlehner). The overall conclusion: one has to take that 3.x sigma discrepancy seriously. Therefore one has to include it in a global fit. Anything else would be "selecting data according to your liking".
    2) In an earlier paper we nevertheless analyzed what happens if (g-2)_mu is removed from the fit (see fig. 2 there). We found that the best-fit values are essentially unchanged, but the Delta chi^2 becomes much more shallow. Therefore it is not correct to say that the masses are all driven by (g-2)_mu; it is the upper (and lower) limits that are influenced by (g-2)_mu.

    Lubos links to this post!!!

    What a piece of rubbish and what a waste of time. What's the point in doing these global fits in the LHC era when experiments exclude new chunks of parameter space faster than you perform your analysis? Just wait and see.
