The Standard Model of particle physics has been under attack since its original formulation, in 1967, and yet it has so far resisted every assault; in so doing it has become one of the most thoroughly tested physical theories. Like it or not, the construction has stood the test of time so well that theorists and experimentalists alike feel threatened by the chance that the Large Hadron Collider, too, will fail to find new physics beyond what the model predicts.

Of course, the LHC experiments ATLAS and CMS will not remain empty-handed even if new physics should really prove to be far from "behind the door" (an expression that has been used in the past to picture the imminence of a discovery of Supersymmetry, for instance). A Higgs boson is in the cards, as are a number of quite rare phenomena: multiple vector boson production, three top quarks, Higgs-strahlung processes, vector boson fusion reactions, exclusive production processes, and rare decays. Experimental physicists will not get a Nobel prize for such things, but they will still have a lot of fun measuring these processes. Theorists, on the other hand, will be at risk of moving to the greener pastures of astroparticle physics and cosmology!

Because of these wishes for a final breakdown of the Standard Model, any inconsistency between theory predictions and experimental measurements automatically gets put under the microscope, and even 2-sigma discrepancies cause excitement and at times produce flurries of theoretical speculation. But not every inconsistency gets the same treatment: when systematic uncertainties are not, or cannot be, estimated with sufficient confidence, the discrepancy is likely to be archived soon and stop causing speculations about new physics sources.

Such is the case of the NuTeV measurement of a quite critical parameter of the Standard Model, the so-called Weinberg angle, sin²θW. NuTeV was an experiment at the Fermi National Accelerator Laboratory near Chicago. It operated in the years 1996-1997, when it received a beam of neutrinos produced by the Tevatron (which was operating in fixed-target mode back then, after the start of the upgrade of the CDF and DZERO experiments toward Run II), and recorded neutrino-nucleon interactions in its core.

With the bounty of neutrinos hitting NuTeV, one can produce a statistically very precise measurement of the Weinberg angle, by exploiting the different probabilities with which neutrinos and antineutrinos interact with matter through the emission of virtual W bosons (so-called "charged current" reactions, because the W boson is electrically charged) and Z bosons ("neutral currents"). However, the simple comparison of the reaction rates is affected by quite subtle effects, which produce systematic uncertainties that are hard to evaluate.
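In its simplest idealized form, this comparison is captured by the Paschos-Wolfenstein relation, which for an isoscalar target (and ignoring the subtle effects just mentioned) ties the ratio of neutral- to charged-current rate differences directly to the Weinberg angle. A minimal sketch, with an illustrative input value:

```python
# Idealized Paschos-Wolfenstein relation (isoscalar target, no sea-quark or
# isospin-breaking corrections): R_minus = 1/2 - sin^2(theta_W), where
# R_minus = (sigma_NC_nu - sigma_NC_nubar) / (sigma_CC_nu - sigma_CC_nubar)

def weinberg_angle_from_ratio(r_minus):
    """Extract sin^2(theta_W) from the neutral/charged-current rate ratio."""
    return 0.5 - r_minus

# e.g. a measured ratio of 0.2773 would correspond to sin^2(theta_W) = 0.2227
print(weinberg_angle_from_ratio(0.2773))
```

The real analysis applies sizable corrections to this textbook formula, and it is precisely in those corrections that the subtle systematic effects hide.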

Back in 2002, NuTeV produced an estimate of the Weinberg angle of 0.2277+-0.0013+-0.0009 (the first uncertainty statistical, the second systematic), when the theory prediction indicated instead a smaller value, 0.2227+-0.0004. This amounts to a discrepancy of more than three standard deviations: a quite serious one, if the uncertainties were to be believed. From the original Fermilab press release we read:

"Our picture of matter has held true for thirty years of experimental
results," said Fermilab Associate Director Michael Shaevitz, a NuTeV
cospokesperson. "With the NuTeV result, it's possible we may have stumbled
across a crack in the model. As yet, we don't know the explanation, but we
believe it may foreshadow discoveries just ahead at accelerator
laboratories."

NuTeV collaborator Kevin McFarland, assistant professor of physics at the
University of Rochester, emphasized that the NuTeV measurement would not be
so striking if the experiment had not achieved an extraordinary level of
precision, unprecedented for a neutrino experiment of its kind.

"Because we examined the interactions of millions of neutrinos and
antineutrinos, their antimatter counterparts," McFarland said, "we
determined that there is only a one in four hundred chance that our
measurement is consistent with the prediction. Unless this is a statistical
fluke, it looks as if neutrinos may really behave differently from other
fundamental particles. Further, experimenters using the Large Electron
Positron at CERN, the European Particle Physics Laboratory, recently
measured this same neutrino interaction in a different particle reaction.
They saw the same discrepancy we found, although with less precision. The
consistency between these two very different measurements is striking."


What I find striking, on re-reading the above text, is that reference is made to a "statistical fluke". A strong statement, which neglected to mention the main suspect behind the discrepancy: some underestimated systematic uncertainty. Another statement sends shivers down my spine:
The experimenters reported a three-sigma discrepancy in sin2thetaW, which
translates to a 99.75 percent probability that the neutrinos are not behaving like
other particles.

We have to pardon the Fermilab press office for this horrendous slip - it tells of a vast misunderstanding of the meaning of hypothesis testing. Even if the three-sigma effect were genuine, a 0.25% probability of observing data at least this discrepant if the theory is correct WOULD NOT MEAN that there is a 99.75% probability that another hypothesis is true, for God's sake!
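To see why, here is a toy Bayesian calculation. The likelihood under "new physics" and the prior below are made-up illustrative numbers, not anything from the NuTeV analysis; the point is only that even taking the 1-in-400 tail probability at face value, the probability that new physics is at play depends on how plausible new physics was to begin with:

```python
# Toy Bayes-theorem calculation (illustrative numbers, not from NuTeV):
# a p-value P(data | SM) = 0.0025 does NOT imply P(new physics | data) = 0.9975.

p_data_given_sm  = 0.0025  # the quoted "one in four hundred" tail probability
p_data_given_new = 0.5     # assumed likelihood of such data under new physics
prior_new        = 0.01    # assumed prior belief in new physics

posterior_new = (p_data_given_new * prior_new) / (
    p_data_given_new * prior_new + p_data_given_sm * (1 - prior_new)
)
print(f"P(new physics | data) = {posterior_new:.2f}")  # about 0.67, not 0.9975
```

With a more skeptical prior the posterior drops further, which is exactly why a small p-value alone cannot be read as the probability that the Standard Model is wrong.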

Anyway, despite the rather enthusiastic press release (those were years when there were no bloggers to crucify, so it was the laboratory's own task to overhype its experiments' results!), physicists around the world were not too convinced. There were, it is true, theorists who took the effect seriously, and produced hypotheses that new fancy Z' bosons were exchanged by neutrinos with matter, creating the observed asymmetry and the resulting mismatch of the Weinberg angle measurement with predictions excluding those new bosons from the picture. But the suspicion that the tiny systematic uncertainty quoted by NuTeV had been underestimated kept lingering.

Hypotheses for the discrepancy were put forth: the interaction between neutrinos and the target in NuTeV, and the resulting rates, depend on details of the target's composition and on subtle effects which I will not describe here - more for fear of misreporting them than because of any difficulty in making them understandable. The Particle Data Group, in the summary by Erler and Langacker ("Electroweak Model and constraints on new physics"), already reported these doubts in 2004:

"It is conceivable that the effect is caused by an asymmetric strange sea. A preliminary analysis of dimuon data in the relevant kinematic regime, however, indicates an asymmetric strange sea with the wrong sign to explain the discrepancy [72]. Another possibility is that the parton distribution functions (PDFs) violate isospin symmetry at levels much stronger than generally expected. Isospin breaking, nuclear physics, and higher order QCD effects seem unlikely explanations of the NuTeV discrepancy but need further study. The extracted g²L,R may also shift if analyzed using the most recent set of QED and electroweak radiative corrections."

Because of these doubts, the NuTeV determination of the Weinberg angle was not used in global fits of electroweak observables. In the meantime, a better understanding of the subtleties behind the composition of the target and its interactions with neutrinos was collected. Today I read in an interesting preprint titled "Reassessment of the NuTeV determination of the weak mixing angle" that the matter appears finally settled. The paper re-evaluates some of the ingredients in the calculation of the Weinberg angle produced by NuTeV, as well as the associated systematic uncertainties.

And what do the authors find? They find that the corrected NuTeV result is 0.2221+-0.0013+-0.0020. Note that the central value has moved very close to the theoretical prediction (0.2227), and that the systematic uncertainty has more than doubled with respect to the original NuTeV publication (from 0.0009 to 0.0020).
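As a quick sanity check, one can recompute the significance of the old and new results against the prediction, naively combining the statistical, systematic, and theory uncertainties in quadrature (a back-of-the-envelope estimate assuming independent Gaussian uncertainties):

```python
import math

def significance(measured, stat, syst, theory, theory_unc):
    """Discrepancy in standard deviations, combining uncertainties in quadrature."""
    total_unc = math.sqrt(stat**2 + syst**2 + theory_unc**2)
    return (measured - theory) / total_unc

# Original 2002 NuTeV result vs the Standard Model prediction
old_sig = significance(0.2277, 0.0013, 0.0009, 0.2227, 0.0004)
# Reassessed result from the preprint discussed above
new_sig = significance(0.2221, 0.0013, 0.0020, 0.2227, 0.0004)

print(f"old: {old_sig:+.1f} sigma, new: {new_sig:+.1f} sigma")  # old: +3.1, new: -0.2
```

The corrected central value thus sits well within one combined standard deviation of the prediction.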

The conclusion? The Standard Model is as healthy as ever; one more controversial discrepancy is reduced to perfect agreement with theory; and we also learn that sometimes 3-sigma effects do not arise from "statistical flukes", but rather from underestimated systematic uncertainties. On the other hand, new physics models see their allowed room shrinking further.