Back in my office in Padova, I am looking back at last week's trip around the US and the two talks I delivered at SLAC (the Stanford Linear Accelerator Center) and at Northwestern University. 

The event at SLAC was an experimental seminar. Due to a clash with a "Higgs coupling" workshop that was taking place at the same time, it did not attract a very large audience. Still, it was quite nice to meet a few of the SLAC scientists there, and in particular to chat with Stan Brodsky, a well-known theorist whom I had met in Valparaiso earlier this year. I am also grateful to Brandon Eberly, my host at SLAC, who took care of welcoming me there and introducing the seminar.

It was the colloquium at Northwestern, however, that justified the whole trip. The chair of the Physics department at Northwestern is Michael Schmitt, a longtime colleague in CDF and CMS. Michael was my boss during my post-doc years at Harvard, when we put together the upgrade of the CMX muon system for CDF II. More recently he has also been a colleague of mine in the CMS Statistics Committee for four years; we share a keen interest in statistical issues in data analysis, so convincing him to invite me for a colloquium loosely based on that topic was quite easy.

In fact, Michael made sure that my time at Northwestern would be well spent. Together with Mayda Velasco, who is also a longtime colleague in CDF and CMS, he organized a tight schedule of meetings with a few faculty members of the Physics and Astronomy department there, as well as lunch and dinner with others. Since this was my first "colloquium" at a US university, I was not accustomed to this ritual, but I soon found I was enjoying it a lot.

So I met in succession dark matter searchers (Eric Dahl, Tali Figueroa-Feliciano), HEP theorist Ian Low, an exoplanet hunter (Ben Nelson), and a CMS colleague (Kristian Hahn), as well as a few of the post-doctoral scientists who work with Michael and Ian. In all cases we had very interesting conversations about their research and selected topics from my colloquium. 

Eric explained to me how the modern bubble chamber experiment they have built works: a 60 kg superheated freon vessel installed at SNOLAB, where dark matter interactions can give rise to single freon bubbles that get recorded by imaging devices. The instrument has very low backgrounds, but these are hard to model, so we discussed in some detail how to keep them under control and study them. We ended up discussing data blinding techniques, their pros and cons, and related statistical issues.

Tali described to me his work with the CDMS dark matter experiment, and again we ended up discussing statistical issues connected with dark matter searches. As he was not going to attend my colloquium due to a previous commitment, I gave him a very short summary of the topic I would be addressing, and we discussed the matter in some detail.

Ian Low took me to lunch, together with two post-doctoral scientists, at the Business school restaurant of NWU, which is located in a very nice building overlooking Lake Michigan. The lunch discussion ranged over multivariate analysis techniques and their application to new physics searches, as well as over the issue of what a large experiment should do when it observes an anomalous signal that may potentially be the first hint of new physics, or may instead prove to be a mere fluctuation of the data.

In the afternoon I was pleased to pay a visit to Ben, who hunts exoplanets through extremely precise measurements of star motions performed with Doppler techniques. His focus is on fitting the data in search of multi-planet systems, which are very complex to model. We ended up discussing the pros and cons of Bayesian techniques, which he uses in his data analysis, and the Jeffreys-Lindley paradox, a topic that would be part of my colloquium later on.

Finally I met Kristian Hahn, who searches CMS data for dark matter signals produced in association with top quarks and other signatures. As Kristian had also been a colleague in CDF before coming to CMS, we reminisced about those old times, besides discussing details of our current experiment's searches.

And then there was the colloquium, which despite the time (4PM on Friday afternoon!) attracted an audience of about 100 faculty members, researchers, and graduate students. I initially thought they were mostly interested in the coffee and cookies that were offered before the start of my talk, but then I was pleased to see that they stuck to their chairs until the end. 

Michael's introduction of my talk was memorable - he knows me well and he presented me in a very positive light as an expert in statistical techniques, also stressing my outreach efforts. He did not fail to mention the book I have written, which is loosely related to the material I presented in the colloquium.

The colloquium was followed by a very pleasant and tasty dinner in downtown Evanston, offered by NWU. I could not have imagined a better organization and welcome in Northwestern, and I am grateful to Mayda and Michael for it. 


Contents of the colloquium

The title of the colloquium was "Extraordinary Claims: the 0.000029% Solution, or Anomalies in Collider Data". In it I gave a short introduction to particle physics searches for new physics, as well as to hypothesis testing, and then focused on the history of the five-sigma criterion in particle physics. This led to a discussion of the pros and cons of the particular discovery threshold that fundamental physicists have settled on, and a recollection of several anomalous effects found in CDF data in the nineties; the 5-sigma criterion did not always work well in those cases. I then focused on the rationale for the criterion (namely, the protection from trials factors and from unknown or ill-modeled systematic uncertainties), and pointed out the shortcomings of a fixed discovery threshold. 
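Incidentally, the 0.000029% in the title is just the one-sided tail probability of a Gaussian distribution beyond five standard deviations. A two-line Python check (mine, not part of the talk) makes the conversion explicit:

```python
from math import erfc, sqrt

def p_value(z):
    """One-sided tail probability of a standard normal beyond z sigma."""
    return 0.5 * erfc(z / sqrt(2))

p5 = p_value(5.0)
print(f"5 sigma -> p = {p5:.2e}, i.e. {100 * p5:.6f}%")  # about 2.87e-07, i.e. 0.000029%
```

The same function run in reverse direction of course defines the five-sigma threshold: an observation qualifies as a "discovery" only if, under the null hypothesis, data at least as extreme would occur with probability below that tiny number.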

I then discussed the Bayesian perspective and the approach that is usually taken in that framework, with Bayes factors guiding the way toward the selection of one or the other hypothesis. A discussion of the Jeffreys-Lindley paradox followed, to show how, in the very specific but quite common case of a test of a simple null hypothesis versus a composite alternative, the Bayesian calculation fails to deliver what it promises. Indeed, in those conditions the dependence on the prior belief of the experimenter is not washed away even by extremely precise data: the conclusions that different physicists may draw from the analysis of a large body of evidence remain dependent on their preconceptions.
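The mechanics of the paradox can be illustrated with a minimal numerical sketch (mine, not from the talk). Assume a Gaussian measurement of a mean mu with known unit variance; test the simple null mu = 0 against a composite alternative where mu has a Gaussian prior of width tau; and hold the observed sample mean at exactly four standard errors from zero while the sample size n grows. The p-value stays fixed at a few times 10^-5, yet the Bayes factor B01 eventually swings in favor of the null:

```python
from math import erfc, exp, pi, sqrt

def norm_pdf(x, var):
    """Density of a zero-mean Gaussian with variance var, evaluated at x."""
    return exp(-x * x / (2.0 * var)) / sqrt(2.0 * pi * var)

def p_and_bayes_factor(n, z=4.0, sigma=1.0, tau=1.0):
    """Two-sided p-value and Bayes factor B01 for H0: mu = 0 versus
    H1: mu ~ N(0, tau^2), with the observed mean held at z standard
    errors from zero for sample size n."""
    xbar = z * sigma / sqrt(n)        # always a "z-sigma" effect
    p = erfc(z / sqrt(2.0))           # does not depend on n
    # Marginal density of xbar under H0 divided by that under H1:
    b01 = norm_pdf(xbar, sigma**2 / n) / norm_pdf(xbar, tau**2 + sigma**2 / n)
    return p, b01

for n in (10**2, 10**4, 10**8):
    p, b01 = p_and_bayes_factor(n)
    print(f"n = {n:>9}: p = {p:.2e}, B01 = {b01:.3g}")
```

For small n the Bayes factor agrees with the tiny p-value in disfavoring the null, but as n grows it crosses one and starts favoring mu = 0 while the p-value never budges; and the crossover point depends on the prior width tau, which is exactly the residual subjectivity mentioned above.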

I concluded by discussing a table originally cooked up by my friend Louis Lyons, and then slightly modified by me to reflect my personal bias. For each of several interesting past and current searches for new phenomena, the table considers not only the trials factor and the presence of hard-to-model systematic uncertainties, but also the degree of surprise of the effect sought and the importance of a discovery for the progress of science. I think such a "global view" and a comparison of these factors for different searches might lead to agreeing on more meaningful discovery levels.

Just for fun, I attach the table below. Note that the significance levels assigned in the last column reflect my personal bias and you might come to a different conclusion upon examining the various inputs...