*[Above, the production rate of different final states measured by CMS is compared to SM theory predictions. Each column is a different physics process, the dots are measurements at different center-of-mass energies, and the grey bands are the theory predictions.]*

What am I talking about? I am talking about the fact that the rate of production of multi-particle final states, like pairs of W bosons, W plus N jet events, or ttH final states, while extremely small, is large enough that we can discern it over a background we usually neglect to talk about. I am talking about what is generically called "pileup", an annoying but manageable background to all our physics measurements.

Pileup occurs when we collide bunches of particles against each other. At the LHC, for instance, we try to squeeze as many protons as we can into narrow packets, because the larger the number of protons, the higher the interaction rate, and a high interaction rate allows us to study very rare processes. The fact that in a single crossing of two proton bunches at the center of ATLAS or CMS several proton-proton interactions take place simultaneously is both good and bad: good because it increases the total number of collisions we produce, and bad because it increases the complexity of the snapshots we take of the produced debris.

Let us talk numbers. The design luminosity of the LHC is 10^34 cm^-2 s^-1, a number which, multiplied by a cross section expressed in cm^2, yields an interaction rate per second. Since bunch crossings occur at a rate of 40 million per second, a single bunch crossing actually delivers a luminosity of "only" 2.5x10^26 cm^-2. What does that number correspond to? In order to know the average number of collisions produced, we need to know the cross section of the considered phenomena.

For instance, the "total inelastic cross section" for two colliding protons is about 10^-25 cm^2. By multiplying this by the per-bunch-crossing luminosity computed above, you get 25 collisions per bunch crossing. Twenty-five is the average number of inelastic collisions taking place at the same time (ok, well, within less than a nanosecond, to be precise).
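The arithmetic above takes only a few lines to check; here is a quick sketch, with the rounded values used in the text hard-coded as assumptions:

```python
# Per-bunch-crossing luminosity and average pileup at LHC design conditions.
# Numbers are the rounded values quoted in the text, not precise measurements.

design_lumi = 1e34       # cm^-2 s^-1, LHC design instantaneous luminosity
bunch_rate = 4e7         # bunch crossings per second (40 million)
sigma_inelastic = 1e-25  # cm^2, approximate pp total inelastic cross section

lumi_per_crossing = design_lumi / bunch_rate       # luminosity delivered per crossing, cm^-2
mean_pileup = sigma_inelastic * lumi_per_crossing  # mean inelastic collisions per crossing

print(f"luminosity per crossing: {lumi_per_crossing:.3g} cm^-2")  # ~2.5e26
print(f"mean pileup: {mean_pileup:.3g}")                          # ~25
```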

The rare processes we are interested in are much less frequent. For instance, the production of a top quark pair occurs with a cross section of 8x10^-34 cm^2, hence in a single bunch crossing the average number of interactions producing a top quark pair is 8x10^-34 x 2.5x10^26 = 2x10^-7. Only once every five million bunch crossings does such an event take place!

Or let us take ttH production, a process we are quite interested in measuring, as it allows us to size up how strongly the Higgs particle couples to top quarks. According to the Standard Model, ttH production should occur with a cross section of 5x10^-37 cm^2, over a thousand times smaller than that of top pair production. This means that the expected rate in a bunch crossing is 1.25x10^-10.

Finally, let us consider inclusive Higgs production, the production of a Higgs boson by itself. This has a cross section of 5x10^-35 cm^2, hence an expected rate per bunch crossing of 1.25x10^-8.
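The same multiplication gives the per-crossing rates of all three rare processes just discussed; a sketch, using the approximate cross sections quoted in the text:

```python
# Per-crossing interaction rates for the three processes discussed in the text.
lumi_per_crossing = 1e34 / 4e7   # cm^-2 per bunch crossing, as computed earlier

cross_sections = {   # cm^2, approximate SM values quoted in the text
    "ttbar": 8e-34,
    "ttH":   5e-37,
    "H":     5e-35,
}

rates = {name: sigma * lumi_per_crossing for name, sigma in cross_sections.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.3g} interactions per bunch crossing")
# ttbar ~2e-7, ttH ~1.25e-10, H ~1.25e-8
```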

What do we do with those numbers? We can prove one interesting fact. If we collected a sample of ttH events, we would in principle have to consider the chance that the top pair was produced by one collision and the Higgs by another one occurring during the same bunch crossing! Fortunately, the numbers make this a negligible occurrence. We just need some trivial arithmetic to get the number of bunch crossings where a tt pair and a Higgs boson appear independently.

The rates we are talking about follow a Poisson distribution. The number we want is the probability that at least one tt pair is produced times the probability that at least one Higgs boson is produced (the two interactions being independent). The former is 1-exp(-p(tt)), where p(tt)=2x10^-7 as we computed earlier; the latter is 1-exp(-p(H)), where p(H)=1.25x10^-8.

The result is P(tt&H) = 2.5x10^-15. This number is tiny, but what's really important is that it is also tiny with respect to 1.25x10^-10, which is the rate of "true" ttH production from a single hard interaction. In other words, if we collect a sample of ttH candidate events in LHC collisions at 13 TeV with the machine running at 10^34 cm^-2 s^-1, the fraction of events where the top quark pair and the Higgs boson are really coming from different collisions is only 2x10^-5, a negligible number.
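This Poisson bookkeeping is easy to verify numerically; a minimal sketch (using expm1 rather than exp to avoid rounding loss with such tiny exponents):

```python
import math

p_tt = 2e-7            # mean tt-pair interactions per bunch crossing
p_h = 1.25e-8          # mean single-Higgs interactions per bunch crossing
p_tth_true = 1.25e-10  # mean genuine ttH interactions per bunch crossing

# P(at least one) = 1 - exp(-p); -expm1(-p) computes it without precision loss
p_coincidence = -math.expm1(-p_tt) * -math.expm1(-p_h)
fraction = p_coincidence / p_tth_true

print(f"P(tt & H pileup coincidence): {p_coincidence:.3g}")   # ~2.5e-15
print(f"fraction of ttH candidates:  {fraction:.3g}")         # ~2e-5
```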

I should also mention that experimentally there are many ways to identify the exact point in space where each collision takes place. The bunch crossings spread collision points along the beam line over about 10 centimeters, while we can identify the origin of the particles coming out of each interaction with a precision of a few hundred microns. In other words, we have ways to tell apart the "pileup" tt & H production from true ttH production, by looking at the point of origin of the particles created in the decay of the three produced bodies.

Now, however, imagine we had built a collider that brought protons to collide in a single point of space at a much higher rate, say with a luminosity of 10^40 cm^-2 s^-1, again with 40 million bunch crossings per second. Under those conditions the average number of ttH events per bunch crossing would be a million times higher than previously calculated, hence 1.25x10^-4. And what about the pileup tt&H? That would benefit a lot from the higher number of collisions per bunch crossing: we would get a rate of 2.5x10^-3, twenty times higher than the true process! Under those conditions, a candidate ttH event would be twenty times more likely to actually be the superposition of two independent collisions!
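The high-luminosity scenario works the same way; a sketch, noting that the factor of twenty above uses the linear approximation p(tt) x p(H), while the exact Poisson product comes out a bit lower (around eighteen):

```python
import math

# Hypothetical collider: 10^40 cm^-2 s^-1, still 40 million crossings per second.
lumi_per_crossing = 1e40 / 4e7   # cm^-2 per bunch crossing

p_tt = 8e-34 * lumi_per_crossing    # 0.2 tt pairs per crossing
p_h = 5e-35 * lumi_per_crossing     # 0.0125 Higgs bosons per crossing
p_tth = 5e-37 * lumi_per_crossing   # 1.25e-4 genuine ttH per crossing

approx = p_tt * p_h                               # linear approximation, 2.5e-3
exact = -math.expm1(-p_tt) * -math.expm1(-p_h)    # exact Poisson coincidence

print(f"pileup/true ratio (approx): {approx / p_tth:.1f}")   # 20.0
print(f"pileup/true ratio (exact):  {exact / p_tth:.1f}")    # ~18
```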

The bottom line is that these machines allow us to study a wealth of physical processes thanks not only to sheer power, but also, and I would say mostly, to careful design choices. Of course nobody could build a collider with a luminosity of 10^40 cm^-2 s^-1 these days, but on the other hand the number of bunches in the beams has been chosen wisely enough that we can discern everything we need in the messy collisions: they are messy, but not messy enough to cause unsolvable experimental problems!

However, in the LHC case, if we were to consider a dataset of 10^15 bunch crossings, the statistical significance of the ttH signal would be approximately S/sqrt(S+B) = 125000/sqrt(125000 + 2.5), or about 350;

in the hypothetical collider delivering 10^40 cm^-2 s^-1, on the other hand, the same dataset of 10^15 bunch crossings would yield a ttH statistical significance of approximately S/sqrt(S+B) = (1.25x10^11)/sqrt(1.25x10^11 + 2.5x10^12), or about 77000.
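These back-of-the-envelope significances follow directly from the per-crossing rates derived earlier; a sketch of the naive S/sqrt(S+B) estimate:

```python
import math

n_crossings = 1e15   # size of the hypothetical dataset

def significance(signal_rate, background_rate, n):
    """Naive S/sqrt(S+B) significance from per-crossing rates and n crossings."""
    s = signal_rate * n
    b = background_rate * n
    return s / math.sqrt(s + b)

z_lhc = significance(1.25e-10, 2.5e-15, n_crossings)  # LHC design luminosity
z_hi = significance(1.25e-4, 2.5e-3, n_crossings)     # hypothetical 10^40 machine

print(f"LHC design:   {z_lhc:.0f}")   # ~354
print(f"hypothetical: {z_hi:.0f}")    # ~77000
```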

So you would definitely get a huge amount of additional background, but I think the increase in signal would more than make up for it, unless some large systematic effect were to cancel out the gain in statistical significance.