Physicists need large integrated luminosity to explore rare phenomena, and high energy to probe processes which may only "turn on" above a certain threshold. So given the x2 luminosity collected so far (and we will get to x5 by the end of the year!) and the x1.14 energy, the discovery potential of the 2012 data is already several times larger than that of the 2011 data.
It is not too hard to figure out why 8 TeV is better than 7: if there existed a new particle with a mass of 7.5 TeV, you would never be able to produce it in 7-TeV collisions... But unfortunately, neither could you with 8 TeV!
To understand that, you first need to reckon with the fact that while the protons collide at 8 TeV (or 7), the hard interaction that may give rise to a new particle or a new phenomenon never reaches that energy. That is because each proton contains many quarks and gluons (collectively called "partons"), and what produces the hard collision is not a pair of protons but a pair of partons; the protons are like soft bags containing hard things. The chance that the two colliding partons are ones carrying a large fraction of the total energy is therefore slim - indeed, the highest-energy collisions we have observed so far involve quarks and gluons whose total center-of-mass energy is in the 3-4 TeV range, and such collisions are quite rare.
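As a rough illustration of why hard parton-parton collisions are so rare, here is a toy Monte Carlo sketch in Python. Each parton's momentum fraction x is drawn from a crude, steeply falling toy shape - not a real parton distribution function - just to show that the effective parton-parton energy, sqrt(x1 x2) times the beam energy, is usually far below 8 TeV:

```python
import random
import math

random.seed(42)

SQRT_S_TEV = 8.0  # proton-proton center-of-mass energy

def sample_x():
    """Draw a toy momentum fraction for one parton.
    Real parton densities fall steeply with x; here we use a
    crude log-uniform toy shape on [1e-4, 1] for illustration."""
    return 10 ** random.uniform(-4, 0)

# Effective parton-parton collision energy: sqrt(x1 * x2) * sqrt(s)
energies = [math.sqrt(sample_x() * sample_x()) * SQRT_S_TEV
            for _ in range(100_000)]

typical = sorted(energies)[len(energies) // 2]  # median energy
hard = sum(e > 3.0 for e in energies)           # collisions above 3 TeV

print(f"median parton-parton energy: {typical:.3f} TeV")
print(f"collisions above 3 TeV: {hard} out of {len(energies)}")
```

With this toy shape the median parton-parton energy comes out well below 1 TeV, and only a small fraction of collisions exceed 3 TeV - the qualitative behavior described above, even though the real distribution is set by measured parton densities.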
So, fine. Say we want to produce a 3-TeV resonance. This is certainly possible with 8-TeV collisions, and less rare than with 7-TeV collisions. But it is still quite, quite rare - it happens only once in several trillion collisions, or even more rarely, depending on which particle you want it to be. Here is why we need more data: if the new particle has a "cross section" of 1 femtobarn, we need on average to produce one inverse femtobarn of proton-proton collisions to have a good chance of getting one of them, since the average number of events of a given process in a dataset is given by the formula
N = σ L
where σ is the cross section, and L is the integrated luminosity corresponding to that data.
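As a quick numerical illustration of the formula, take a hypothetical signal with a 1-femtobarn cross section and an assumed dataset of 20 inverse femtobarns (both numbers are illustrative, not measured values):

```python
# Expected number of events: N = sigma * L
# All numbers below are illustrative, not measured values.

sigma_fb = 1.0         # cross section of the hypothetical particle, in femtobarns
luminosity_ifb = 20.0  # integrated luminosity, in inverse femtobarns (assumed)

n_expected = sigma_fb * luminosity_ifb
print(f"expected signal events: {n_expected:.0f}")  # prints 20
```

Note how the units cancel: femtobarns times inverse femtobarns leaves a pure number of events.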
Now, one event is usually not enough to prove we have discovered something: backgrounds can always mimic the production of a signal, so we need to collect several events with the same characteristics to be able to prove that there is a new signal in the data.
If you will, the above may be taken to mean that a proton-proton machine is more limited by the amount of data it collects than by the total energy of its beams! This is not entirely correct, but it is not far from the truth, either.
To shed more light on this, we can contrast the LHC situation with the operation of the LEP II collider, which until 2000 accelerated electrons and positrons instead of protons, at up to 209 GeV of center-of-mass energy, in the same ring that now hosts the LHC. Since electrons and positrons are elementary particles, their collisions deliver all the energy of the projectiles to the hard interaction.
In the quest to produce the Higgs boson through the reaction e+e- --> ZH, that is, in association with a Z boson, CERN increased the center-of-mass energy of LEP II as much as possible, in steps (inserting new accelerating cavities in the ring where possible), without caring too much about the amount of data collected at each step. All that mattered was the energy, because the cross section of the sought process was not too small, and every collision delivered exactly the total energy of the machine. We know how the story ended: they could say the Higgs was heavier than 114.4 GeV because they did not see a significant excess of ZH candidates; had the Higgs been lighter, they would have seen an excess. If you add the minimum allowed mass of the escaped Higgs boson to the mass of the Z boson you get 114.4+91.2=205.6 GeV, close to the actual collision energy - some of the energy does not go into mass but into the momentum of the final-state products.
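The kinematic arithmetic behind that limit can be spelled out in a few lines, using only the numbers quoted above:

```python
# Kinematic reach for e+e- -> ZH at LEP II: the Higgs and Z masses
# must (roughly) sum to no more than the collision energy.

m_z = 91.2         # Z boson mass, in GeV
m_h_limit = 114.4  # LEP II lower limit on the Higgs mass, in GeV
sqrt_s = 209.0     # maximum LEP II center-of-mass energy, in GeV

threshold = m_z + m_h_limit
print(f"m_Z + m_H limit = {threshold:.1f} GeV")            # prints 205.6 GeV
print(f"margin below sqrt(s) = {sqrt_s - threshold:.1f} GeV")  # prints 3.4 GeV
```

The few GeV of margin is the energy that goes into the momentum of the Z and Higgs rather than into their masses.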
Protons, unlike electrons, are not elementary: they are made of quarks and gluons. So I hope you now understand why the LHC experiments have only just begun their investigation of the high-energy frontier: much more data will be needed before we can say we have exhausted the potential of the machine.
There is now a schedule of future LHC operations that looks ahead at least ten years - in the course of this time, CMS and ATLAS will be able to collect hundreds, or even thousands, of inverse femtobarns of data. What new things may appear when we get to analyze that much data, we can only wonder. And of course, the fixing of safety systems in the ring will finally allow the collider to run at 13 or 14 TeV. In the meantime, the experiments will need to be upgraded: partly to cope with the higher number of collisions per bunch crossing of high-luminosity running, which makes it harder to reconstruct the trajectory of every particle; and partly to replace systems worn out by high doses of radiation, or to introduce newer technologies in apparatus that, while still quite advanced and new-looking, will not stay that way for long.