Bell's Inequality And The Speed Of Light: Quasar Findings Might Close The 'Free Will' Loophole
    By News Staff | February 20th 2014
    In 1964, the physicist John Bell tackled the question of locality and the disparity between classical physics and quantum mechanics, noting that if the universe obeyed classical, locally causal physics, the measurement of one entangled particle should not affect the measurement of the other.

    Local theories say there is a limit to how correlated two particles can be. Bell devised a mathematical inequality that any local theory must satisfy, and showed that quantum mechanics predicts scenarios that violate it. Since then, physicists have tested Bell’s theorem by measuring the properties of entangled quantum particles in the laboratory. Essentially all of these experiments have shown that such particles are correlated more strongly than would be expected under the laws of classical physics — findings that support quantum mechanics.
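Bell's bound can be made concrete with the CHSH form of the inequality: any local hidden-variable theory requires a certain combination S of four measured correlations to satisfy |S| ≤ 2, while quantum mechanics predicts values up to 2√2. A minimal sketch, using the standard textbook detector angles for polarization-entangled photons (these specific angles are illustrative, not taken from the article):

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for polarization-entangled
    # photons measured at polarizer angles a and b: cos(2(a - b))
    return math.cos(2 * (a - b))

deg = math.pi / 180
a, a2 = 0 * deg, 45 * deg        # detector A's two settings
b, b2 = 22.5 * deg, 67.5 * deg   # detector B's two settings

# CHSH combination of the four correlations
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(round(S, 4))   # 2.8284, i.e. 2*sqrt(2)
print(S > 2)         # True: exceeds the classical (local) bound of 2
```

Any local hidden-variable account of the same four measurements is mathematically forced to keep |S| at or below 2, which is why a measured value near 2√2 rules out that whole class of explanations.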

    But there were several major loopholes in experimental tests of Bell’s theorem. While the outcomes of experiments may appear to support the predictions of quantum mechanics, they may actually reflect unknown “hidden variables” that give the illusion of a quantum outcome but can still be explained in classical terms.

    Two of the major loopholes have since been closed, but a third remains; physicists refer to it as “setting independence,” or more provocatively, “free will.”

    This loophole proposes that a particle detector’s settings may “conspire” with events in the shared causal past of the detectors themselves to determine which properties of the particle to measure — a scenario that, however far-fetched, implies that a physicist running the experiment does not have complete free will in choosing each detector’s setting.

    Such a scenario would result in biased measurements, suggesting that two particles are correlated more than they actually are, and giving more weight to quantum mechanics than classical physics.

    In a recent paper, researchers propose an experiment that may close the last major loophole in tests of Bell’s inequality. If the 50-year-old inequality is violated in such an experiment, it would mean that our universe is based not on the textbook laws of classical physics, but on the less tangible probabilities of quantum mechanics.

    Such a quantum view would allow for seemingly counterintuitive phenomena such as entanglement, in which the measurement of one particle instantly affects another, even if those entangled particles are at opposite ends of the universe. Among other things, entanglement — a quantum feature Albert Einstein skeptically referred to as “spooky action at a distance”— seems to suggest that entangled particles can affect each other instantly, faster than the speed of light.

    Yes, you read that right. Faster than the speed of light. And not just mathematical sleight of hand either.

    “It sounds creepy, but people realized that’s a logical possibility that hasn’t been closed yet,” says MIT’s David Kaiser, the Germeshausen Professor of the History of Science and senior lecturer in the Department of Physics. “Before we make the leap to say the equations of quantum theory tell us the world is inescapably crazy and bizarre, have we closed every conceivable logical loophole, even if they may not seem plausible in the world we know today?”

    Now Kaiser, along with MIT postdoc Andrew Friedman and Jason Gallicchio of the University of Chicago, has proposed an experiment to close this third loophole by determining a particle detector’s settings using some of the oldest light in the universe: distant quasars, or galactic nuclei, which formed billions of years ago.

    Artistic rendering of ULAS J1120+0641, a very distant quasar. Image: ESO/M. Kornmesser

    The idea, essentially, is that if two quasars on opposite sides of the sky are sufficiently distant from each other, they would have been out of causal contact since the Big Bang some 14 billion years ago, with no possible means of any third party communicating with both of them since the beginning of the universe — an ideal scenario for determining each particle detector’s settings.

    As Kaiser explains it, an experiment would go something like this: A laboratory setup would consist of a particle generator, such as a radioactive atom that spits out pairs of entangled particles. One detector measures a property of particle A, while another detector does the same for particle B. A split second after the particles are generated, but just before the detectors are set, scientists would use telescopic observations of distant quasars to determine which properties each detector will measure of a respective particle. In other words, quasar A determines the settings to detect particle A, and quasar B sets the detector for particle B.
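The logic of the setup can be illustrated with a toy simulation (this is an illustrative sketch, not the authors' actual experimental design): two independent random bits stand in for the bits that would be extracted from the light of quasar A and quasar B, each bit selects one of two detector settings, and the measurement outcomes follow the quantum prediction. Even with settings chosen by sources that share no history, the CHSH combination of correlations still lands near 2√2, beyond the classical bound of 2:

```python
import math
import random

random.seed(1)
deg = math.pi / 180
settings_A = [0 * deg, 45 * deg]      # detector A's two possible settings
settings_B = [22.5 * deg, 67.5 * deg] # detector B's two possible settings

def corr(a, b):
    # Quantum prediction for polarization-entangled photons
    return math.cos(2 * (a - b))

counts = {}
N = 200_000
for _ in range(N):
    i = random.getrandbits(1)  # stand-in for a bit from quasar A's light
    j = random.getrandbits(1)  # stand-in for a bit from quasar B's light
    a, b = settings_A[i], settings_B[j]
    # Sample outcomes A, B = +/-1 so that the average of A*B equals corr(a, b)
    A = 1 if random.random() < 0.5 else -1
    B = A if random.random() < (1 + corr(a, b)) / 2 else -A
    n, s = counts.get((i, j), (0, 0))
    counts[(i, j)] = (n + 1, s + A * B)

# Estimated correlation for each of the four setting combinations
E = {k: s / n for k, (n, s) in counts.items()}
S = E[(0, 0)] - E[(0, 1)] + E[(1, 0)] + E[(1, 1)]
print(round(S, 2))  # typically close to 2*sqrt(2) = 2.83, beyond the classical bound of 2
```

In the real proposal the two bits would come from telescopic observations of quasars on opposite sides of the sky, so that no common cause since the Big Bang could have influenced both setting choices.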

    The researchers reason that since each detector’s setting is determined by sources that have had no communication or shared history since the beginning of the universe, it would be virtually impossible for these detectors to “conspire” with anything in their shared past to give a biased measurement; the experimental setup could therefore close the “free will” loophole. If, after multiple measurements with this experimental setup, scientists found that the measurements of the particles were correlated more than predicted by the laws of classical physics, Kaiser says, then the universe as we see it must be based instead on quantum mechanics.

    Credit: arXiv:1310.3288

    “I think it’s fair to say this [loophole] is the final frontier, logically speaking, that stands between this enormously impressive accumulated experimental evidence and the interpretation of that evidence saying the world is governed by quantum mechanics,” Kaiser says.

    Now that the researchers have put forth an experimental approach, they hope that others will perform actual experiments, using observations of distant quasars.

    “At first, we didn’t know if our setup would require constellations of futuristic space satellites, or 1,000-meter telescopes on the dark side of the moon,” Friedman says. “So we were naturally delighted when we discovered, much to our surprise, that our experiment was both feasible in the real world with present technology, and interesting enough to our experimentalist collaborators who actually want to make it happen in the next few years.”

    Adds Kaiser, “We’ve said, ‘Let’s go for broke — let’s use the history of the cosmos since the Big Bang, darn it.’ And it is very exciting that it’s actually feasible.”

    Preprint: Jason Gallicchio, Andrew S. Friedman, David I. Kaiser, 'Testing Bell's Inequality with Cosmic Photons: Closing the Settings-Independence Loophole',  arXiv:1310.3288. Source: Jennifer Chu at MIT


    I have a question that I feel is relevant to this topic and might be on the minds of other readers. You'll probably recognize the question and have a simple answer immediately - though I never seem to encounter it in discussions where it would seem to belong.

    I don't want to present this as if I'm a crackpot with a pet theory (it's certainly not remotely original), so excuse me if I sound that way when I describe the concept for readers...

    It's basically very simple, and it concerns one of the mysteries behind quantum physics: entanglement, locality, Schroedinger's cat, EPR, etc. One 'model' that would seem to provide a simple escape from any of the extraordinarily complex explanations would simply be what I believe is described as a super-deterministic universe (if I understand that term). Is this idea ruled out by scientific evidence, or is it simply too unacceptable to our human tastes?

    What I mean, for those who haven't considered the idea (or perhaps so someone can tell me I'm simply awkwardly restating a common concept that was perhaps long ago rejected by science) is this...

    Suppose that our universe, going back to some point such as the big bang, was one of infinitely many universes (in the multiverse sense, I suppose), BUT that from the beginning of this universe EVERYTHING was driven by simple cause and effect from those initial conditions. Thus there would be no need to explain entanglement or spooky action at a distance, and no superposition. In fact, all those things would be evidence of a super-deterministic universe, evidence against some people's overly broad ideas of 'free will', and evidence against any true randomness at all within an existing universe, at least since some point in the past. Those circumstances would appear just like what we see (proponents might reasonably suggest).

    It would be an enormously unpopular idea but remarkably simple and seemingly plausible, requiring only the acceptance of a few basic ideas: (1) a truly infinite history with truly infinite instances of what we think of as universes; (2) universes' initial conditions (only) being the result of some chaotic 'random' process, after which everything from start to end is entirely driven by cause and effect; and (3) the weak anthropic principle. I feel these are intellectually easy to accept but highly provocative to human sensibilities, much like evolution.

    Doubters could say it was unprovable, but proponents might argue that non-locality IS the evidence and perhaps proof. And proponents might say that concepts like spontaneous randomness and sentient free will are the more outrageous ideas that should not be presumed without proof. In fact, we see cause and effect around us everywhere and accept it universally on a macro scale, and we also see that complex interactions of large numbers of things even on a macro scale (consider the motions of stars in a galaxy) appear random due to incalculable interacting physical processes. Without spooky randomness and spooky choice, there'd be no need to struggle with spooky non-locality. ...or so proponents might suggest.

    How do experts regard this viewpoint? Is it naive? Long disproven? Inconsistent with evidence? Or simply considered philosophical? Has it been investigated and rejected? Or brushed aside as scientifically unsatisfying? Or is it simply ignorant? It seems that someone respectable would be a proponent if it weren't easily refutable.