Closing the Detection Loophole Over a Meter
JQI/UMD researchers have increased by five orders of magnitude the distance over which a highly stringent test of a key quantum-mechanical principle can be successfully conducted. In doing so, Chris Monroe and colleagues* validated a technique that could eventually lead to final resolution of a 70-year-old debate over the nature of physical reality that pitted Albert Einstein against Niels Bohr.
Quantum mechanics allows a condition called “entanglement”: Under certain circumstances, the states of two objects are so utterly interdependent that if a measurement is made on one, the state of the other is known -- even if the objects are separated by vast distances.
Einstein believed this description of reality was incomplete, and that somehow each particle carried information along with it in a form not yet discovered. This argument remained largely philosophical until 1964, when physicist John Bell showed that, if Einstein was right, measurements made on pairs of particles -- such as the polarization of photons or the spin of electrons -- would have to satisfy certain mathematical conditions now known as the Bell inequalities. Suddenly, it was possible to test Einstein’s hypothesis, and experiments began to confirm the predictions of quantum mechanics.
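For concreteness, the most commonly tested form of Bell's conditions is the CHSH inequality: a particular combination S of correlations between measurement settings must satisfy |S| ≤ 2 in any local hidden-variable theory of the kind Einstein envisioned, while quantum mechanics predicts values up to 2√2 ≈ 2.83. The short sketch below illustrates this with the textbook singlet-state correlation E(a, b) = −cos(a − b) and standard analyzer angles; these values are standard quantum-mechanics results, not details taken from this article.

```python
import math

def chsh(E, a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

def E_singlet(a, b):
    """Quantum correlation for spin measurements on a singlet pair."""
    return -math.cos(a - b)

# Standard settings that maximize the quantum value:
# a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4.
S = chsh(E_singlet, 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)

print(round(abs(S), 3))  # 2.828, i.e. 2*sqrt(2)
print(abs(S) > 2)        # True: exceeds the local-hidden-variable bound
```

Any experiment that measures |S| > 2 therefore rules out the local hidden-variable picture -- provided the loopholes described next are closed.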
However, the experimental results were subject to two kinds of “loopholes” that prevented the findings from being conclusive. One was the “locality loophole”: unless measurements were made so fast that no information could have traveled between the entangled objects during the experiment, the results were open to doubt. Most observers regard that issue as settled, owing to highly sophisticated experiments over the past decades.
That leaves the “detection” loophole: if the efficiency of detection of the entangled quantum states was not high enough, or if only a selected sample of all events was recorded, it was always possible that the data represented a skewed subset of results, and not necessarily the actual physical facts. The first experiment to close this loophole was done in 2001 with two beryllium ions spaced 3 micrometers apart.
Monroe’s group detected entanglement between two trapped ytterbium ions in separate enclosures a full meter apart, heralded by the entanglement of two photons, one emitted from each ion after the ions were excited by laser light. By the rules of quantum mechanics, if it is impossible to tell which ion each photon came from, the photons are entangled -- and so, therefore, are the ions that emitted them.
Every time the detectors registered an entangled pair of photons, the experiment immediately took a reading on the state of the ions (by rotation with a microwave pulse) to verify their entanglement, eliminating the possibility of selective sampling problems. The result, after accounting for sources of error, was 81% fidelity -- more than enough to firmly close the detection loophole.
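A back-of-the-envelope check shows why 81% suffices. Assuming a Werner-state noise model (a standard simplification; the article does not specify the actual noise), a state with fidelity F to an ideal Bell pair has correlation visibility V = (4F − 1)/3, and its CHSH value is S = 2√2 · V. Violating the classical bound S > 2 then requires F above roughly 78%:

```python
import math

def chsh_from_fidelity(F):
    """CHSH value for a Werner state with fidelity F to a Bell state.

    Visibility V = (4F - 1)/3 scales the ideal quantum value 2*sqrt(2).
    (Werner-state assumption; real experimental noise may differ.)
    """
    V = (4 * F - 1) / 3
    return 2 * math.sqrt(2) * V

# Local hidden-variable theories require S <= 2.
print(chsh_from_fidelity(0.81) > 2)   # True: 81% fidelity violates the bound
print(chsh_from_fidelity(0.75) > 2)   # False: below the ~78% threshold
```

Under this model, 81% fidelity gives S ≈ 2.11, comfortably above the classical limit of 2.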
With an even larger separation between the ions, or faster detection, the group concluded, “the technique demonstrated here may ultimately allow for a loophole-free Bell inequality test.”

* “Bell Inequality Violation with Two Remote Atomic Qubits,” D. N. Matsukevich, P. Maunz, D. L. Moehring, S. Olmschenk and C. Monroe, Phys. Rev. Lett. 100, 150404 (2008).