In Bell test experiments, there may be problems of experimental design or set-up that affect the validity of the experimental findings. These problems are often referred to as "loopholes". See the article on Bell's theorem for the theoretical background to these experimental efforts (see also J.S. Bell). The purpose of the experiments is to test whether nature is best described by a local hidden variable theory or by quantum mechanics, which predicts entanglement.
The "detection efficiency", or "fair sampling", problem is the most prevalent loophole in optical experiments. Another loophole, which has more often been addressed, is that of communication, i.e. locality. There is also the "disjoint measurement" loophole, in which the correlations entering an inequality are obtained from multiple separate samples, as opposed to "joint measurement", where a single sample yields all the correlations used in the inequality. To date, no test has simultaneously closed all loopholes.
Ronald Hanson of Delft University of Technology claims to have performed the first Bell experiment that closes both the detection and the communication loopholes.^{[1]} (This was not an optical experiment in the sense discussed below; the entangled degrees of freedom were electron spins rather than photon polarizations.) Nevertheless, correlations of classical optical fields also violate Bell's inequality.^{[2]}
In some experiments there may be additional defects that make "local realist" explanations of Bell test violations possible;^{[3]}^{[4]} these are briefly described below.
Many modern experiments are directed at detecting quantum entanglement rather than ruling out local hidden variable theories, and these tasks are different since the former accepts quantum mechanics at the outset (no entanglement without quantum mechanics). This is regularly done using Bell's theorem, but in this situation the theorem is used as an entanglement witness, a dividing line between entangled quantum states and separable quantum states, and is as such not as sensitive to the problems described here. In October 2015, scientists from the Kavli Institute of Nanoscience reported that the quantum entanglement phenomenon is strongly supported based on a "loophole-free Bell test" study.^{[5]}^{[6]}
In Bell test experiments, one problem is that detection efficiency may be less than 100%, and this is always the case in optical experiments. This problem was first noted by Pearle in 1970,^{[7]} and Clauser and Horne (1974) devised another result intended to take care of it. Some results were also obtained in the 1980s, but the subject has undergone significant research in recent years. The many experiments affected by this problem deal with it, without exception, by using the "fair sampling" assumption (see below).
This loophole changes the inequalities to be used; for example the CHSH inequality:

    E(a,b) − E(a,b′) + E(a′,b) + E(a′,b′) ≤ 2

is changed. When data from an experiment is used in the inequality, one needs to condition on a "coincidence" having occurred, i.e. on a detection in both wings of the experiment. This changes the inequality into

    E(a,b) − E(a,b′) + E(a′,b) + E(a′,b′) ≤ 4/\eta − 2.
In this formula, \eta denotes the efficiency of the experiment, formally the minimum probability of a coincidence given a detection on one side.^{[8]}^{[9]} In quantum mechanics, the left-hand side reaches 2\sqrt{2}, which is greater than two, but for a non-100% efficiency the latter formula has a larger right-hand side. At low efficiency (below 2(\sqrt{2}-1)≈83%), the inequality is no longer violated.
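The ≈83% threshold can be checked numerically. The standard efficiency-adjusted CHSH bound is 4/\eta − 2;^{[8]}^{[9]} the sketch below (function and variable names are ours, chosen for illustration) finds the efficiency at which this bound meets the quantum-mechanical maximum 2\sqrt{2}:

```python
import math

def chsh_bound(eta):
    """Right-hand side of the efficiency-adjusted CHSH inequality."""
    return 4.0 / eta - 2.0

# Quantum mechanics reaches 2*sqrt(2) on the left-hand side (Tsirelson's bound).
tsirelson = 2 * math.sqrt(2)

# Critical efficiency: solving 4/eta - 2 = 2*sqrt(2) gives eta = 2*(sqrt(2) - 1).
eta_crit = 2 * (math.sqrt(2) - 1)

assert math.isclose(chsh_bound(eta_crit), tsirelson)
print(f"critical efficiency = {eta_crit:.4f}")  # 0.8284, i.e. about 83%
```

At full efficiency (\eta = 1) the bound reduces to the familiar value 2, and it grows as the efficiency drops.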
All optical experiments are affected by this problem, having typical efficiencies around 5-30%. Several non-optical systems such as trapped ions,^{[10]} superconducting qubits^{[11]} and NV centers^{[12]} have been able to bypass the detection loophole. Unfortunately, they are all still vulnerable to the communication loophole.
There are tests that are not sensitive to this problem, such as the Clauser–Horne test, but these behave like the second of the two inequalities above: they cannot be violated unless the efficiency exceeds a certain bound. For example, in the Clauser–Horne test the bound is ⅔≈67% (Eberhard, 1993; Larsson, 2000).
Usually, the fair sampling assumption (alternatively, the "no-enhancement assumption") is used in regard to this loophole. It states that the sample of detected pairs is representative of the pairs emitted, in which case the right-hand side in the equation above is reduced to 2, irrespective of the efficiency. This comprises a third postulate necessary for violation in low-efficiency experiments, in addition to the (two) postulates of local realism. There is no way to test experimentally whether a given experiment does fair sampling, as the number of emitted but undetected pairs is by definition unknown.
In many experiments the electronics are such that simultaneous + and – counts from both outputs of a polariser can never occur, only one or the other being recorded. Under quantum mechanics, they will not occur anyway, but under a wave theory the suppression of these counts will cause even the basic realist prediction to yield unfair sampling. However, the effect is negligible if the detection efficiencies are low.
The Bell inequality is motivated by the absence of communication between the two measurement sites. In experiments, this is usually ensured by separating the two sites and making the measurement duration shorter than the time any light-speed signal would need to travel from one site to the other, or indeed to the source. In one of Alain Aspect's experiments, inter-detector communication at light speed during the time between pair emission and detection was possible, but such communication between the time of fixing the detectors' settings and the time of detection was not. An experimental set-up without any such provision effectively becomes entirely "local", and therefore cannot rule out local realism. Additionally, the experiment design will ideally be such that the setting for each measurement, at both measurement stations, is not determined by any earlier event.
John Bell supported Aspect's investigation of this loophole^{[13]}(p. 109) and had some active involvement with the work, being on the examining board for Aspect's PhD. Aspect improved the separation of the sites and made the first attempt at genuinely independent random detector orientations. Weihs et al. improved on this with a separation on the order of a few hundred meters, in addition to using random settings retrieved from a quantum system.^{[14]} Scheidl et al. (2010) improved on this further by conducting an experiment between locations separated by 144 km.^{[15]}
Bell's derivation assumes that all observations are governed by a common hidden variable 'lambda'. Two-particle experiments, however, do not sample all settings jointly. To estimate the correlation when the two measurement devices have parameters 'a' and 'b', one sample of observations is taken. To estimate the correlation for parameters 'a' and 'c', a second sample is taken, and for 'b' and 'c' a third. These three correlations are then inserted into Bell's original inequality and found to violate it. But the statistics of sequential (disjoint) samples differ from the statistics of a single (joint) sample in which all of the parameters 'a', 'b' and 'c' are set once and not changed; that condition can only be met with three particles, not two. Bell's three-parameter inequality holds without ambiguity for three particles measured jointly, and three-particle joint correlations inserted into it will not violate it. The use of disjoint correlations in joint inequalities is claimed to be the cause of the inequality violation.
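The joint-sampling point can be illustrated numerically. In the toy model below (our own construction, not a description of any actual experiment), a single hidden variable fixes all three outcomes per run, the second side is perfectly anticorrelated, and all three correlations are estimated from the same sample; Bell's original inequality |E(a,b) − E(a,c)| ≤ 1 + E(b,c) then holds identically:

```python
import random

def joint_sample_correlations(n=10_000, seed=0):
    """Empirical correlations from a single (joint) sample of hidden triples.

    On each run, a hidden variable fixes outcomes A(a), A(b), A(c) in {+1, -1}
    on side 1; side 2 is perfectly anticorrelated, B(x) = -A(x).
    """
    rng = random.Random(seed)
    s_ab = s_ac = s_bc = 0
    for _ in range(n):
        A = {x: rng.choice((+1, -1)) for x in "abc"}
        B = {x: -A[x] for x in "abc"}
        s_ab += A["a"] * B["b"]
        s_ac += A["a"] * B["c"]
        s_bc += A["b"] * B["c"]
    return s_ab / n, s_ac / n, s_bc / n

E_ab, E_ac, E_bc = joint_sample_correlations()
# For a joint sample, Bell's original inequality is an algebraic identity:
assert abs(E_ab - E_ac) <= 1 + E_bc + 1e-12
```

When the three correlations instead come from three separately generated samples, this identity no longer holds term by term, which is the distinction the loophole argument turns on.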
The source is said to be "rotationally invariant" if all possible hidden variable values (describing the states of the emitted pairs) are equally likely. The general form of a Bell test does not assume rotational invariance, but a number of experiments have been analysed using a simplified formula that depends upon it. It is possible that there has not always been adequate testing to justify this. Even where, as is usually the case, the actual test applied is general, if the hidden variables are not rotationally invariant this can result in misleading descriptions of the results. Graphs may be presented, for example, of coincidence rate against the difference between the settings a and b, but if a more comprehensive set of experiments had been done it might have become clear that the rate depended on a and b separately. Cases in point may be Weihs' experiment (Weihs, 1998),^{[14]} presented as having closed the locality loophole, and Kwiat’s demonstration of entanglement using an “ultrabright photon source” (Kwiat, 1999).^{[16]}
In many experiments, especially those based on photon polarization, pairs of events in the two wings of the experiment are only identified as belonging to a single pair after the experiment is performed, by judging whether or not their detection times are close enough to one another. This generates a new possibility for a local hidden variables theory to "fake" quantum correlations: delay the detection time of each of the two particles by a larger or smaller amount depending on some relationship between hidden variables carried by the particles and the detector settings encountered at the measurement station. This loophole was noted by A. Fine in 1980 and 1981, by S. Pascazio in 1986, and by J. Larsson and R. D. Gill in 2004. It turns out to be more serious than the detection loophole, in that it gives local hidden variables more room to reproduce quantum correlations at the same effective experimental efficiency: the chance that particle 1 is accepted (coincidence loophole) or measured (detection loophole) given that particle 2 is detected.
The coincidence loophole can be ruled out entirely simply by working with a pre-fixed lattice of detection windows, short enough that most pairs of events occurring in the same window originate from the same emission, and long enough that a true pair is not separated by a window boundary.
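A fixed-lattice pairing scheme of this sort can be sketched as follows (illustrative code; the function name and the simplifying assumption of at most one event per window and side are ours):

```python
def pair_by_fixed_windows(times_1, times_2, window):
    """Pair detection events using a pre-fixed lattice of time windows.

    An event at time t falls into window k if k*window <= t < (k+1)*window;
    a pair is counted when both sides register an event in the same window
    (assuming at most one event per window on each side).
    """
    bins_1 = {int(t // window): t for t in times_1}
    bins_2 = {int(t // window): t for t in times_2}
    return [(bins_1[k], bins_2[k]) for k in sorted(bins_1.keys() & bins_2.keys())]

print(pair_by_fixed_windows([0.1, 5.2], [0.3, 9.9], window=1.0))  # [(0.1, 0.3)]
```

Because the window boundaries are fixed in advance, whether two events count as a pair cannot depend on hidden-variable-controlled detection delays, unlike pairing by relative time difference.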
In most experiments, measurements are repeatedly made at the same two locations. Under local realism, there could be effects of memory leading to statistical dependence between subsequent pairs of measurements. Moreover, physical parameters might be varying in time. It has been shown that, provided each new pair of measurements is done with a new random pair of measurement settings, neither memory nor time inhomogeneity can have a serious effect on the experiment.^{[17]}
In the case of Bell test experiments, if there are sources of error (that are not accounted for by the experimentalists) that might be of enough importance to explain why a particular experiment gives results in favor of quantum entanglement as opposed to local realism, they are called loopholes. Here some examples of existing and hypothetical experimental errors are explained. There are of course sources of error in all physical experiments. Whether or not any of those presented here have been found important enough to be called loopholes, in general or because of possible mistakes by the performers of some known experiment found in literature, is discussed in the subsequent sections. There are also non-optical Bell test experiments, which are not discussed here.
As a basis for our description of experimental errors, consider a typical experiment of CHSH type. The source is assumed to emit light in the form of pairs of particle-like photons, the two photons of each pair sent off in opposite directions. When photons are detected simultaneously (in reality during the same short time interval) on both sides of the "coincidence monitor", a coincident detection is counted. On each side of the coincidence monitor there are two inputs, named here the "+" and the "−" input. The individual photons must (according to quantum mechanics) make a choice and go one way or the other at a two-channel polarizer, so for each pair emitted at the source, ideally either the + or the − input on each side will detect a photon. The four possibilities can be categorized as ++, +−, −+ and −−. The number of simultaneous detections of each type (hereinafter N++, N+-, N-+ and N--) is counted over a timespan covering a number of emissions from the source. Then the following is calculated:
(1) E = (N++ + N-- − N+- − N-+)/(N++ + N-- + N+- + N-+).
This is done with polarizer a rotated into two positions a and a′, and polarizer b into two positions b and b′, so that we get E(a,b),E(a,b′),E(a′,b) and E(a′,b′). Then the following is calculated:
(2) S = E(a, b) − E(a, b′) + E(a′, b) + E(a′, b′)
Entanglement and local realism give different predicted values of S; thus the experiment (if there are no substantial sources of error) indicates which of the two theories better corresponds to reality.
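Equations (1) and (2) translate directly into code. The sketch below uses hypothetical coincidence counts chosen to sit near the quantum prediction at the optimal polarizer angles (the counts and function names are illustrative only, not data from any experiment):

```python
def correlation(n_pp, n_mm, n_pm, n_mp):
    """Equation (1): E = (N++ + N-- - N+- - N-+) / (N++ + N-- + N+- + N-+)."""
    return (n_pp + n_mm - n_pm - n_mp) / (n_pp + n_mm + n_pm + n_mp)

def chsh_s(counts):
    """Equation (2): S from the four correlations.

    `counts` maps a settings pair such as ("a", "b") to its coincidence
    counts in the order (N++, N--, N+-, N-+).
    """
    E = {pair: correlation(*c) for pair, c in counts.items()}
    return E[("a", "b")] - E[("a", "b'")] + E[("a'", "b")] + E[("a'", "b'")]

# Hypothetical counts giving E close to the quantum value of about +/-0.707
# for each of the four setting pairs.
counts = {
    ("a", "b"):   (427, 427, 73, 73),
    ("a", "b'"):  (73, 73, 427, 427),
    ("a'", "b"):  (427, 427, 73, 73),
    ("a'", "b'"): (427, 427, 73, 73),
}
print(chsh_s(counts))  # exceeds the local-realist bound of 2
```

With these counts each correlation is ±0.708, so S ≈ 2.83, close to the quantum maximum 2\sqrt{2}; a local realist model would keep S at or below 2.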
The principal possible errors in the light source are:
The experiment requires choice of the detectors' orientations. If this free choice were in some way denied then another loophole might be opened, as the observed correlations could potentially be explained by the limited choices of detector orientations. Thus, even if all experimental loopholes are closed, superdeterminism may allow the construction of a local realist theory that agrees with experiment.^{[18]}