But this conclusion [nonlocality] needs careful discussion in order to clarify what is going on.
Within the foundations of physics in recent years, Bell's theorem has played the role of what Thomas Kuhn calls a ‘paradigm’: that is, an exemplary piece of work that others learn from, imitate and develop. Following a period of articulation and consolidation, the first generation of developments of the Bell theorem was initiated by Heywood and Redhead (1983). They produced a nonlocality result in the algebraic style of the Bell–Kochen–Specker theorem (Bell 1966; Kochen and Specker 1967), moving away from the probabilistic relations characteristic of the Bell theorems proper. More recently, a second generation has developed results due to Peres (1990), Greenberger–Horne–Zeilinger (1990), and Hardy (1993). In addition to moving away from probabilities, this generation tries to dispense with the limiting inequalities of the Bell theorem, yielding so-called ‘Bell theorems without inequalities’. With respect to probabilities, however, Hardy's theorem is a half-way house: it requires no inequalities, but the result contradicts quantum mechanics under certain locality assumptions only if the statistical predictions of quantum mechanics hold in at least one case.
I want to examine the Hardy theorem and its interpretation. Initially, I intend to set aside the respects in which it dispenses with probabilities, because I want to point out the interesting significance of the theorem in a probabilistic context. We will see that when probabilities are restored, so are inequalities. Then we will see what the theorem has to contribute on the topic of locality.