In recent years there has been a huge increase in research and development in nanoscale science and technology. Central to understanding the properties of nanoscale structures is the modeling of electronic conduction through these systems. This graduate textbook provides an in-depth description of the transport phenomena relevant to systems of nanoscale dimensions. The different theoretical approaches are critically discussed, with emphasis on their basic assumptions and approximations. The book also covers the information content in the measurement of currents, the role of initial conditions in establishing a steady state, and the modern use of density-functional theory. Topics are introduced by simple physical arguments, with particular attention to the non-equilibrium statistical nature of electrical conduction, and are followed by detailed formal derivations. The book is ideal for graduate students in physics, chemistry, and electrical engineering.
The first realization that the validity of the quantum superposition principle in the Hilbert space describing a composite quantum system may give rise to fundamentally new correlations between the constituent subsystems came in the landmark 1935 paper by Einstein, Podolsky, and Rosen (EPR). There it was shown that the measurement statistics of observables in certain quantum states cannot be reproduced by assigning definite wavefunctions to individual subsystems. It was in response to the EPR paper that Schrödinger, in the same year, coined the term entanglement (Verschränkung) to acknowledge the failure of classical intuition in describing the relationship between the “parts” and the “whole” in the quantum world:
Whenever one has a complete expectation catalog – a maximum total knowledge – a ψ function – for two completely separated bodies, …then one obviously has it also for the two bodies together. But the converse is not true. The best possible knowledge of a total system does not necessarily include total knowledge of all its parts, not even when these are fully separated from each other and at the moment are not influencing each other at all.
While Bell's strengthening of the original EPR-paradox setting and the subsequent experimental verification of Bell inequalities irreversibly changed the perception of entanglement from a property of counterintuitive “spookiness” to (beyond reasonable doubt) an experimental reality, the concept and implications of entanglement continue to be associated with a host of physical, mathematical, and philosophical challenges. In particular, investigation of entanglement in both its qualitative and quantitative aspects has intensified under the impetus of quantum information science (QIS).
Discussions of quantum-computational algorithms in the literature refer to various features of quantum mechanics as the source of the exponential speed-up relative to classical algorithms: superposition and entanglement, the fact that the state space of n bits is a space of 2^n states while the state space of n qubits is a space of 2^n dimensions, the possibility of computing all values of a function in a single computational step by “quantum parallelism,” or the possibility of an efficient implementation of the discrete quantum Fourier transform. Here I propose a different answer to the question posed in the title, in terms of the difference between classical logic and quantum logic, i.e., the difference between the Boolean classical event structure and the non-Boolean quantum event structure. In a nutshell, the ultimate source of the speed-up is the difference between a classical disjunction, which is true (or false) in virtue of the truth values of the disjuncts, and a quantum disjunction, which can be true (or false) even if none of the disjuncts is either true or false.
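The contrast between 2^n classical states and a 2^n-dimensional quantum state space can be made concrete with a short numpy sketch (an illustration of mine, not part of the chapter): a register of n classical bits occupies exactly one of its 2^n configurations, whereas n qubits prepared by Hadamard gates form a single unit vector that weights all 2^n basis states at once.

```python
import numpy as np

n = 3  # number of (qu)bits

# Classical: n bits have 2**n possible configurations, but a definite
# register holds exactly one of them at any time.
classical_states = [format(i, f"0{n}b") for i in range(2**n)]
assert len(classical_states) == 2**n

# Quantum: an n-qubit state is a unit vector in a 2**n-dimensional
# complex space; Hadamards on |0...0> weight every basis state equally.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
state = np.array([1.0])                       # build the register qubit by qubit
for _ in range(n):
    state = np.kron(state, H @ np.array([1.0, 0.0]))

print(state.shape)                            # (8,): dimension 2**n
print(np.allclose(state, 1 / np.sqrt(2**n)))  # True: uniform superposition
```

The point of the sketch is only the exponential growth of the vector's dimension; it says nothing by itself about speed-up, which is the chapter's question.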
In the following, I will discuss the information-processing in Deutsch's XOR algorithm (the first genuinely quantum algorithm) and related period-finding quantum algorithms (Simon's algorithm and Shor's factorization algorithm). It is well known that these algorithms can be formulated as solutions to a hidden-subgroup problem. Here the salient features of the information-processing are presented from the perspective of the way in which the algorithms exploit the non-Boolean logic represented by the projective geometry (the subspace structure) of Hilbert space.
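As a concrete sketch of the one-query structure of Deutsch's XOR algorithm (my own illustration, not taken from the chapter), the following numpy simulation runs the standard circuit |0⟩|1⟩ → H⊗H → U_f → H⊗I and reads the first qubit, which distinguishes constant from balanced f in a single oracle call; the helper names are mine.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)

def oracle(f):
    """U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron(np.array([1, 0]), np.array([0, 1]))  # |0>|1>
    state = np.kron(H, H) @ state                        # superpose both inputs
    state = oracle(f) @ state                            # single query to f
    state = np.kron(H, I) @ state                        # interfere
    p1 = state[2] ** 2 + state[3] ** 2   # probability first qubit reads 1
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant
print(deutsch(lambda x: x))      # balanced
```

Classically, deciding the same question requires evaluating f twice; the interference step is where the non-Boolean structure discussed above does its work.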
Indefinite causal structure poses particular problems for theory formulation since many of the core ideas used in the usual approaches to theory construction depend on having definite causal structure. For example, the notion of a state across space evolving in time requires that we have some definite causal structure so we can define a state on a space-like hypersurface. We will see that many of these problems are mitigated if we are able to formulate the theory in a formalism-local (or F-local) fashion. A formulation of a physical theory is said to be F-local if, in making predictions for any given arbitrary space-time region, we need only refer to mathematical objects pertaining to that region. This is a desirable property both on the grounds of efficiency and since, if we have indefinite causal structure, it is not clear how to select some other space-time region on which our calculations may depend. The usual ways of formulating physical theories (the time-evolving state picture, the histories approach, and the local-equations approach) are not F-local.
We set up a framework for probabilistic theories with indefinite causal structure. This, the causaloid framework, is F-local. We describe how quantum theory can be formulated in the causaloid framework (in an F-local fashion). This provides yet another formulation of quantum theory. This formulation, however, may be particularly relevant to the problem of finding a theory of quantum gravity. The problem of quantum gravity is to find a theory that reduces in appropriate limits to general relativity and quantum theory (including, at least, those situations in which those two theories have been experimentally confirmed).
The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parameterized probabilities?”
Here, we review and augment recent proofs that any given parameterized probabilities can be generated in very diverse ways, so that a parameterized probability measure, detached from any of the (infinitely many) parameterized operators that generate it, becomes an interesting object in its own right. By detaching a parameterized probability measure from the operators that may have led us to it, we (1) strengthen Holevo's bound on a quantum communication channel and (2) clarify a role for multiple levels of modeling in an example based on quantum key distribution. We then inquire into some parameterized probability measures generated by entangled states and into the topology of the associated parameter spaces; in particular we display some previously overlooked topological features of level sets of these probability measures.
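To make the detachment of a parameterized probability measure from its generating operators concrete, here is a minimal numpy sketch (my own illustration under the Born rule p = Tr(ρE), not from the chapter) in which the same one-parameter family of probabilities is produced by two quite different parameterized operator assignments: a parameterized state with fixed detectors, and a fixed state with parameterized detectors.

```python
import numpy as np

def probs_state_parameterized(theta):
    """Parameterize the density operator; keep the detectors fixed."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    rho = np.outer(psi, psi)                       # rho(theta) = |psi><psi|
    E0 = np.diag([1.0, 0.0])                       # fixed detection operators
    E1 = np.diag([0.0, 1.0])
    return np.trace(rho @ E0), np.trace(rho @ E1)  # Born rule Tr(rho E)

def probs_detector_parameterized(theta):
    """Same probabilities from a fixed state and rotated detectors."""
    rho = np.diag([1.0, 0.0])                      # fixed state |0><0|
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    v0 = np.array([c, s])                          # rotated measurement basis
    v1 = np.array([-s, c])
    E0, E1 = np.outer(v0, v0), np.outer(v1, v1)
    return np.trace(rho @ E0), np.trace(rho @ E1)

theta = 0.8
print(probs_state_parameterized(theta))    # equals the pair below
print(probs_detector_parameterized(theta))
```

Both assignments generate the measure (cos²(θ/2), sin²(θ/2)); an experiment that only tallies outcomes against this measure cannot tell them apart, which is the sense in which the measure is an object in its own right.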
In many situations, learning from the results of measurements can be regarded as updating one's probability distributions over certain variables. According to Bayesians, this updating should be carried out according to the rule of conditionalization. In the theory of quantum mechanics, there is a rule that tells us how to update the state of a system, given observation of a measurement result. The state of a quantum system is closely related to probability distributions over potential measurements. Therefore we might expect there to be some relation between Bayesian conditionalization and the quantum state-update rule. There have been several suggestions that the state change just is Bayesian conditionalization, appropriately understood, or that it is closely analogous.
Bub was the first to make the connection between quantum measurement and Bayesian conditionalization in a 1977 paper, using an approach based on quantum logic. The connection is renewed in discussions by Fuchs and also Jacobs in 2002, where again the analogy between the quantum state update and Bayesian conditionalization is pointed out. At the same time, Fuchs draws attention to a disanalogy – namely that there is an “extra unitary” transformation as part of the measurement in the quantum case. In this chapter, I will first review the proposals of Bub, Jacobs, and Fuchs. I will then show that the presence of the extra unitaries in quantum measurement leads to a difference between classical and quantum measurement in terms of information gain, drawing on results by Nielsen and Fuchs and Jacobs.
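The analogy and the disanalogy can both be displayed in a few lines of numpy (a minimal sketch of mine, assuming a two-outcome measurement whose operator is diagonal in the state's own basis): with a bare measurement operator M = √E, the Lüders-style update ρ → MρM†/p reproduces Bayesian conditionalization on the diagonal of ρ, while inserting an extra unitary, M = U√E, rotates the post-measurement state away from the Bayesian posterior.

```python
import numpy as np

# Classical Bayesian conditionalization on a two-valued variable.
prior = np.array([0.7, 0.3])           # p(x)
likelihood = np.array([0.9, 0.2])      # p(data | x)
posterior = prior * likelihood / (prior @ likelihood)

# Quantum analogue: Lüders update rho -> M rho M† / p with M = sqrt(E)
# diagonal in the same basis as the (classical-looking) state.
rho = np.diag(prior)                   # diagonal density operator
M = np.diag(np.sqrt(likelihood))       # bare measurement operator
p = np.trace(M @ rho @ M.conj().T).real
rho_post = M @ rho @ M.conj().T / p

print(np.allclose(np.diag(rho_post).real, posterior))  # True: pure Bayes

# The disanalogy: the general update allows M = U sqrt(E), and the
# "extra unitary" U has no classical counterpart. A bit-flip U moves
# the state off the Bayesian posterior.
U = np.array([[0, 1], [1, 0]])
rho_post_U = U @ rho_post @ U.conj().T
print(np.allclose(np.diag(rho_post_U).real, posterior[::-1]))  # True
```

The numbers here are arbitrary; the point is structural, matching Fuchs's observation that quantum measurement is Bayesian conditionalization plus a further unitary readjustment.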
For Abner Shimony. Your influence on me goes well beyond physics. Knowing you and being close to you is one of the greatest privileges and pleasures in my life.
Introduction
Quantum mechanics is, without any doubt, a tremendously successful theory: it started by explaining black-body radiation and the photoelectric effect, it explained the spectra of atoms, and then went on to explain chemical bonds, the structure of atoms and of the atomic nucleus, the properties of crystals and the elementary particles, and a myriad of other phenomena. Yet it is safe to say that we still lack a deep understanding of quantum mechanics – surprising and even puzzling new effects continue to be discovered with regularity. That we are surprised and puzzled is the best sign that we still don't understand; however, the veil over the mysteries of quantum mechanics is starting to lift a little.
One of the strangest things microscopic particles do is to follow non-local dynamics and to yield non-local correlations. That particles follow non-local equations of motion was discovered by Aharonov and Bohm, while non-local correlations – which are the subject of this chapter – were discovered by John Bell and first cast in a form that has physical meaning, i.e., that can be experimentally tested, by Clauser, Horne, Shimony, and Holt. When they were discovered, both phenomena seemed to be quite exotic and at the fringe of quantum mechanics. By now we understand that they are some of the most important aspects of quantum-mechanical behavior.
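The experimentally testable form mentioned above is the CHSH combination S = E(a,b) + E(a,b′) + E(a′,b) − E(a′,b′), bounded by |S| ≤ 2 for any local model but reaching 2√2 for the singlet state at the standard optimal angles. A small numpy check (my own illustration, not from the chapter):

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)

def spin(angle):
    """Spin observable along an angle in the x-z plane (eigenvalues ±1)."""
    return np.array([[np.cos(angle), np.sin(angle)],
                     [np.sin(angle), -np.cos(angle)]])

def E(a, b):
    """Correlation <psi| A(a) ⊗ B(b) |psi> for the singlet."""
    return psi @ np.kron(spin(a), spin(b)) @ psi

# CHSH with the standard optimal measurement angles
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(S)  # -2*sqrt(2): |S| exceeds the local bound of 2
```

For the singlet the correlations obey E(a, b) = −cos(a − b), and no assignment of local pre-existing values can match all four terms at once, which is exactly what the CHSH test probes.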
Quantum information science is about the processing of information by the exploitation of some distinguishing features of quantum systems, such as electrons, photons, and ions. In recent years a lot has been promised in the domain of quantum information. In quantum computing it was promised that NP problems would be solved in polynomial time. In quantum cryptography there were claims that protocols would have practically 100% security. At the moment it is too early to say anything definitive regarding the final results of this great project.
In quantum computing, a few quantum algorithms were found and prototype devices – “quantum pre-computers” with a few quantum registers – were created. However, difficulties could no longer be ignored. For some reason it proved impossible to create numerous quantum algorithms applicable to a variety of problems. Up to now the whole project has rested on two or three types of algorithm, and among them only one, the algorithm for prime factorization, might be interesting for real-world applications. There is a general tendency to regard this situation with quantum algorithms as a temporary difficulty. But, as the years pass, one might start to think that something is fundamentally wrong. Developments in quantum hardware induce the same feelings: the complexity of building a device with a large number N of quantum registers seems to grow extremely non-linearly with increasing N. In quantum cryptography the situation is the opposite of that in quantum computing: there have been tremendous successes in developing technologies for the production and transmission of quantum information, especially pairs of entangled photons.
Quantum information theory is the study of how the peculiar features of quantum mechanics can be exploited for the purposes of information processing and transmission. A central theme of such a study is the ways in which quantum mechanics opens up possibilities that go beyond what can be achieved classically. This has in turn led to a renewed interest in, and a new perspective on, the differences between the classical and the quantum. Although much of the work along these lines has been motivated by quantum information theory – and some of it has been motivated by the conviction that quantum theory is essentially about possibilities of information processing and transmission – the results obtained, and the frameworks developed, have interest even for those of us who are not of that conviction. Indeed, much of the recent work echoes, and builds upon, work that predates the inception of quantum information theory. The significance of such work extends beyond the setting of quantum information theory; the work done on distinguishing the quantum from the classical in the context of frameworks that embrace both is something worthy of the attention of anyone interested in the foundational issues surrounding quantum theory.
One of the striking features of quantum mechanics lies in its probabilistic character. A quantum state yields, not a definite prediction of the outcome of an experiment, but a probability measure on the space of possible outcomes. Of course, probabilities occur also in a classical context.
We live, we are told, in an information age. We are told this, perhaps, less often than once we were; but no doubt only because the phrase has become worn from use. If ours is an age of information, then quantum information theory is a field propitiously in tune with the spirit of the times: a rich and sophisticated physical theory that seeks to tame quantum mysteries (no less!) and turn them to ingenious computational and communication ends. It is a theory that hints, moreover, at the possibility of finally rendering the quantum unmysterious; or at least this is a conclusion that many have been tempted to draw.
Yet, for all its timeliness, some of the most intriguing of the prospects that quantum information science presents are to be found intertwining with some surprisingly old and familiar philosophical themes. These themes are immaterialism and instrumentalism; and in this chapter we shall be exploring how these old ideas feature in the context of two of the most tantalizing new questions that have arisen with the advent of this field. Does quantum information theory finally help us to resolve the conceptual conundrums of quantum mechanics? And does the theory indicate a new way of thinking about the world – one in which the material as the fundamental subject matter of physical theory is seen to be replaced by the immaterial: information?
Many philosophers and physicists have expressed great hope that quantum information theory will help us understand the nature of the quantum world. The general problem is that there is no widespread agreement on what quantum information is. Hence, such pronouncements regarding quantum information theory as the savior of the philosophy of physics are hard to evaluate. Much work has been done producing and evaluating concepts of information.
In earlier work I have defended and articulated the Schumacher concept of quantum information. Roughly speaking, quantum information is construed as the statistical behavior associated with the measurement of a quantum system. Hence it is a coarse-grained operational description of quantum systems, with no recourse to the fundamental ontological features of quantum systems responsible for such behavior. From this perspective, construing quantum mechanics as a theory of quantum information departs from the traditional interpretive endeavors of philosophers and physicists. The question is whether there is any motivation for taking such a view.
The theorem of Clifton, Bub, and Halvorson (CBH) provides just such a motivation. The theorem guarantees that, if a theory T satisfies certain conditions, there will exist an empirically equivalent C*-algebraic theory that has a concrete representation in Hilbert space, which it is notoriously difficult to interpret as a constructive or mechanical theory. In such a case, any underlying ontologies philosophers develop that are compatible with T will be undermined by the C*-algebraic equivalent. Bub suggests in light of this in-principle uncertainty regarding ontology that we re-conceive of quantum mechanics as a theory about quantum information.
Recently there has emerged an exciting and rapidly growing field of research known as quantum information theory. This interdisciplinary field is unified by the following two goals: first, the possibility of harnessing the principles and laws of quantum mechanics to aid in the acquisition, transmission, and processing of information; and second, the potential that these new technologies have for deepening our understanding of the foundations of quantum mechanics and computation. Many of the new technologies and discoveries emerging from quantum information theory are challenging the adequacy of our old concepts of entanglement, non-locality, and information. This research suggests that the time is ripe for a reconsideration of the foundations – and philosophical implications – of quantum information theory.
Historically, apart from a small group of physicists working on foundational issues, it was philosophers of physics who recognized the importance of the concepts of entanglement and non-locality long before the mainstream physics community. Prior to the 1980s, discussions of the infamous “EPR” paper and John Bell's seminal papers on quantum non-locality were carried out more often by such philosophers than by ordinary physicists. In the 1990s that situation rapidly changed, once the larger community of physicists had begun to realize that entanglement and non-locality were not just quirky features of quantum mechanics, but physical resources that could be harnessed for the performance of various practical tasks. Since then, a large body of literature has emerged in physics, revealing many new dimensions to our concepts of entanglement and non-locality, particularly in relation to information. Regrettably, however, only a few philosophers have followed these more recent developments, and many philosophical discussions still end with Bell's work.