We trace how a theorist might have discovered Special Relativity as an inevitable consequence of the Maxwell theory, as was probably the case with the pioneers, including Einstein. After rewriting the Maxwell equations in a manifestly relativistic form, we arrive at the Lorentz transformation and relativistic free particles. Along the way, we bypass much of the confusing discussion of Lorentz contraction, time dilation, and the so-called Twin Paradox, focusing instead on the proper time as the only absolute measure of time.
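For reference, the boost transformation arrived at in this chapter may be written in its standard form, in units with c = 1, together with the proper time that the chapter singles out as the absolute measure:

```latex
% Lorentz boost along x with relative velocity v (units c = 1)
t' = \gamma\,(t - v\,x), \qquad
x' = \gamma\,(x - v\,t), \qquad
\gamma = \frac{1}{\sqrt{1 - v^2}},
% the invariant proper time along a worldline
d\tau^2 = dt^2 - dx^2 - dy^2 - dz^2 .
```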
Canonical quantization of matter fields admits a surprisingly simple extension into curved spacetime as long as there exists a suitable time foliation. The main conceptual difficulty arises when multiple time foliations compete, with nontrivial Bogoliubov transformations mixing up the notions of particle and antiparticle. With the Minkowski spacetime written in the Rindler coordinates as a prototype, we explore how various distinct vacua appear and how to choose one based on physics considerations. For the eternal black hole geometry, a smooth event horizon demands the Hartle–Hawking vacuum, while, for black holes formed by gravitational collapse, Hawking's radiation vacuum naturally emerges. After a brief stop at black hole thermodynamics, we close the volume with a simple observation of how all these are connected to the primordial density perturbation of the cosmic inflation scenario.
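The central formulas here take a compact standard form. A Bogoliubov transformation relates two mode expansions of the same field, and a nonvanishing β coefficient means the two vacua disagree on particle content; the associated temperatures, in units ħ = c = k_B = G = 1, are those of Unruh and Hawking:

```latex
% Bogoliubov transformation between two sets of mode operators
b_k = \sum_{k'} \left( \alpha_{kk'}\, a_{k'} + \beta_{kk'}^{*}\, a_{k'}^{\dagger} \right),
\qquad \beta \neq 0 \;\Rightarrow\; \text{the } a\text{-vacuum contains } b\text{-particles},
% Unruh temperature for acceleration a; Hawking temperature for mass M
T_{\rm Unruh} = \frac{a}{2\pi}, \qquad
T_{\rm Hawking} = \frac{1}{8\pi M} .
```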
We finally come to the question of why the black hole horizon is said to allow only one-way traffic. When viewed in the Kruskal coordinates, suitable for freely falling observers, the horizon consists of several distinct causal components. The future event horizon is the one we usually refer to when describing the one-way nature of the black hole geometry; its "past" cousin allows the opposite flow of trajectories but is often an artifact of the "eternal" geometry. We derive and display Penrose diagrams for many of the solutions accumulated so far and offer cautionary tales on causal structures and singularities.
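In the standard Kruskal form of the Schwarzschild geometry, in units G = c = 1, the metric is manifestly regular at r = 2M, and the horizon splits into the future branch T = X ≥ 0 and the past branch T = −X ≥ 0:

```latex
% Schwarzschild metric in Kruskal coordinates (T, X), units G = c = 1
ds^2 = \frac{32 M^3}{r}\, e^{-r/2M} \left( -dT^2 + dX^2 \right) + r^2\, d\Omega^2 ,
\qquad
T^2 - X^2 = \left( 1 - \frac{r}{2M} \right) e^{r/2M} ,
```

with r(T, X) defined implicitly by the second relation and the curvature singularity sitting at T² − X² = 1.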
This chapter introduces topological quantum computation (TQC), a model using non-Abelian anyons, specifically Fibonacci anyons, for information processing via braiding operations. The braid group and fusion rules are central to TQC, enabling operations that remain robust against certain environmental errors. TQC provides inherent fault tolerance, reducing susceptibility to local disturbances. The chapter concludes by examining the challenges and future potential of topological models, marking TQC as a promising, albeit complex, path toward scalable and robust quantum computing solutions.
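The Fibonacci fusion rules mentioned above, τ × τ = 1 + τ and τ × 1 = τ, already determine how fast the computational space grows. A minimal sketch (the function name `fusion_space_dim` is illustrative, not from the text) counts fusion trees of n τ anyons with total charge vacuum by tracking the running intermediate charge:

```python
# Count fusion paths of n Fibonacci anyons (tau) with total charge vacuum.
# Fusion rules: tau x tau = 1 + tau, and 1 x tau = tau.
# The intermediate charge after each fusion labels a basis state, so the
# number of labelings is the dimension of the topological Hilbert space.

def fusion_space_dim(n):
    """Dimension of the fusion space of n tau anyons fusing to the vacuum."""
    # paths[c] = number of fusion trees whose running charge is c
    paths = {"1": 0, "tau": 1}          # start from a single tau
    for _ in range(n - 1):              # fuse in one more tau each step
        paths = {
            "1": paths["tau"],                    # tau x tau -> 1
            "tau": paths["1"] + paths["tau"],     # 1 x tau -> tau, tau x tau -> tau
        }
    return paths["1"]

# The dimensions follow the Fibonacci sequence, growing like the golden ratio:
print([fusion_space_dim(n) for n in range(2, 9)])  # [1, 1, 2, 3, 5, 8, 13]
```

This exponential growth of the protected subspace is what makes braiding in the Fibonacci model computationally universal.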
Although the geometric background from Chapter 3 through the early part of Chapter 6 is self-contained, we attach this appendix to make contact with the more modern language of differential geometry. Fiber bundles can be seen as a natural generalization of (co)tangent bundles and allow us to introduce the notions of connection and curvature in a more invariant manner via the principal bundle. This, in turn, leads to the frame bundle and the spinor bundle, which were implicitly invoked in Chapter 5. A quick overview of G-structures and the holonomy classification is followed by how one must deal with spinors in curved spacetime. Although this last part is not used in this volume, it becomes an essential tool in the companion book "Geometric Quantum Field Theories."
The simplest class of solutions to the Einstein equation is that of the expanding universe with homogeneous and isotropic spatial slices. This chapter covers the most basic aspects of the resulting FLRW cosmology, with most examples centered on flat spatial slices. After a standard treatment of the expansion of the universe, dominated by ideal fluids, we turn to various puzzles of old-fashioned Big Bang cosmology that all revolve around causality and the initial condition. These puzzles are addressed handsomely by the cosmic inflation scenario that wipes out the initial data, repopulates the universe with matter and radiation, and then also seeds the primordial density perturbation. One puzzle that survives this reinitialization is that of dark energy, and we close with various opinions on the latter, including a Keplerian evasion of the problem via the cosmological landscape.
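The flat-slice case that dominates the examples is governed by the standard FLRW metric and the first Friedmann equation, in units c = 1:

```latex
% Flat FLRW metric and the first Friedmann equation (units c = 1)
ds^2 = -dt^2 + a(t)^2 \left( dx^2 + dy^2 + dz^2 \right),
\qquad
H^2 \equiv \left( \frac{\dot a}{a} \right)^2 = \frac{8\pi G}{3}\, \rho ,
```

where a(t) is the scale factor and ρ the total energy density of the ideal fluids driving the expansion.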
This chapter delves into topological order, a phase of matter with implications for quantum computation. The ℤ2 toric code model is introduced, using lattice arrangements of qubits to demonstrate topological protection against errors. Anyons, particles exhibiting unique exchange statistics, are utilized for encoding information through braiding operations. Surface codes are discussed as practical implementations of topological error correction, leveraging topological entanglement entropy to protect quantum information. This approach provides a highly resilient framework for quantum error correction, essential for developing fault-tolerant quantum computers with intrinsic stability against certain types of errors.
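The topological protection of the toric code rests on the fact that its vertex (star) and plaquette stabilizers commute, because any star and plaquette overlap on an even number of qubits. A minimal numerical sketch (a six-qubit toy arrangement, not the full lattice of the text) checks this directly:

```python
# A minimal check, on six qubits, that a toric-code star operator (a product
# of Pauli X's) commutes with a plaquette operator (a product of Pauli Z's)
# when they share an even number of qubits -- here, qubits 2 and 3.
import numpy as np

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def op(paulis):
    """Tensor product of single-qubit operators, one factor per qubit."""
    out = np.array([[1.0]])
    for p in paulis:
        out = np.kron(out, p)
    return out

A_star      = op([X, X, X, X, I, I])   # X on qubits 0, 1, 2, 3
B_plaquette = op([I, I, Z, Z, Z, Z])   # Z on qubits 2, 3, 4, 5

# X and Z anticommute on each shared qubit; two sign flips cancel.
print(np.allclose(A_star @ B_plaquette, B_plaquette @ A_star))  # True
```

Because all stabilizers commute, they can be measured simultaneously, which is what makes the syndrome extraction of surface codes possible.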
The Einstein equation is reproduced by Hilbert's action principle, with the action taken as a functional of the metric alone or of the metric and the connection. We list three related approaches, distinct in detail, with the common outcome of the Einstein equation. One unusual aspect of this action principle is the introduction of the Gibbons–Hawking–York boundary term. We give a detailed description of the extrinsic curvature for this purpose and derive the boundary term. The action principle is advantageous in that it produces a clean derivation of the symmetric energy–momentum tensor on the right-hand side of the Einstein equation. The last part of this chapter addresses how this Hilbert energy–momentum tensor of the Einstein equation is inevitably the same as the Noether one, contrary to popular lore.
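Up to sign and normalization conventions, the gravitational action with the Gibbons–Hawking–York boundary term takes the standard form

```latex
% Einstein-Hilbert action with the Gibbons-Hawking-York boundary term
S = \frac{1}{16\pi G} \int_{M} d^4x\, \sqrt{-g}\, R
  \;+\; \frac{1}{8\pi G} \oint_{\partial M} d^3x\, \sqrt{|h|}\, K ,
```

where h is the induced metric on the boundary ∂M and K the trace of its extrinsic curvature; the boundary term cancels the normal-derivative variations of the metric that the bulk term alone leaves behind.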
Once a differentiable manifold is given, one can equip it with an affine connection, or covariant derivative. Demanding further that the affine connection preserve a metric singles out the Levi-Civita connection, often expressed via the Christoffel symbols already encountered in relativistic particle mechanics. This, in turn, defines the Riemann curvature tensor and the Ricci tensor. Numerous additional structures that follow from the covariantly constant metric are introduced, such as raising and lowering of indices, Killing vector fields, the volume form, the Hodge star map, geodesics, and geodesic normal coordinates.
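In the standard notation, the Levi-Civita connection and the geodesics it defines read

```latex
% Christoffel symbols of the Levi-Civita connection
\Gamma^{\lambda}{}_{\mu\nu}
  = \frac{1}{2}\, g^{\lambda\sigma}
    \left( \partial_{\mu} g_{\sigma\nu} + \partial_{\nu} g_{\sigma\mu}
           - \partial_{\sigma} g_{\mu\nu} \right),
% geodesic equation for an affinely parameterized curve x(s)
\qquad
\ddot{x}^{\lambda} + \Gamma^{\lambda}{}_{\mu\nu}\, \dot{x}^{\mu} \dot{x}^{\nu} = 0 ,
```

the first following uniquely from metric compatibility and the vanishing of torsion.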
This chapter examines quantum decoherence, a process by which quantum information is lost due to environmental interactions. Various noise channels, such as bit-flip, phase-flip, and depolarizing channels, are discussed to illustrate common errors in qubit states. The Kraus representation and Lindblad equation offer frameworks for modeling these interactions. Metrics such as T1 (relaxation time) and T2 (decoherence time) are introduced to measure qubit stability. Understanding decoherence mechanisms is critical for developing strategies to preserve quantum information, laying the groundwork for quantum error correction techniques and highlighting the challenges in creating reliable quantum systems.
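The bit-flip channel mentioned above has a two-operator Kraus representation. A minimal sketch (the flip probability p = 0.1 is an illustrative choice) verifies trace preservation and shows the channel mixing an X error into a pure state:

```python
# Bit-flip channel in the Kraus representation: with probability p an X
# (bit-flip) error occurs. The channel is rho -> E0 rho E0+ + E1 rho E1+,
# and trace preservation requires E0+E0 + E1+E1 = I.
import numpy as np

p = 0.1                                  # illustrative flip probability
I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
E0 = np.sqrt(1 - p) * I
E1 = np.sqrt(p) * X

def bit_flip(rho):
    """Apply the bit-flip channel to a density matrix rho."""
    return E0 @ rho @ E0.conj().T + E1 @ rho @ E1.conj().T

# Completeness (trace preservation) check:
print(np.allclose(E0.conj().T @ E0 + E1.conj().T @ E1, I))  # True

# Acting on |0><0| mixes in |1><1| with weight p:
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
print(bit_flip(rho0))                    # diagonal (1 - p, p)
```

The phase-flip and depolarizing channels follow the same pattern with Z, or all three Paulis, in place of X.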
This chapter covers quantum error correction, essential for preserving quantum information in the presence of noise. It introduces the bit-flip and phase-flip codes as foundational error-correction methods, building toward Shor’s code, which corrects general single-qubit errors. Logical qubits are formed by encoding physical qubits to maintain stability. Stabilizer codes are presented as a systematic framework for error correction, enabling fault-tolerant quantum computing. These principles are crucial for creating scalable quantum systems that can perform reliable computations, even in noisy environments, addressing a central challenge in quantum computing’s practical implementation.
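Against pure bit-flip noise, the three-qubit code can be simulated classically. A minimal sketch (function names are illustrative) encodes a logical bit into three copies, reads the two parity syndromes that mimic the stabilizers Z₀Z₁ and Z₁Z₂, and corrects a single flip:

```python
# A classical sketch of the three-qubit bit-flip code: each logical bit is
# encoded as three copies; a single bit-flip error is located by two parity
# (syndrome) checks and undone, and decoding is a majority vote.

def encode(bit):
    return [bit, bit, bit]

def syndrome(block):
    # parities of (q0, q1) and (q1, q2), mimicking stabilizers Z0Z1 and Z1Z2
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block):
    flips = {(1, 0): 0, (1, 1): 1, (0, 1): 2}   # syndrome -> flipped qubit
    s = syndrome(block)
    if s in flips:
        block[flips[s]] ^= 1
    return block

def decode(block):
    return int(sum(block) >= 2)                 # majority vote

block = encode(1)
block[2] ^= 1                    # a single bit-flip error on qubit 2
print(decode(correct(block)))    # 1 -- the logical bit is recovered
```

Shor's code concatenates this construction with its phase-flip counterpart to handle arbitrary single-qubit errors.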
This chapter explores classical computation fundamentals, starting with Turing machines as a foundation for defining computability. The universal Turing machine is introduced, emphasizing the theoretical basis for all computable functions. Computational complexity is discussed, differentiating between tractable and intractable problems and explaining complexity classes as a framework for problem-solving. The chapter also covers the circuit model, providing a bridge between theoretical constructs and modern computer architecture. Finally, the concept of reversible computation is introduced, which has implications for energy-efficient processing. Through these topics, the chapter delineates classical computation’s limitations, setting up the motivation to transition into quantum approaches in subsequent chapters.
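Reversible computation can be made concrete with the Toffoli (CCNOT) gate, a standard universal reversible gate: it flips its target bit only when both control bits are 1, and it is its own inverse, so no information is erased. A minimal sketch:

```python
# The Toffoli (CCNOT) gate on three bits: flip the target c only when both
# controls a and b are 1. Being its own inverse, it loses no information.
from itertools import product

def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

# Applying the gate twice returns every one of the 8 inputs unchanged:
print(all(toffoli(*toffoli(a, b, c)) == (a, b, c)
          for a, b, c in product((0, 1), repeat=3)))  # True
```

Since every output determines its input uniquely, circuits built from such gates avoid the thermodynamic cost that Landauer's principle attaches to erasing bits, which is the link to energy-efficient processing noted above.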