Computation is a process of performing a large number of simple operations. The main building blocks of classical computers are bits, each of which can take either the value 0 or 1. A series of gates can change these values by acting on them selectively. This process is structured to fulfil a purpose. For example, if we are interested in adding two numbers, the numbers are encoded into bits and gates are applied such that the final state of the bits reveals the desired answer: a number which is the sum of the initial ones. While it is possible to build algorithms that can perform almost any computation, it is also desirable to find the answer within a reasonable length of time or with a reasonable amount of resources. There is an ever-growing demand for computational power for both scientific and commercial purposes. Indeed, it is surprisingly easy to find an application that can jam even the fastest computer. This fuels a vast effort in information science research to increase the speed and processing power of computers.
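The addition example above can be made concrete. The following is a minimal sketch (the function names are our own, chosen for this illustration) of a classical ripple-carry adder, which builds the sum of two numbers purely out of bit-level gate operations:

```python
# Illustrative sketch: adding two numbers by acting on bits with simple
# gate operations (XOR, AND, OR), as in a classical ripple-carry adder.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Add two bits plus a carry bit using only elementary gates."""
    s = a ^ b ^ carry_in                         # sum bit (XOR gates)
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit (AND/OR gates)
    return s, carry_out

def add_numbers(x: int, y: int, n_bits: int = 8) -> int:
    """Encode x and y into bits and apply gates; the final bit state encodes x + y."""
    carry = 0
    result = 0
    for i in range(n_bits):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add_numbers(13, 29))  # → 42
```

Each pass through the loop acts on one pair of bits, so the whole computation is indeed a structured sequence of simple gate operations.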
Modern computational models are based on the universal Turing machine (Turing, 1937). This is a theoretical information processing model that employs the elementary gate processes described above. It can efficiently simulate any other device capable of performing an algorithmic process. Since the introduction of the Turing machine, physics has influenced computation in many ways.
In this chapter, we consider Kitaev's honeycomb lattice model (Kitaev, 2006). This is an analytically tractable spin model that gives rise to quasiparticles with Abelian as well as non-Abelian statistics. Some of its properties are similar to the fractional quantum Hall effect, which has been studied experimentally in great detail even though it evades exact analytical treatment (Moore and Read, 1991). Due to its simplicity, the honeycomb lattice model is likely to be the first topological spin model to be realised in the laboratory, e.g., with optical lattice technology (Micheli et al., 2006). Understanding its properties can facilitate its physical realisation and can provide useful insight into the mechanisms underlying topological insulators and the fractional quantum Hall effect.
The honeycomb lattice model comprises interacting spin-½ particles arranged on the sites of a honeycomb lattice. It is remarkable that such a simple model can support a rich variety of topological behaviours. For certain values of its couplings, Abelian anyons emerge that behave like the toric code anyons. For another coupling regime, non-Abelian anyons emerge that correspond to the Ising anyons. The latter are manifested as vortex-like configurations of the original spin model that can effectively be described by Majorana fermions. These are fermionic fields that are antiparticles of themselves. They were first introduced in the context of high-energy physics (Majorana, 1937) and have become increasingly important in the analysis of solid state phenomena (Wilczek, 2009).
Topological quantum computation encodes and manipulates information by exclusively employing anyons. To study the computational power of anyons we plan to look into their fusion and braiding properties in a systematic way. This will allow us to identify a Hilbert space, where quantum information can be encoded fault-tolerantly. We also identify unitary evolutions that serve as logical gates. It is an amazing fact that fundamental properties, such as particle statistics, can be employed to perform quantum computation. As we shall see below, the resilience of these intrinsic particle properties against environmental perturbations is responsible for the fault-tolerance of topological quantum computation.
Anyons are physically realised as quasiparticles in topological systems. Most of the quasiparticle details are not relevant for the description of anyons. This provides topological quantum computation with additional resilience against errors in the control of the quasiparticles. In particular, the principles of topological quantum computation are independent of the underlying physical system, so we do not discuss the properties of any particular system in this chapter. This abstraction might create a conceptual vacuum, as many intrinsic properties of the system might appear to be absent. For example, we shall not be concerned with the trapping and transport of anyons or with geometrical characteristics of their evolutions. In this chapter we treat anyons as classical fundamental particles with internal quantum degrees of freedom, much like the spin. Moreover, we assume that we have complete control over the topological system, in terms of initial-state preparation and final-state identification.
To perform topological quantum computation we first need to experimentally realise anyons in a topological system. These systems are characterised by intriguing non-local quantum correlations that give rise to the anyonic statistics. What diagnostic tools do we have to identify whether a given system is topological? Different phases of matter are characterised by their symmetries. This information is captured by order parameters. Usually, order parameters are defined in terms of local operators that can be measured in the laboratory. For example, the magnetisation of a spin system is given as the expectation value of a single spin with respect to the ground state. Such local properties can efficiently describe fascinating physical phenomena, such as ferromagnetism, and can pinpoint quantum phase transitions.
But what about topological systems? Experimentally, we usually identify the topological character of systems, such as the fractional quantum Hall liquids, by probing the anyonic properties of their excitations (Miller et al., 2007). However, topological order should be a characteristic of the ground state (Thouless et al., 1982; Wen, 1995). The natural question arises: is it possible to identify a property of the ground state of a system that implies anyonic excitations? The theoretical background that made it possible to answer this question came from entropic considerations of simple topological models. Hamma et al. (2005) studied the entanglement entropy of the toric code ground state and noticed an unusual behaviour.
The birth of topological quantum computation took place when Alexei Kitaev (2003) made the ingenious step of turning a quantum error correcting code into a many-body interacting system. In particular, he defined a Hamiltonian whose eigenstates are also states of a quantum error correcting code. Beyond the inherited error correcting characteristics, topological systems protect the encoded information with the presence of the Hamiltonian that energetically penalises transformations between states. This opens the door for employing a large variety of many-body effects to combat errors.
Storing or manipulating information with a real physical system is naturally subject to errors. To obtain a reliable outcome from a computation we need to be certain that the processed information remains resilient to errors at all times. To overcome errors we need to detect and correct them. The error detection process is based on an active monitoring of the system and the possibility of identifying errors without destroying the encoded information. Error correction employs the error detection outcome and performs the appropriate steps to correct the detected errors, thus reconstructing the original information.
Classical error correction uses redundancy to spread information in many copies so that errors can be detected, for example by majority voting, and then corrected. Similarly, quantum error correction aims to detect and correct errors of stored quantum information. Quantum states cannot be cloned (Wootters and Zurek, 1982), so the repetition encoding cannot be employed.
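The classical redundancy scheme described above can be sketched in a few lines (an illustration of the idea only; the function names are hypothetical):

```python
# Minimal sketch of classical repetition encoding with majority-vote decoding.
import random

def encode(bit: int, copies: int = 3) -> list[int]:
    """Spread one bit of information into several identical copies."""
    return [bit] * copies

def apply_noise(codeword: list[int], flip_prob: float, rng: random.Random) -> list[int]:
    """Flip each copy independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in codeword]

def decode(codeword: list[int]) -> int:
    """Majority vote: correct as long as fewer than half the copies flipped."""
    return int(sum(codeword) > len(codeword) // 2)

rng = random.Random(0)
noisy = apply_noise(encode(1), flip_prob=0.1, rng=rng)
print(decode(noisy))  # → 1
```

It is precisely the first step, `encode`, that the no-cloning theorem forbids for unknown quantum states, which is why quantum error correction must spread information non-locally across entangled degrees of freedom instead of copying it.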
The study of anyonic systems as computational means has led to the exciting discovery of a new quantum algorithm. This algorithm provides a novel paradigm that fundamentally differs from searching (Grover, 1996) and factoring (Shor, 1997) algorithms. It is based on the particular behaviour of anyons and its goal is to evaluate Jones polynomials (Jones, 1985, 2005). These polynomials are topological invariants of knots and links, i.e., they depend on the global characteristics of their strands and not on their local geometry. The Jones polynomials were first connected to topological quantum field theories by Edward Witten (1989). Since then they have found applications in various areas of research, such as biology for DNA reconstruction (Nechaev, 1996) and statistical physics (Kauffman, 1991).
The best known classical algorithm for the exact evaluation of the Jones polynomial demands exponential resources (Jaeger et al., 1990). Employing anyons requires only polynomial resources to produce an approximate answer to this problem (Freedman et al., 2003b). Evaluating Jones polynomials by manipulating anyons resembles an analogue computer. Indeed, the idea is equivalent to the classical setup, where a wire is wrapped several times around a solenoid that confines magnetic flux. By measuring the current that runs through the wire one can obtain the number of times the wire was wrapped around the solenoid, i.e., their linking number. Similarly, by creating anyons and spanning links with their worldlines we are able to extract the Jones polynomials of these links (Kauffman and Lomonaco, 2006).
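The linking number invoked in the solenoid analogy is itself a computable topological invariant: for two closed curves it is given by the Gauss linking integral, Lk = (1/4π) ∮∮ (r₁ − r₂)·(dr₁ × dr₂)/|r₁ − r₂|³. A numerical sketch (the curve choices and function name are ours, chosen for this illustration) for two circles forming a Hopf link:

```python
# Numerical sketch of the Gauss linking integral for two closed curves.
# The invariant counts how many times one curve wraps around the other,
# independently of the curves' local geometry.
import numpy as np

def linking_number(curve1: np.ndarray, curve2: np.ndarray) -> float:
    """Approximate the Gauss linking integral for two closed polygonal curves."""
    d1 = np.roll(curve1, -1, axis=0) - curve1   # segment vectors of curve 1
    d2 = np.roll(curve2, -1, axis=0) - curve2   # segment vectors of curve 2
    total = 0.0
    for i in range(len(curve1)):
        r = curve1[i] - curve2                  # separations to all points of curve 2
        cross = np.cross(d1[i], d2)             # dr1 x dr2 for all segments of curve 2
        total += np.sum(np.einsum('ij,ij->i', r, cross)
                        / np.linalg.norm(r, axis=1) ** 3)
    return total / (4 * np.pi)

# Two unit circles linked once: one in the xy-plane at the origin,
# one in the xz-plane centred at (1, 0, 0).
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
circle_a = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
circle_b = np.stack([1 + np.cos(t), np.zeros_like(t), np.sin(t)], axis=1)
print(abs(round(linking_number(circle_a, circle_b))))  # → 1
```

Deforming either circle without cutting it through the other leaves the result unchanged, which is exactly the sense in which such quantities depend only on global, not local, characteristics.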
In the previous chapters we introduced anyons and their properties, presented how to perform topological quantum computation and studied several examples of topological models. There is a wide variety of research topics concerned with topological quantum computation. Among the many open questions, two have a singular importance. The first natural question is: which physical systems can support non-Abelian anyons? Realising non-Abelian anyons in the laboratory is of fundamental and practical interest. Such exotic statistical behaviour has not yet been encountered in nature. The physical realisation of non-Abelian anyons would be the first step towards the identification of a technological platform for the realisation of topological quantum computation. The second question concerns the efficiency of topological systems in combating errors. It has been proven that the effect of coherent environmental errors in the form of local Hamiltonian perturbations can be suppressed efficiently without degrading the topologically encoded information (Bravyi et al., 2010). Nevertheless, there is no mechanism that can protect topological order from incoherent probabilistic errors. Topological systems nevertheless constitute a rich and versatile medium that allows imaginative proposals to be developed (Chesi et al., 2010; Hamma et al., 2009).
Regarding the first question, we can identify two main categories of physical proposals for the realisation of two-dimensional topological systems: systems defined on the continuum and discrete systems defined on a lattice. It is natural to ask: which of these architectures are the most promising to realise in the laboratory?
Symmetries play a central role in physics. They dictate what one can change in a physical system without affecting any of its properties. You might have encountered symmetries like translational symmetry, where a system remains unchanged if it is spatially translated by an arbitrary distance. A system with rotational symmetry, similarly, is invariant under rotations. Some symmetries, like the ones mentioned above, give information about the structure of the system. Others have to do with the more fundamental physical framework that we adopt. An example of this is the invariance under Lorentz transformations in relativistic physics.
Other types of symmetries can be even more subtle. For example, it is rather self-evident that physics should remain unchanged if we exchange two identical point-like particles. Nevertheless, this fundamental property that we call statistical symmetry gives rise to rich and beautiful physics. In three spatial dimensions it dictates the existence of bosons and fermions. These are particles with very different quantum mechanical properties. Their wave function acquires a +1 or a -1 phase, respectively, whenever two particles are exchanged. A direct consequence of this is that any number of bosons can occupy the same state. In contrast, fermions can only be stacked together with each particle occupying a different state.
When one considers two spatial dimensions, a wide variety of statistical behaviours is possible. Apart from bosonic and fermionic behaviours, arbitrary phase factors, or even non-trivial unitary evolutions, can be obtained when two particles are exchanged (Leinaas and Myrheim, 1977).
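The exchange behaviours described above can be summarised in a single relation (a sketch in standard notation; the symbols are chosen for this illustration, with ψ the two-particle wave function and θ the statistical phase):

```latex
\psi(\mathbf{r}_2, \mathbf{r}_1) = e^{i\theta}\, \psi(\mathbf{r}_1, \mathbf{r}_2),
\qquad
\theta =
\begin{cases}
0 & \text{bosons (three dimensions)}, \\
\pi & \text{fermions (three dimensions)}, \\
\text{arbitrary} & \text{anyons (two dimensions)}.
\end{cases}
```

The name "anyon" refers precisely to the fact that in two dimensions *any* phase θ, and indeed any unitary acting on a degenerate internal space, is admissible.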
A completely different interpretation of the measurement problem, one which many professional scientists have found attractive if only because of its mathematical elegance, was first suggested by Hugh Everett III in 1957 and is known variously as the ‘relative state’, ‘many-worlds’ or ‘branching-universe’ interpretation. This viewpoint gives no special role to the conscious mind and to this extent the theory is completely objective, but we shall see that many of its other consequences are in their own way just as revolutionary as those discussed in the previous chapter.
The essence of the many-worlds interpretation can be illustrated by again considering the example of the 45° polarised photon approaching the H/V detector. Remember what we demonstrated in Chapters 2 and 4: from the wave point of view a 45° polarised light wave is equivalent to a superposition of a horizontally polarised wave and a vertically polarised wave. If we were able to think purely in terms of waves, the effect of the H/V polariser on the 45° polarised wave would be simply to split the wave into these two components. These would then travel through the H and V channels respectively, half the original intensity being detected in each. In contrast, photons cannot be split, but they can be considered to be in a superposition state until a measurement ‘collapses’ the system into one or other of its possible outcomes.
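The superposition referred to here can be written explicitly. In standard Dirac notation (our notation, not the book's), the 45° polarised state is an equal-weight superposition of the horizontal and vertical states:

```latex
\lvert 45^{\circ} \rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl( \lvert H \rangle + \lvert V \rangle \bigr),
```

and the squared amplitudes, each equal to ½, account for half the original intensity being detected in each channel.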
Quantum physics is the theory that underlies nearly all our current understanding of the physical universe. Since its invention some sixty years ago the scope of quantum theory has expanded to the point where the behaviour of subatomic particles, the properties of the atomic nucleus and the structure and properties of molecules and solids are all successfully described in quantum terms. Yet, ever since its beginning, quantum theory has been haunted by conceptual and philosophical problems which have made it hard to understand and difficult to accept.
As a student of physics some twenty-five years ago, one of the prime fascinations of the subject to me was the great conceptual leap quantum physics required us to make from our conventional ways of thinking about the physical world. As students we puzzled over this, encouraged to some extent by our teachers, who were nevertheless more concerned with training us to apply quantum ideas to the understanding of physical phenomena. At that time it was difficult to find books on the conceptual aspects of the subject, or at least any that discussed the problems in a reasonably accessible way. Some twenty years later, when I had the opportunity of teaching quantum mechanics to undergraduate students, I tried to include some references to the conceptual aspects of the subject and, although there was by then quite an extensive literature, much of this was still rather technical and difficult for the non-specialist.