Some modern implementations of vector concepts rely heavily on precise knowledge of time. Measurements of time, both ancient and modern, have always been closely tied to Earth’s rotation, and so this rotation must be described in detail. I begin that task by describing Earth’s orientation relative to the solar system and the stars, and use a DCM to quantify Earth’s orientation at a given moment. This introduces the idea of Universal Time, UT1. Further concepts require a short discussion of relativity, both special and general, which I do by using a balloon to describe curved spacetime. The result is UTC, our modern ‘Greenwich Mean Time’. Measuring time over long periods is made easy through the concept of the Julian day, and so I discuss the Julian and Gregorian calendars. I include a detailed example of using these ideas to calculate the sight direction of a star at some time and place on Earth.
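As a concrete illustration of the Julian-day idea, here is a minimal Python sketch using the standard Fliegel–Van Flandern integer formula for converting a Gregorian calendar date to a Julian day number; the formula is textbook-standard, but the function name and the exact form shown are illustrative, not necessarily the chapter’s.

def julian_day_number(year: int, month: int, day: int) -> int:
    """Julian day number at 12:00 UT on the given Gregorian date
    (standard Fliegel-Van Flandern integer-arithmetic form)."""
    a = (14 - month) // 12          # 1 for January/February, else 0
    y = year + 4800 - a             # shifted year, counted from -4800
    m = month + 12 * a - 3          # month index with March = 0
    return (day + (153 * m + 2) // 5 + 365 * y
            + y // 4 - y // 100 + y // 400 - 32045)

# The J2000.0 reference epoch, noon UT on 1 January 2000, is JD 2451545.
assert julian_day_number(2000, 1, 1) == 2451545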
In the Preface, we motivate the book by discussing the history of quantum computing and the development of the field of quantum algorithms over the past several decades. We argue that the present moment calls for adopting an end-to-end lens in how we study quantum algorithms, and we discuss the contents of the book and how to use it.
This chapter explores the rapidly developing policies of national and international governmental bodies relating to data and AI, the resulting legislation now emerging, and the ethical concerns leading to these initiatives. As policies, laws and rulings evolve to deal with the new challenges posed by AI and the data it feeds off, this chapter presents a snapshot of the legal situation in late 2024; however, the issues underpinning policy and legislative changes are ongoing. These centre on concerns with privacy and data protection, copyright, monopolistic practices and trust, as well as efforts to stimulate economic growth and international competitiveness. The chapter focuses on developments in the UK and the US as well as across the EU, which allows policies and legislation to be compared and highlights different national concerns and priorities.
Ethical concerns
Scientific breakthroughs and new technologies have long been a source of alarm for philosophers, writers and politicians. This is particularly true where questions arise about what it is to be human and how far machines can replace human activities. Mary Shelley's 1818 creation of the Frankenstein monster highlighted concerns emerging from the Enlightenment and the move from a society based on superstition to one more founded on the principles of science (Gunkel, 2024, 3). Science fiction novels and films – from Metropolis in 1927 and Brave New World in 1932 to 2001: A Space Odyssey in 1968 and Blade Runner in 1982 – developed these ideas in different and unsettling ways. While public policies and laws change fairly frequently in response to events and differing priorities, the values and ethics that underpin them are more constant.
The previous chapter described Earth’s orientation. I now build on that to construct orbital theory with a greater emphasis on vectors and coordinates than is traditional in that subject. I use Euler angles, rotation sequences, and the theory constructed around these in previous chapters to simplify what can often be a confusing barrage of notation in orbital theory. I include two very detailed examples here: sighting an Earth satellite and sighting Jupiter.
Rigid-body dynamics uses vectors heavily, and in particular the angular velocity vector described in a previous chapter. I derive the main quantities and results of the subject: angular momentum, moment of inertia, torque, and the relevant conservation laws. Examples include the spinning top and the precessing bicycle wheel. I also provide a detailed calculation of Earth’s precession period arising from the gravity of the Sun and Moon.
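For orientation, the textbook result behind the spinning-top example can be stated compactly; the symbols below are generic and not necessarily the chapter’s notation:

\[
\boldsymbol{\tau} = \frac{d\mathbf{L}}{dt}, \qquad
\mathbf{L} \approx I\,\omega_s\,\hat{\mathbf{e}}_s, \qquad
\tau = m g l \sin\theta
\;\Longrightarrow\;
\Omega_{\mathrm{prec}} \approx \frac{m g l}{I\,\omega_s},
\]

where m is the top’s mass, l the distance from pivot to centre of mass, I the spin-axis moment of inertia, \omega_s the spin rate and \theta the tilt from the vertical; in this fast-spin approximation the precession rate is independent of \theta.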
Vehicle attitude is typically quantified by a DCM, a quaternion, or a triplet of Euler angles. I discuss how each of these objects changes as the attitude evolves by deriving the well-known time derivative of each. That requires the concept of angular velocity, which I discuss in detail. I end the chapter by describing why time derivatives of Euler angles cause so much confusion for many practitioners.
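As a minimal numerical sketch of the DCM kinematics just mentioned: under one common convention, in which C maps inertial coordinates to body coordinates and ω is the body’s angular velocity relative to the inertial frame expressed in body axes, the DCM obeys dC/dt = −[ω×]C (the sign flips under the opposite convention). The Python below, with illustrative names, simply integrates that equation:

import numpy as np

def skew(w):
    """Skew-symmetric matrix [w x], so that skew(w) @ v equals np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def propagate_dcm(C, omega_body, dt):
    """One Euler step of dC/dt = -[omega x] C, for C mapping inertial to body
    coordinates (one common convention; other conventions change the sign)."""
    C_next = C + dt * (-skew(omega_body) @ C)
    # Re-orthonormalise so integration error does not destroy the rotation property.
    u, _, vt = np.linalg.svd(C_next)
    return u @ vt

C = np.eye(3)                      # body axes initially aligned with inertial axes
omega = np.array([0.0, 0.0, 0.1])  # rad/s about the body z-axis
for _ in range(1000):
    C = propagate_dcm(C, omega, 0.01)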
This chapter covers the quantum adiabatic algorithm, a quantum algorithmic primitive for preparing the ground state of a Hamiltonian. The quantum adiabatic algorithm is a prominent ingredient in quantum algorithms for end-to-end problems in combinatorial optimization and simulation of physical systems. For example, it can be used to prepare the electronic ground state of a molecule, which is used as an input to quantum phase estimation to estimate the ground state energy.
This chapter covers quantum linear system solvers, which are quantum algorithmic primitives for solving a linear system of equations. The linear system problem is encountered in many real-world situations, and quantum linear system solvers are a prominent ingredient in quantum algorithms in the areas of machine learning and continuous optimization. Quantum linear system solvers do not themselves solve end-to-end problems because their output is a quantum state, which is one of their major caveats.
The title of this chapter may be provocative to some, as it suggests that radical change driven by data and AI is leading a revolution in how organisations will operate in the near future. Caution over analysts and pundits calling technological advances ‘revolutions’ is wise (as we shall see in Chapter 5), and there is no shortage of examples where irrational exuberance has got the better of writers and forecasters. As the management guru Peter Drucker is quoted as saying, ‘Trying to predict the future is like trying to drive down a country road at night with no lights while looking out the back window’ (Guber, 2014). This is particularly true when trying to forecast how new technologies will evolve and whether they will succeed in the marketplace. Established technology vendors may have a vested interest in playing down the significance of a new innovation developed outside their own company, while businesses launching new products may go in the opposite direction.
Over-confidence in the existing order was demonstrated in 1876 when the president of the telegraph giant, Western Union, was offered the patent on the telephone for US$100,000 and stated, ‘This “telephone” has too many shortcomings to be seriously considered as a means of communication’ (Yeuh, 2019). Less than 30 years later, his company was bought by telephony ‘upstart’ AT&T. On the other side, we have WeWork, once valued at US$65 billion by Goldman Sachs for its innovative use of technology to revolutionise office letting, but sold for approximately one hundredth of that figure a few years later (Neate, 2019; Sherman, 2024). Hubris and a confused business model were to blame there.
While acknowledging the difficulties of predicting technology adoption curves, the contention of this book is that we are at the start of a revolution in business practices driven by data and AI. Chapter 5 will explore how this may take shape over the coming five to ten years. This chapter examines a number of current developments and how organisations are already experimenting with and deploying data-driven AI services.
I start by invoking the ‘fundamental rule of calculus notation’ to ensure the correct translation of an English sentence into the language of calculus. As examples, I derive the ‘rocket equation’ and the standard expression for the gravitational potential of a sphere. I discuss the importance of treating units properly. I make the important point that a frame is not the same as a system of coordinates. I distinguish between ‘proper vectors’ and ‘coordinate vectors’, a distinction needed for a correct understanding of transforming coordinates. Because the study of vehicle attitude is built on basis vectors, I show how to construct these from both an intuitive viewpoint and a purely mathematical viewpoint.
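For reference, the two results named above in their standard forms; the symbols are generic, not necessarily the chapter’s notation or its derivation route:

\[
m\,\frac{dv}{dt} = -\,v_e\,\frac{dm}{dt}
\;\Longrightarrow\;
\Delta v = v_e \ln\frac{m_0}{m_f},
\qquad
\Phi(r) = -\frac{GM}{r} \quad (r \ge R),
\]

where v_e is the exhaust speed relative to the rocket, m_0 and m_f are the initial and final masses, and the second expression is the gravitational potential outside a spherically symmetric body of mass M and radius R.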
This chapter presents an introduction to the theory of quantum fault tolerance and quantum error correction, which provide a collection of techniques to deal with imperfect operations and unavoidable noise afflicting the physical hardware, at the expense of moderately increased resource overheads.
This chapter covers the quantum algorithmic primitive called quantum gradient estimation, where the goal is to output an estimate for the gradient of a multivariate function. This primitive features in other primitives, for example, quantum tomography. It also features in several quantum algorithms for end-to-end problems in continuous optimization, finance, and machine learning, among other areas. The size of the speedup it provides depends on how the algorithm can access the function, and how difficult the gradient is to estimate classically.
Unit basis vectors emerged from Hamilton’s quaternions, and quite literally form the basis of rotation and attitude. I begin with their role in the dot product, and then study the matrix determinant. This determines the handedness of any three vectors, which is necessary for building a right-handed Cartesian coordinate system. That idea naturally gives rise to the cross product, which I study in some detail, including in higher dimensions. The chapter ends with comments on matrix multiplication, and in particular the fast multiplication of sparse 3×3 matrices that we use frequently later in the book.
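A minimal numpy sketch of the determinant-as-handedness idea (the function name is illustrative): three vectors form a right-handed set exactly when the determinant of the matrix with those vectors as columns, equivalently the scalar triple product a · (b × c), is positive.

import numpy as np

def is_right_handed(a, b, c) -> bool:
    """True when a, b, c form a right-handed set, i.e. det([a b c]) = a . (b x c) > 0."""
    return float(np.dot(a, np.cross(b, c))) > 0.0

e1, e2, e3 = np.eye(3)
print(is_right_handed(e1, e2, e3))   # True: the standard Cartesian basis
print(is_right_handed(e2, e1, e3))   # False: swapping two vectors flips the handedness
print(np.cross(e1, e2))              # [0. 0. 1.]: e1 x e2 = e3 in a right-handed basis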
This chapter covers quantum algorithms for numerically solving differential equations and the areas of application where such capabilities might be useful, such as computational fluid dynamics, semiconductor chip design, and many engineering workflows. We focus mainly on algorithms for linear differential equations (covering both partial and ordinary linear differential equations), but we also mention the additional nuances that arise for nonlinear differential equations. We discuss important caveats related to both the data input and output aspects of an end-to-end differential equation solver, and we place these quantum methods in the context of existing classical methods currently in use for these problems.
This chapter covers the quantum algorithmic primitive of approximate tensor network contraction. Tensor networks are a powerful classical method for representing complex classical data as a network of individual tensor objects. To evaluate the tensor network, it must be contracted, which can be computationally challenging. A quantum algorithm for approximate tensor network contraction can provide a quantum speedup for contracting tensor networks that satisfy certain conditions.