This chapter covers applications of quantum computing in the area of continuous optimization, including both convex and nonconvex optimization. We discuss quantum algorithms for computing Nash equilibria of zero-sum games and for solving linear, second-order cone, and semidefinite programs. These algorithms are based on quantum implementations of the multiplicative weights update method or interior point methods. We also discuss general quantum algorithms for convex optimization, which can provide a speedup in cases where the objective function is much easier to evaluate than its gradient. Finally, we cover quantum algorithms for escaping saddle points and finding local minima in nonconvex optimization problems.
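As a concrete, purely classical illustration of the multiplicative weights update method mentioned above, the sketch below approximates the equilibrium strategies of a zero-sum matrix game; the quantum algorithms in this chapter accelerate loops broadly of this shape. The function and parameter names here are illustrative and not taken from the text.

import numpy as np

def mwu_zero_sum(A, rounds=2000, eta=0.1):
    # Approximate a Nash equilibrium of the zero-sum game with payoff matrix A
    # (row player receives A[i, j]) using the multiplicative weights update method.
    n, m = A.shape
    w = np.ones(n)                 # row player's weights over pure strategies
    x_avg = np.zeros(n)            # running sum of row mixed strategies
    y_avg = np.zeros(m)            # running count of column best responses
    for _ in range(rounds):
        x = w / w.sum()            # current mixed strategy of the row player
        j = int(np.argmin(x @ A))  # column player best-responds (minimizes row payoff)
        y_avg[j] += 1.0
        w *= np.exp(eta * A[:, j]) # reward rows that do well against column j
        w /= w.sum()               # renormalize to keep weights bounded
        x_avg += x
    return x_avg / rounds, y_avg / rounds

# Matching pennies: equilibrium is uniform for both players, game value 0.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
x, y = mwu_zero_sum(A)
print(x, y, x @ A @ y)             # both strategies near [0.5, 0.5]; value near 0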
This chapter covers quantum interior point methods, which are quantum algorithmic primitives for application to convex optimization problems, particularly linear, second-order cone, and semidefinite programs. Interior point methods are a successful classical iterative technique that solves a linear system of equations at each iteration. Quantum interior point methods replace this step with a quantum linear system solver combined with quantum tomography, potentially offering a polynomial speedup.
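For orientation, and in notation that is standard for interior point methods rather than necessarily the chapter's own: for a linear program \(\min_x c^{\mathsf T} x\) subject to \(Ax = b,\ x \ge 0\), each primal–dual iteration solves a Newton system of the form
\[
\begin{pmatrix} 0 & A^{\mathsf T} & I \\ A & 0 & 0 \\ S & 0 & X \end{pmatrix}
\begin{pmatrix} \Delta x \\ \Delta y \\ \Delta s \end{pmatrix}
=
\begin{pmatrix} c - A^{\mathsf T} y - s \\ b - Ax \\ \sigma \mu \mathbf{1} - XS\mathbf{1} \end{pmatrix},
\]
where \(X = \operatorname{diag}(x)\), \(S = \operatorname{diag}(s)\), \(\mu = x^{\mathsf T} s / n\), and \(\sigma \in (0,1)\) is a centering parameter. A quantum interior point method prepares an approximate solution of this system with a quantum linear system solver and extracts the step via tomography.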
One of the aims of this book is to show how the evolution of the data industry and the organisations that use data in all its forms has facilitated the current wave of innovation in AI. The two sectors are inextricably linked, with data acting as the fuel powering the rapidly developing plethora of AI services.
In Chapter 1, we saw how the first transcribing and storing of information had its roots in commerce, with technology evolving to better manage and share data within and between organisations. More recently, data-driven innovation has resulted in the creation of some of the world's largest and most profitable companies. Over the second half of the 20th century, AI moved from universities and research centres into commercial applications, and most recently the imaginations of businesses and the public have been captured by GenAI as a tangible and usable product. We also examined some of the research and frameworks that explain the drivers of innovation, with a focus on the impact of digital platforms. Finally, we looked at how the recent wave of data-powered platforms and concerns about the applications of AI are shaping political and legal initiatives around the world.
These developments are all in a state of flux, and it will take several years before the economic and social implications of this data-driven AI revolution are fully understood. This chapter considers how this might play out in the medium term: how AI technologies will evolve, what this means for the data sector and for businesses, and what the broader social and economic impacts may be.
Predicting the future is fraught with danger and, for the most part, such predictions turn out to be inaccurate. However, by looking at recent precedents in how new technologies have diffused, at the unchanging realities of business and the need to generate profits, and at the current state of AI technologies, we can make some broad assumptions in this space.
Books on vehicle attitude and motion often use tensors in their analyses, and I have discussed the reasons for that in a previous chapter. But tensors also carry an esotericism arising from their use in quantifying the curved spacetime of general relativity. And so I end the book by telling the inquisitive reader how tensors ‘work’ more generally, and how this more advanced topic makes quick work of calculating the gradient, divergence, Laplacian, and curl of vector calculus. I finish with a discussion of parallel transport, which has found its way into the exotic ‘wander azimuth’ axes used in some navigation systems.
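As a small taste of the ‘quick work’ referred to above, written in standard tensor notation (which may differ in detail from the book's): in coordinates with metric \(g_{ij}\), inverse \(g^{ij}\), and determinant \(g\), the Laplacian of a scalar \(f\) is
\[
\nabla^2 f = \frac{1}{\sqrt{g}}\,\partial_i\!\left(\sqrt{g}\; g^{ij}\,\partial_j f\right),
\]
with summation over repeated indices; specialising \(g_{ij}\) to spherical coordinates reproduces the familiar vector-calculus expression.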
This chapter covers the quantum algorithmic primitive called Gibbs sampling. Gibbs sampling accomplishes the task of preparing a digital representation of the thermal state, also known as the Gibbs state, of a quantum system in thermal equilibrium. Gibbs sampling is an important ingredient in quantum algorithms to simulate physical systems. We cover multiple approaches to Gibbs sampling, including algorithms that are analogues of classical Markov chain Monte Carlo algorithms.
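For concreteness, the Gibbs (thermal) state of a system with Hamiltonian \(H\) at inverse temperature \(\beta = 1/(k_B T)\) is
\[
\rho_\beta = \frac{e^{-\beta H}}{\operatorname{Tr}\, e^{-\beta H}},
\]
and Gibbs sampling is the task of preparing (an approximation to) \(\rho_\beta\) on a quantum computer.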
I derive the important equation that relates the time derivative of a vector computed in one frame to that computed in another frame. I make the point that we must understand the distinction between frames and coordinates to appreciate what the equations are saying. That discussion leads naturally to the concept of centrifugal and Coriolis forces in rotating frames. I use the frame-dependent time derivative to derive some equations for robotics, and finish with a wider discussion of the time derivative for tensors and in fluid flow.
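The central relation has the familiar form (in notation that may differ from the book's): for any vector \(\mathbf{A}\) and frames related by angular velocity \(\boldsymbol{\omega}\),
\[
\left(\frac{d\mathbf{A}}{dt}\right)_{\text{inertial}}
= \left(\frac{d\mathbf{A}}{dt}\right)_{\text{rotating}}
+ \boldsymbol{\omega} \times \mathbf{A}.
\]
Applying this twice to a position vector and moving the extra terms to the force side of Newton's second law yields the centrifugal term \(-m\,\boldsymbol{\omega} \times (\boldsymbol{\omega} \times \mathbf{r})\) and the Coriolis term \(-2m\,\boldsymbol{\omega} \times \mathbf{v}_{\text{rot}}\), plus an Euler term when \(\boldsymbol{\omega}\) varies in time.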
We are entering a new phase in the information revolution, driven by the introduction of new artificial intelligence (AI) technologies and how they are being used to transform the data amassed within organisations over the previous 30 or so years. This revolution started hundreds of years ago, when moveable type was used to print books at a scale and speed not previously possible. The rise of mass media and then mass communications in the form of telecommunication networks, coupled with the data processing capabilities of computers, powered the next phases. Thirty years ago, the expansion of the internet and the widespread adoption of the World Wide Web (WWW) as a means to publish and share information brought the digital revolution into our homes and businesses. Mobile computing has put powerful, always-connected computers in the pockets of most people in the industrialised world, and social networks have provided platforms for individuals to reach billions of others with their thoughts and ideas.
A result of these innovations and the ways they have been used is a world awash with information, most of which sits unused, unstructured and hidden away in archives and data silos within the organisations, public and private, that created it (Lange, 2023). Despite significant advances in information management techniques and technologies, only a fraction of the value of this data is being realised. However, we are now at an inflection point where much of the infrastructure is in place to begin the process of changing that.
This chapter covers applications of quantum computing in the area of nuclear and particle physics. We cover algorithms for simulating quantum field theories, where end-to-end problems include computing fundamental physical quantities and scattering cross sections. We also discuss simulations of nuclear physics, which encompasses individual nuclei as well as dense nucleonic matter such as neutron stars.
This chapter starts by showing that the DCM is a rotation matrix, and vice versa. I introduce Euler matrices as important examples of rotation matrices. I give examples of extracting angle–axis information from a DCM. The chapter also includes a study of what tensors are and their role in this subject.
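As an example of the angle–axis extraction (standard conventions; the book's sign conventions may differ): for a rotation matrix \(R\) describing a rotation by angle \(\theta\) about unit axis \(\mathbf{n}\),
\[
\cos\theta = \frac{\operatorname{tr} R - 1}{2}, \qquad
[\mathbf{n}]_\times = \frac{R - R^{\mathsf T}}{2\sin\theta} \quad (\sin\theta \ne 0),
\]
where \([\mathbf{n}]_\times\) is the skew-symmetric matrix whose action is \([\mathbf{n}]_\times \mathbf{v} = \mathbf{n} \times \mathbf{v}\).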
This chapter covers the quantum Fourier transform, which is an essential quantum algorithmic primitive that efficiently applies a discrete Fourier transform to the amplitudes of a quantum state. It features prominently in quantum phase estimation and Shor’s algorithm for factoring and computing discrete logarithms.
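Explicitly, on \(n\) qubits with \(N = 2^n\), the quantum Fourier transform acts on computational basis states as
\[
|j\rangle \;\longmapsto\; \frac{1}{\sqrt{N}} \sum_{k=0}^{N-1} e^{2\pi i jk/N}\, |k\rangle,
\]
and can be implemented with \(O(n^2)\) Hadamard and controlled-phase gates, that is, polylogarithmically in \(N\), whereas the classical fast Fourier transform on a length-\(N\) vector costs \(O(N \log N)\).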
This chapter covers applications of quantum computing relevant to the financial services industry. We discuss quantum algorithms for the portfolio optimization problem, where one aims to choose a portfolio that maximizes expected return while minimizing risk. This problem can be formulated in several ways, and quantum solutions leverage methods for combinatorial or continuous optimization. We also discuss quantum algorithms for estimating the fair price of options and other derivatives, which are based on a quantum acceleration of Monte Carlo methods.
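One standard continuous formulation of the portfolio problem (illustrative; the chapter considers several variants) is mean–variance optimization:
\[
\max_{w \in \mathbb{R}^n} \;\; \mu^{\mathsf T} w - \frac{\lambda}{2}\, w^{\mathsf T} \Sigma w
\quad \text{subject to} \quad \mathbf{1}^{\mathsf T} w = 1,
\]
where \(\mu\) is the vector of expected returns, \(\Sigma\) the return covariance matrix, and \(\lambda\) a risk-aversion parameter; adding position limits or integrality constraints gives the combinatorial variants. For derivative pricing, the relevant speedup is that quantum amplitude estimation reaches additive error \(\varepsilon\) in an expectation value with roughly \(O(1/\varepsilon)\) quantum samples, versus \(O(1/\varepsilon^2)\) classical Monte Carlo samples.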
I introduce an important way to think about and construct a DCM: by implementing a yaw–pitch–roll sequence of rotations on a model aircraft. This does away with the widespread but rather involved method of describing the relative orientation of two axis sets by drawing them with a common origin. For this, we must distinguish the idea of a rotation in a sequence being about either a ‘space-fixed’ axis or a ‘carried-along’ axis. Users of these terms tend to fall into two groups, ‘active’ and ‘passive’. I state the ‘fundamental theorem of rotation sequences’, which removes any need for the reader to stand in one group or the other. I also discuss the extraction of Euler angles from a DCM, and examine infinitesimal rotations. I discuss two methods of interpolating from an initial to a final orientation; one of these is used widely in computer graphics, but both methods must be discussed for the computer-graphics method to be understood. I end with a calculation of the position and attitude of a robot arm.
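For orientation, the yaw–pitch–roll construction amounts to a product of three single-axis rotation matrices, written here in one common aerospace (3–2–1) convention; whether the product is read as carried-along or space-fixed rotations is precisely the convention question the chapter's ‘fundamental theorem’ resolves:
\[
R = R_z(\psi)\, R_y(\theta)\, R_x(\phi), \qquad
R_z(\psi) = \begin{pmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{pmatrix},
\]
with \(R_y(\theta)\) and \(R_x(\phi)\) the analogous rotations about the \(y\) and \(x\) axes.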
This chapter covers the quantum algorithmic primitives of amplitude amplification and amplitude estimation. Amplitude amplification is a generalization of Grover’s quantum algorithm for the unstructured search problem. Amplitude estimation can be understood in a similar framework, where it utilizes quantum phase estimation to estimate the value of the amplitude or probability associated with a quantum state. Both amplitude amplification and amplitude estimation provide a quadratic speedup over their classical counterparts, and feature prominently as an ingredient in many end-to-end algorithms.
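Quantitatively (in standard notation, not necessarily the book's): if an algorithm \(\mathcal{A}\) prepares
\[
\mathcal{A}\,|0\rangle = \sin\theta\,|\text{good}\rangle + \cos\theta\,|\text{bad}\rangle, \qquad p = \sin^2\theta,
\]
then amplitude amplification finds a ‘good’ outcome using \(O(1/\sqrt{p})\) applications of \(\mathcal{A}\) and its inverse, versus \(O(1/p)\) classical repetitions, and amplitude estimation returns \(p\) to additive error \(\varepsilon\) with \(O(1/\varepsilon)\) applications, versus \(O(1/\varepsilon^2)\) classical samples.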
This chapter covers applications of quantum computing in the area of quantum chemistry, where the goal is to predict the physical properties and behaviors of atoms, molecules, and materials. We discuss algorithms for simulating electrons in molecules and materials, including both static properties such as ground state energies and dynamic properties. We also discuss algorithms for simulating static and dynamic aspects of vibrations in molecules and materials.
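The underlying object in the electronic-structure problem is the second-quantized Hamiltonian (index conventions for the two-electron term vary between texts):
\[
H = \sum_{pq} h_{pq}\, a_p^\dagger a_q
+ \frac{1}{2} \sum_{pqrs} h_{pqrs}\, a_p^\dagger a_q^\dagger a_r a_s,
\]
where \(h_{pq}\) and \(h_{pqrs}\) are one- and two-electron integrals in a chosen orbital basis and \(a_p^\dagger, a_p\) are fermionic creation and annihilation operators; ground state energy estimation then asks for the lowest eigenvalue of \(H\).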