To the lay reader, much of what is written by scientists can seem barely comprehensible. Even to someone who has had some science courses in school, a sentence like “The M2 proton channel … has a 40-residue region that interacts with membranes consisting of a transmembrane helix (which mediates tetramerization, drug-binding, and channel activity), followed by a basic amphiphilic helix important for budding of the virus from cellular membranes and release (scission)” will seem as though it has been written in a language not quite identical to English (Fiorin, Carnevale, & DeGrado, 2010). As a result, the non-specialist may find much of what is accomplished by scientists mysterious and esoteric.
Philosophers of science have sometimes attempted to “humanize” scientific work by portraying scientific methods as extensions of our ordinary ways of knowing things. It is true that scientists use technologies both material and mathematical to make observations and draw conclusions that we could never achieve otherwise. Nonetheless, they observe, conjecture, infer, and decide just as we all do, if perhaps in a more systematic and sophisticated way. Although a certain kind of training is needed to understand some of the language scientists use to report their findings, those findings are not the result of magic or trickery, but of an especially refined and disciplined application of the cognitive resources we enjoy as a species.
Semidefinite programming (SDP) is an optimization model with vector or matrix variables, where the objective to be minimized is linear, and the constraints involve affine combinations of symmetric matrices that are required to be positive (or negative) semidefinite. SDPs include as special cases LPs, QCQPs, and SOCPs; they are perhaps the most powerful class of convex optimization models with specific structure, for which efficient and well-developed numerical solution algorithms are currently available.
SDPs arise in a wide range of applications. For example, they can be used as sophisticated relaxations (approximations) of non-convex problems, such as Boolean problems with quadratic objective, or rank-constrained problems. They are useful in the context of stability analysis or, more generally, in control design for linear dynamical systems. They are also used, to mention just a few, in geometric problems, in system identification, in algebraic geometry, and in matrix completion problems under sparsity constraints.
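The defining constraint of an SDP is a linear matrix inequality (LMI): an affine combination of symmetric matrices that must remain positive semidefinite. As a minimal sketch of what such a constraint means (not a solver; the matrices F0, F1, F2 below are invented for illustration), one can test feasibility of an LMI numerically by checking the smallest eigenvalue of the combined matrix:

```python
import numpy as np

def is_psd(M, tol=1e-9):
    """Check positive semidefiniteness via the smallest eigenvalue."""
    return np.linalg.eigvalsh(M).min() >= -tol

# Illustrative symmetric data matrices (not from the text)
F0 = np.array([[2.0, 0.0], [0.0, 2.0]])
F1 = np.array([[1.0, 0.0], [0.0, -1.0]])
F2 = np.array([[0.0, 1.0], [1.0, 0.0]])

def lmi_feasible(x):
    """Feasibility of the LMI constraint F0 + x[0]*F1 + x[1]*F2 >= 0."""
    return is_psd(F0 + x[0] * F1 + x[1] * F2)

print(lmi_feasible([0.5, 0.5]))  # feasible point
print(lmi_feasible([3.0, 0.0]))  # infeasible point
```

An actual SDP then minimizes a linear function of x over the set of x for which such an LMI holds; dedicated interior-point solvers handle this efficiently.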
11.1 From linear to conic models
In the late 1980s, researchers were trying to generalize linear programming. At that time, LP was known to be solvable efficiently, in time roughly cubic in the number of variables or constraints. The new interior-point methods for LP had just become available, and their excellent practical performance matched the theoretical complexity bounds. Beyond linear problems, however, it seemed that one encountered a wall.
Optimization refers to a branch of applied mathematics concerned with the minimization or maximization of a certain function, possibly under constraints. The birth of the field can perhaps be traced back to an astronomy problem solved by the young Gauss. It matured later with advances in physics, notably mechanics, where natural phenomena were described as the result of the minimization of certain “energy” functions. Optimization has evolved towards the study and application of algorithms to solve mathematical problems on computers.
Today, the field is at the intersection of many disciplines, ranging from statistics, to dynamical systems and control, complexity theory, and algorithms. It is applied to a widening array of contexts, including machine learning and information retrieval, engineering design, economics, finance, and management. With the advent of massive data sets, optimization is now viewed as a crucial component of the nascent field of data science.
In the last two decades, there has been a renewed interest in the field of optimization and its applications. One of the most exciting developments involves a special kind of optimization, convex optimization. Convex models provide a practical platform on which to build reliable problem-solving software. With the help of user-friendly software packages, modelers can now quickly develop extremely efficient code to solve a very rich library of convex problems.
In this chapter we present a compact selection of numerical algorithms for performing basic matrix computations. Specifically, we describe the power iteration method for computing eigenvalues and eigenvectors of square matrices (along with some of its variants, and a version suitable for computing SVD factors); we discuss iterative algorithms for solving square systems of linear equations, and we detail the construction of the QR factorization for rectangular matrices.
7.1 Computing eigenvalues and eigenvectors
7.1.1 The power iteration method
In this section we outline a technique for computing eigenvalues and eigenvectors of a diagonalizable matrix. The power iteration (PI) method is perhaps the simplest technique for computing one eigenvalue/eigenvector pair for a matrix. It has rather slow convergence and it is subject to some limitations. However, we present it here since it forms the building block of many other more refined algorithms for eigenvalue computation, such as the Hessenberg QR algorithm, and also because interest in the PI method has been recently revived by applications to very large-scale matrices, such as the ones arising in web-related problems (e.g., Google PageRank). Many other techniques exist for computing eigenvalues and eigenvectors, some of them tailored for matrices with special structure, such as sparse, banded, or symmetric. Such algorithms are described in standard texts on numerical linear algebra.
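As a concrete illustration, the basic power iteration can be sketched in a few lines of NumPy (the test matrix, starting vector, and tolerance below are illustrative choices, not from the text): repeatedly multiply a vector by the matrix and normalize, so that the component along the dominant eigenvector comes to dominate.

```python
import numpy as np

def power_iteration(A, num_iters=500, tol=1e-10):
    """Estimate the dominant eigenvalue/eigenvector pair of a square matrix A."""
    n = A.shape[0]
    x = np.ones(n) / np.sqrt(n)  # arbitrary (non-orthogonal) starting vector
    lam = 0.0
    for _ in range(num_iters):
        y = A @ x
        x = y / np.linalg.norm(y)      # normalize to avoid overflow/underflow
        lam_new = x @ A @ x            # Rayleigh quotient estimate
        if abs(lam_new - lam) < tol:   # stop when the estimate stabilizes
            return lam_new, x
        lam = lam_new
    return lam, x

# Small symmetric example: eigenvalues of [[2,1],[1,3]] are (5 ± sqrt(5))/2
A = np.array([[2.0, 1.0], [1.0, 3.0]])
lam, v = power_iteration(A)
print(lam)  # approx 3.618, the dominant eigenvalue
```

Convergence is geometric, at a rate governed by the ratio of the second-largest to the largest eigenvalue magnitude, which is why the method can be slow when these are close.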
Optimization is a technology that can be used to devise effective decisions or predictions in a variety of contexts, ranging from production planning to engineering design and finance, to mention just a few. In simplified terms, the process of reaching a decision starts with the construction of a suitable mathematical model for a concrete problem, followed by a phase in which the model is solved by means of suitable numerical algorithms. An optimization model typically requires the specification of a quantitative objective criterion of goodness for our decision, which we wish to maximize (or, alternatively, a criterion of cost, which we wish to minimize), as well as the specification of constraints, representing the physical limits of our decision actions, budgets on resources, design requirements that need to be met, etc. An optimal design is one which gives the best possible objective value while satisfying all problem constraints.
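The objective-plus-constraints structure can be made concrete with a toy production-planning model (all numbers below are invented for illustration), solved here with SciPy's linear-programming routine:

```python
from scipy.optimize import linprog

# Toy model: choose production levels x1, x2 to maximize profit 3*x1 + 5*x2.
# linprog minimizes, so we negate the objective coefficients.
c = [-3.0, -5.0]
A_ub = [[1.0, 0.0],   # x1 <= 4          (capacity of line 1)
        [0.0, 2.0],   # 2*x2 <= 12       (capacity of line 2)
        [3.0, 2.0]]   # 3*x1 + 2*x2 <= 18 (shared resource budget)
b_ub = [4.0, 12.0, 18.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)     # optimal plan: x1 = 2, x2 = 6
print(-res.fun)  # optimal profit: 36
```

The objective encodes the criterion of goodness, the inequality rows encode resource budgets, and the bounds encode the physical requirement that production be nonnegative.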
In this chapter, we provide an overview of the main concepts and building blocks of an optimization problem, along with a brief historical perspective of the field. Many concepts in this chapter are introduced without formal definition; more rigorous formalizations are provided in the subsequent chapters.
1.1 Motivating examples
We next describe a few simple but practical examples where optimization problems arise naturally. Many other more sophisticated examples and applications will be discussed throughout the book.