In this chapter, we introduce one of the most important computational tools in linear algebra: the determinant. First, we discuss some motivational examples. Next we present the definition and basic properties of determinants. Then we study some applications of determinants, including the determinant characterization of an invertible matrix or mapping, Cramer’s rule for solving a nonhomogeneous system of linear equations, and a proof of the Cayley–Hamilton theorem.
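As a minimal numerical sketch of Cramer’s rule (not taken from the text), the following NumPy snippet solves a hypothetical 3×3 nonhomogeneous system by replacing each column of the coefficient matrix with the right-hand side and taking ratios of determinants.

```python
import numpy as np

def cramer_solve(A, b):
    """Solve A x = b by Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with its i-th column replaced by b."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    d = np.linalg.det(A)
    if np.isclose(d, 0.0):
        raise ValueError("Cramer's rule requires an invertible coefficient matrix.")
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b           # replace the i-th column with the right-hand side
        x[i] = np.linalg.det(Ai) / d
    return x

# A hypothetical 3x3 nonhomogeneous system
A = np.array([[2.0, 1.0, -1.0],
              [1.0, 3.0,  2.0],
              [1.0, 0.0,  1.0]])
b = np.array([3.0, 13.0, 4.0])
print(cramer_solve(A, b))      # agrees with np.linalg.solve(A, b)
```

In practice Gaussian elimination is preferred for larger systems; the snippet only illustrates the formula itself.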
The content of this chapter may serve as yet another supplemental topic to meet needs and interests beyond those of a usual course curriculum. Here we shall present an oversimplified, but hopefully totally transparent, description of some of the fundamental ideas and concepts of quantum mechanics, using a purely linear-algebraic formalism.
In this chapter, we consider vector spaces over the field of real or complex numbers. We shall start from the most general situation of scalar products. We then consider the situations in which scalar products are nondegenerate and positive definite, respectively.
In this chapter, we present an introduction to an important area of contemporary quantum physics: quantum information and quantum entanglement. After a brief introduction regarding why and how linear algebra is so useful in this area, we first consider the concepts of quantum bits and quantum gates in quantum information theory. We next explore some geometric features of quantum bits and quantum gates. Then we study the phenomenon of quantum entanglement. In particular, we shall clarify the notions of untangled and entangled quantum states and establish a necessary and sufficient condition that characterizes and separates these two categories of quantum states. Finally, we present Bell’s theorem, which is of central importance to the mathematical foundations of quantum mechanics and implies that quantum mechanics is nonlocal.
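A minimal numerical sketch of the untangled/entangled dichotomy for two-qubit pure states, assuming the standard criterion that a state with 2×2 coefficient matrix C (for ψ = Σ c_ij |i⟩|j⟩) is a product state exactly when det C = 0; the example states below are illustrative, not from the text.

```python
import numpy as np

def is_product_state(c, tol=1e-12):
    """A two-qubit pure state sum_ij c[i,j] |i>|j> is unentangled (a product
    state) if and only if its 2x2 coefficient matrix has determinant zero."""
    return abs(np.linalg.det(c)) < tol

# |00>: a product (untangled) state
product = np.array([[1.0, 0.0],
                    [0.0, 0.0]])

# (|00> + |11>)/sqrt(2): a maximally entangled (Bell) state
bell = np.array([[1.0, 0.0],
                 [0.0, 1.0]]) / np.sqrt(2)

print(is_product_state(product))  # True
print(is_product_state(bell))     # False
```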
In this chapter, we exclusively consider vector spaces over the field of reals unless otherwise stated. First, we present a general discussion on bilinear and quadratic forms and their matrix representations. We also show how a symmetric bilinear form may be uniquely represented by a self-adjoint mapping. Then we establish the main spectrum theorem for self-adjoint mappings, based on a proof of the existence of an eigenvalue using calculus. Next we focus on characterizing the positive definiteness of self-adjoint mappings. After that, we study the commutativity of self-adjoint mappings. As applications, we show the effectiveness of using self-adjoint mappings in computing the norm of a mapping between different spaces and in the formalism of least squares approximations.
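A small numerical sketch of the spectrum theorem and of the positive definiteness criterion, using NumPy and a hypothetical symmetric matrix: a real symmetric matrix is orthogonally diagonalizable with real eigenvalues, and it is positive definite exactly when all eigenvalues are positive.

```python
import numpy as np

# A hypothetical real symmetric matrix
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is tailored to symmetric/Hermitian matrices: real eigenvalues and
# orthonormal eigenvectors (the columns of Q)
eigenvalues, Q = np.linalg.eigh(A)

# Orthogonal diagonalization: A = Q diag(lambda) Q^T with Q orthogonal
assert np.allclose(Q @ np.diag(eigenvalues) @ Q.T, A)
assert np.allclose(Q.T @ Q, np.eye(3))

# Positive definiteness <=> all eigenvalues are positive
print(eigenvalues, np.all(eigenvalues > 0))
```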
In this chapter, we extend our study of linear algebraic structures to multilinear ones that have broad and profound applications beyond those covered by linear structures. First, we give some remarks on the rich applications of multilinear algebra and consider multilinear forms in a general setting, as a starting point that directly generalizes the bilinear forms already studied. Next, we specialize our discussion to consider tensors and their classifications. Then, we elaborate on symmetric and antisymmetric tensors and investigate their properties and characterizations. Finally, we discuss exterior algebras and the Hodge dual correspondence.
In this chapter, we extend our study of real quadratic forms and self-adjoint mappings to the complex situation. We begin with a discussion of the complex version of bilinear forms and the Hermitian structures. We relate the Hermitian structure of a bilinear form to its representation by a unique self-adjoint mapping. Then we establish the main spectrum theorem for self-adjoint mappings. Next we focus again on the positive definiteness of self-adjoint mappings. We explore the commutativity of self-adjoint mappings and apply it to obtain the main spectrum theorem for normal mappings. We also show how to use self-adjoint mappings to study a mapping between two spaces.
In this chapter, we present two important and related problems in data analysis: the low-rank approximation problem and principal component analysis (PCA), both based on the singular value decomposition (SVD). First, we consider the low-rank approximation problem for mappings between two vector spaces. Next we specialize to the low-rank approximation problem for matrices, in both the induced norm and the Frobenius norm, which is of independent interest in applications. Then we consider PCA. These results are also useful in machine learning. Furthermore, as an extension of the ideas and methods, we present a study of some related matrix nearness problems.
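As a minimal sketch of both topics (using NumPy and randomly generated, purely illustrative data): the best rank-k approximation in the Frobenius norm is obtained by truncating the SVD, and PCA projects centered data onto the top right singular vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # hypothetical data: 200 samples, 5 features

# Best rank-k approximation (Eckart-Young): truncate the SVD
def low_rank(A, k):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

X2 = low_rank(X, 2)
print(np.linalg.norm(X - X2, 'fro'))   # Frobenius error of the rank-2 approximation

# PCA: center the data; the right singular vectors give the principal axes
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                 # projection onto the top two principal components
explained = s**2 / np.sum(s**2)        # proportion of variance along each axis
print(explained[:2])
```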
In this chapter, we study vector spaces and their basic properties and structures. We start by stating the definition and discussing examples of vector spaces. Next we introduce the notions of subspaces, linear dependence, bases, coordinates, and dimensionality. Then we consider dual spaces, direct sums, and quotient spaces. Finally, we cover normed vector spaces.
In this chapter, we consider linear mappings over vector spaces. We begin by stating the definition and discussing the structural properties of linear mappings. Then we introduce the notion of adjoint mappings and illustrate some of their applications. Next we focus on linear mappings from a vector space into itself and study a series of important concepts such as invariance and reducibility, eigenvalues and eigenvectors, projections, nilpotent mappings, and polynomials of linear mappings. Finally, we discuss the use of norms of linear mappings and present a few analytic applications.
In this chapter, we present a few selected subjects that are important in applications but are not usually included in a standard linear algebra course. These subjects may serve as supplemental or extracurricular materials. The first subject is the Schur decomposition theorem, the second is the classification of skew-symmetric bilinear forms, the third is the Perron–Frobenius theorem for positive matrices, and the fourth concerns Markov, or stochastic, matrices.
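The Perron–Frobenius theorem and Markov matrices can be illustrated numerically: for a column-stochastic matrix with positive entries, repeated application to any probability vector converges to the unique stationary distribution. A small sketch with a hypothetical transition matrix (not from the text):

```python
import numpy as np

# A hypothetical column-stochastic matrix: each column sums to 1
P = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.6, 0.3],
              [0.1, 0.2, 0.6]])

# Power iteration: by Perron-Frobenius, P^k v converges to the stationary
# distribution pi with P pi = pi, for any starting probability vector v
v = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    v = P @ v

print(v)                       # the stationary distribution
print(np.allclose(P @ v, v))   # True: v is a fixed point of P
```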
A rich and important area for the applications of linear algebra is machine learning. In machine learning, one aims to achieve an optimized or learned understanding of various kinds of real-world phenomena from collected or observed data, without true comprehension of the mechanisms underlying those phenomena, which are often impossible or impractical to grasp anyway. In this chapter, we present several introductory and fundamental problems in supervised machine learning, including linear regression, data classification, and logistic regression, together with the associated mathematical and computational methods.
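As a minimal sketch of two of these problems (using NumPy and synthetic data invented for illustration): linear regression reduces to a least squares problem, and logistic regression for binary classification can be fitted by gradient descent on the log-loss.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Linear regression by least squares on synthetic data ---
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # minimizes ||X w - y||^2
print(w_hat)                                    # close to true_w

# --- Logistic regression (binary classification) by gradient descent ---
labels = (X @ true_w > 0).astype(float)         # synthetic 0/1 labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(3)
lr = 0.1
for _ in range(2000):
    p = sigmoid(X @ w)
    grad = X.T @ (p - labels) / len(labels)     # gradient of the average log-loss
    w -= lr * grad

accuracy = np.mean((sigmoid(X @ w) > 0.5) == labels)
print(accuracy)
```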