The least squares method is among the most widely used tools for data analysis and parameter estimation. Its development is associated with the work of Carl Friedrich Gauss, the nineteenth-century German mathematician and geophysicist, who introduced many innovations in computation, geophysics, and mathematics, most of which remain in wide use today. We will introduce least squares from two viewpoints: one based on probabilistic arguments, in which the data are considered to be contaminated with random errors, and the other based on linear algebra, involving the solution of simultaneous linear equations. We will employ these two viewpoints to develop the least squares approach, using as an example the fitting of polynomials to a time series. While this might appear to be a departure from linear filters and related topics, we show in subsequent chapters that least squares in fact serves as a powerful digital filter design tool. We will also find that a stochastic viewpoint (in which time series values are considered to be random variables; see ) leads to the use of least squares in the development of prediction and interpolation filters.
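As a preview of the linear algebra viewpoint, the sketch below (a hypothetical illustration, not taken from the text) fits a quadratic polynomial to a short, noisy time series by forming the design matrix and solving the normal equations; the variable names and the synthetic data are assumptions made purely for demonstration.

```python
import numpy as np

# Hypothetical noisy time series sampled at integer times
rng = np.random.default_rng(0)
t = np.arange(10, dtype=float)
d = 1.0 + 0.5 * t - 0.05 * t**2 + 0.1 * rng.standard_normal(t.size)

# Design matrix G for a degree-2 polynomial: columns are 1, t, t^2
G = np.vander(t, N=3, increasing=True)

# Least squares estimate from the normal equations G^T G m = G^T d
# (np.linalg.lstsq would be the numerically safer choice in practice)
m = np.linalg.solve(G.T @ G, G.T @ d)

d_fit = G @ m          # fitted (predicted) values
residuals = d - d_fit  # misfit minimized in the least squares sense
print("estimated coefficients:", m)
```

The same machinery, with the polynomial columns of G replaced by other basis functions, underlies the filter design applications developed in later chapters.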