A snapshot of the theory of orthogonal polynomials on the line or on the circle is presented from the point of view of the asymptotics of bounded point evaluation constants, touching on classical fields of approximation theory of real functions, as well as complex analytic functions of a single variable.
The stability of the Christoffel-Darboux kernel under small perturbations of the generating measure is established via precise quantitative bounds. Trace-class perturbations of the Hessenberg matrix attached to a 2D measure are linked to the asymptotic invariance of the Christoffel function, yielding an exact algorithm that separates outliers from the clouds formed by bounded point evaluations of complex analytic functions.
Starting from the explicit formulas known for simple multivariate geometries (ball, cube, simplex), the discussion moves to recent advances in pluripotential theory. The essential Bernstein-Markov property of a measure is necessary for deriving asymptotics of the Christoffel function outside the support of the generating measure, and in some fortunate situations also inside the support. A balance between strong and weak limits enters into the game.
Several applications of the Christoffel-Darboux kernel in computational statistics are described, including parametric (polynomial) regression, optimal design (with an interpretation in computational geometry), density approximation, support inference and outlier detection. Theoretical results leverage statistical concentration and properties of the Christoffel-Darboux kernel. They are illustrated with numerical experiments.
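As a rough illustration of the outlier-detection idea mentioned above (a minimal one-dimensional sketch, not an algorithm taken from the book), one can evaluate the inverse of the empirical Christoffel function, q(x) = v_d(x)ᵀ M_d⁻¹ v_d(x), where v_d is the monomial vector and M_d the empirical moment matrix of the sample; points where q is large sit where the empirical measure has little mass. Function names and parameter choices here are purely illustrative.

```python
import numpy as np

def vandermonde_1d(x, d):
    """Monomial feature map v_d(x) = (1, x, ..., x^d) for a batch of points."""
    return np.vander(x, d + 1, increasing=True)

def christoffel_scores(sample, d=4, reg=1e-8):
    """Inverse empirical Christoffel function q(x) = v_d(x)^T M_d^{-1} v_d(x).

    M_d is the empirical moment matrix of the sample; large scores flag
    points where the empirical measure has little mass."""
    V = vandermonde_1d(sample, d)
    M = V.T @ V / len(sample)                      # empirical moment matrix
    Minv = np.linalg.inv(M + reg * np.eye(d + 1))  # small ridge for stability
    # Quadratic form q(x_i) for every sample point at once.
    return np.einsum('ij,jk,ik->i', V, Minv, V)

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 200), [8.0]])  # one planted outlier
scores = christoffel_scores(data)
print(np.argmax(scores))  # the planted outlier (index 200) gets the largest score
```

Note that q(x) grows like x^{2d} away from the bulk of the sample, which is what makes the score so sensitive to isolated points.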
Outside the immediate statistical applications, two notable implications of the Christoffel-Darboux kernel are sketched: the effective semialgebraic approximation of nonsmooth functions, and the spectral analysis of the Koopman operator attached to certain intricate dynamical systems.
A spectral characterization of the Christoffel function is provided via its associated moment matrix, along with a characterization of Dirac measures. A first (Hahn-Banach) extension of the Christoffel function to Lp spaces is characterized. A second extension is also provided and analyzed, in which squares of polynomials (hence positive everywhere) are replaced by larger convex cones of polynomials positive on the support of the underlying measure, in particular polynomials in a certain associated quadratic module.
A general overview of the book is given after a brief description of the main algorithm for computing and exploiting the Christoffel-Darboux kernel from raw moment data.
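The moment-data construction just mentioned can be sketched as follows (a minimal univariate illustration under our own choice of measure, not the book's algorithm): from the raw moments m_k = ∫ x^k dμ(x) of the uniform measure on [-1, 1], build the Hankel moment matrix M and the kernel K_d(x, y) = v_d(x)ᵀ M⁻¹ v_d(y), then check the reproducing property on a polynomial.

```python
import numpy as np

d = 4
# Raw moments of mu = uniform on [-1, 1]: m_k = 1/(k+1) for even k, 0 for odd k.
m = np.array([1.0 / (k + 1) if k % 2 == 0 else 0.0 for k in range(2 * d + 1)])

# Hankel moment matrix M[i, j] = m_{i+j}; the Christoffel-Darboux kernel is
# K_d(x, y) = v_d(x)^T M^{-1} v_d(y) with v_d(x) = (1, x, ..., x^d).
M = np.array([[m[i + j] for j in range(d + 1)] for i in range(d + 1)])
Minv = np.linalg.inv(M)

def cd_kernel(x, y):
    vx = np.array([x ** k for k in range(d + 1)])
    vy = np.array([y ** k for k in range(d + 1)])
    return vx @ Minv @ vy

# Reproducing property on p(y) = y^2: integrating K_d(x, y) y^2 dmu(y)
# amounts to v_d(x)^T M^{-1} w with w_j = m_{j+2}, and M^{-1} w must
# recover the coefficient vector of p, namely (0, 0, 1, 0, 0).
w = np.array([m[j + 2] for j in range(d + 1)])
coeffs = Minv @ w
print(np.round(coeffs, 10))  # approximately [0, 0, 1, 0, 0]
```

The check works because M e_2 equals w exactly, so no quadrature is needed; only the raw moments enter the computation, which is the point of the moment-data approach.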
Basic concepts and results of real and complex algebraic geometry enter into play when dealing with the estimation of measures supported on algebraic varieties. Some geometric constraints on the supporting variety are inherited from the very beginning of the modeling, such as a sphere for estimating an orientation in space. Well-adapted results of multivariate pluripotential theory play a crucial role here.
A brief introduction, based on several classical examples, to the correspondence between positive-definite kernels and Hilbert spaces of functions is centered on the Christoffel-Darboux kernel and its relevance to moment problems.
The Christoffel–Darboux kernel, a central object in approximation theory, is shown to have many potential uses in modern data analysis, including applications in machine learning. This is the first book to offer a rapid introduction to the subject, illustrating the surprising effectiveness of a simple tool. Bridging the gap between classical mathematics and current evolving research, the authors present the topic in detail and follow a heuristic, example-based approach, assuming only a basic background in functional analysis, probability and some elementary notions of algebraic geometry. They cover new results in both pure and applied mathematics and introduce techniques that have a wide range of potential impacts on modern quantitative and qualitative science. Comprehensive notes provide historical background, discuss advanced concepts and give detailed bibliographical references. Researchers and graduate students in mathematics, statistics, engineering or economics will find new perspectives on traditional themes, along with challenging open problems.