The foundations of modern probability theory are briefly presented and discussed for both discrete and continuous stochastic variables. Without attempting a rigorous mathematical construction – but citing several excellent handbooks on the matter – the axiomatic theory of Kolmogorov and the concepts of joint, conditional, and marginal probability are introduced, along with the operations of union and intersection of generic random events. Finally, Bayes’ formula is put forward with some examples. This is the cornerstone of the statistical inference methods reported in Chapters 5 and 6.
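In standard notation (not tied to the chapter’s specific examples), Bayes’ formula for events A and B with P(B) > 0, together with the law of total probability for a partition {A_i} of the sample space, reads
\[
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)},
\qquad
P(B) = \sum_i P(B \mid A_i)\, P(A_i).
\]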
This short chapter aims at motivating the interest in statistical mechanics. It starts with a brief description of the historical context within which the theory developed, and ponders its status, or lack thereof, in the public eye. A first, original parallel between the use of statistics and mechanics is drawn in the context of error propagation analysis, which can also be treated within statistical mechanics. With regard to the situations statistical mechanics can be applied to, two categories are distinguished: experimental/protocol error, and an observational state that underdetermines the mechanical state of the system. The rest of the chapter puts the emphasis on the latter category, and explains how statistical mechanics plays the role of a ‘Rosetta Stone’ translating between different modes of description of the same system, thereby giving tools to infer relations between observational variables, for which we usually do not have any fundamental theory, from the physics of the underlying constituents, which is presumed to be that of Hamiltonian classical or quantum mechanics.
Kinetic theory is summarized as a mechanistic approach to thermodynamics, including the equilibrium equation of state of an ideal gas and a phenomenological approach to its transport properties. The Boltzmann model of the ideal gas is described by the evolution equation of its distribution function in molecular space. The H-theorem is proved for both the uniform and nonuniform cases. The theorem of additive invariants makes it possible to approach a fundamental formulation of the hydrodynamic equations, both for the ideal situation of an inviscid flow and for the more interesting case of a viscous flow.
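In one standard notation (which may differ from the chapter’s), the Boltzmann evolution equation for the single-particle distribution f(r, v, t) and, for the spatially uniform case, the H-functional whose monotonic decrease the H-theorem establishes are
\[
\frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{r}} f + \frac{\mathbf{F}}{m}\cdot\nabla_{\mathbf{v}} f
= \left(\frac{\partial f}{\partial t}\right)_{\mathrm{coll}},
\qquad
H(t) = \int f \ln f \,\mathrm{d}^3 v,
\qquad
\frac{\mathrm{d}H}{\mathrm{d}t} \le 0 .
\]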
The main ideas are introduced in a historical context. Beginning with phase retrieval and ending with neural networks, the chapter gives the reader a sense of the book’s broad scope.
An isolated system is described by classical Hamiltonian dynamics. In the long-time limit, the trajectory of such a system yields a histogram, i.e., a distribution, for any observable. With one plausible assumption, introduced here as a fundamental principle, this histogram is shown to lead to the microcanonical distribution. Pressure, temperature, and chemical potential can then be identified microscopically. This dynamical approach thus recovers the results that are often obtained for equilibrium by maximizing a postulated entropy function.
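Schematically, and in standard notation (not necessarily the chapter’s), the microcanonical distribution over phase-space points Γ at fixed energy E, and the microscopic identification of temperature, can be summarized as
\[
\rho(\Gamma) = \frac{\delta\big(H(\Gamma) - E\big)}{\Omega(E)},
\qquad
\Omega(E) = \int \delta\big(H(\Gamma) - E\big)\,\mathrm{d}\Gamma,
\qquad
\frac{1}{T} = \frac{\partial S}{\partial E},
\quad
S(E) = k_{\mathrm{B}} \ln \Omega(E),
\]
up to the usual dimensional normalization of Ω(E).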
In this chapter, we introduce the reader to basic concepts in machine learning. We start by defining artificial intelligence, machine learning, and deep learning. We give a historical viewpoint on the field, also from the perspective of statistical physics. Then, we give a very basic introduction to the different tasks that are amenable to machine learning, such as regression or classification, and explain various types of learning. We end the chapter by explaining how to read the book and how the chapters depend on each other.
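As a minimal illustration of the two tasks mentioned above (regression and classification), the following short sketch uses scikit-learn on synthetic data; the dataset and model choices are purely illustrative assumptions, not taken from the book.

# Minimal illustration of regression vs. classification on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)

# Regression: predict a continuous target y from a feature x.
x = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * x[:, 0] + rng.normal(0, 1, size=100)   # noisy linear relation
reg = LinearRegression().fit(x, y)
print("estimated slope:", reg.coef_[0])          # should be close to 3

# Classification: predict a discrete label (0 or 1) from the same feature.
labels = (x[:, 0] > 5).astype(int)               # simple threshold rule
clf = LogisticRegression().fit(x, labels)
print("predicted class for x = 7:", clf.predict([[7.0]])[0])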
This chapter explores the pivotal role of modeling as a conduit between diverse data representations and applications in real, complex systems. The emphasis is on portraying modeling in terms of multivariate probabilities, laying the foundation for the probabilistic data-driven modeling framework.
Chapter 1 begins by re-examining the textbook quantum postulates. It concludes with the realization that some of them are inconsistent with quantum mathematics, but also that they may not have to be postulated. Indeed, in the following two chapters it is shown that their consequences follow from the other, consistent postulates. This simplification of the quantum foundations provides a consistent, convenient, and solid starting point. The emergence of the classical from the quantum substrate is based on this foundation of “core quantum postulates”—the “quantum credo”. Discussion of the postulates is accompanied by a brief summary of their implications for the interpretation of quantum theory. This discussion touches on questions of interpretation that are implicit throughout the book, but will be addressed more fully in Chapter 9. Chapter 1 ends with a “decoherence primer” that provides a quick introduction to decoherence (discussed in detail in Part II). Its aim is to provide the reader with an overview of the process that will play an important role throughout the book, and to motivate Chapters 2 and 3 that lay the foundations for the physics of decoherence (Part II) as well as for quantum Darwinism, the subject of Chapters 7 and 8.
The Green’s function method is among the most powerful and versatile formalisms in physics, and its nonequilibrium version has proved invaluable in many research fields. With entirely new chapters and updated example problems, the second edition of this popular text continues to provide an ideal introduction to nonequilibrium many-body quantum systems and ultrafast phenomena in modern science. Retaining the unique and self-contained style of the original, this new edition has been thoroughly revised to address interacting systems of fermions and bosons, simplified many-body approaches like the GKBA, the Bloch equations, and the Boltzmann equations, and the connection between Green’s functions and newly developed time-resolved spectroscopy techniques. Small gaps in the theory have been filled, and frequently overlooked subtleties have been systematically highlighted and clarified. With an abundance of illustrative examples, insightful discussions, and modern applications, this book remains the definitive guide for students and researchers alike.
Chapter 1 discusses the motivation for the book and the rationale for its organization into four parts: preliminary considerations, evaluation for classification, evaluation in other settings, and evaluation from a practical perspective. In more detail, the first part provides the statistical tools necessary for evaluation and reviews the main machine learning principles as well as frequently used evaluation practices. The second part discusses the most common setting in which machine learning evaluation has been applied: classification. The third part extends the discussion to other paradigms such as multi-label classification, regression analysis, data stream mining, and unsupervised learning. The fourth part broadens the conversation by moving it from the laboratory setting to the practical setting, specifically discussing issues of robustness and responsible deployment.
Network science has exploded in popularity since the late 1990s. But it flows from a long and rich tradition of mathematical and scientific understanding of complex systems. We can no longer imagine the world without evoking networks. And network data is at the heart of it. In this chapter, we set the stage by highlighting network science’s ancestry and the exciting scientific approaches that networks have enabled, followed by a tour of the basic concepts and properties of networks.
This chapter provides a motivation for this book, outlining the interests of economists in artificial intelligence, describing who this book is aimed at, and laying out the structure of the book.
I introduce the problem of “dry active matter” more precisely, describing the symmetries (both underlying and broken) of the state I wish to consider, and also discuss how shocking it is that such systems can exhibit long-ranged order – that is, all move together – even in d = 2.
In this chapter we draw motivation from real-world networks and formulate random graph models for them. We focus on some of the models that have received the most attention in the literature, namely, Erdős–Rényi random graphs, inhomogeneous random graphs, configuration models, and preferential attachment models. We follow Volume 1, both for the motivation and for the introduction of the random graph models involved. Furthermore, we add some convenient additional results, such as degree truncation for configuration models and switching techniques for uniform random graphs with prescribed degrees. We also discuss preliminaries used in the book, for example concerning power-law distributions.
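As a concrete, purely illustrative sketch of two of the models named above, the following Python snippet generates an Erdős–Rényi graph and a configuration-model graph with networkx; the parameter values are arbitrary assumptions chosen only for the example.

# Generate small instances of two of the random graph models discussed in the chapter.
import networkx as nx

# Erdős–Rényi random graph: n vertices, each edge present independently with probability p.
er = nx.erdos_renyi_graph(n=100, p=0.05, seed=42)
print("ER graph:", er.number_of_nodes(), "nodes,", er.number_of_edges(), "edges")

# Configuration model: a (multi)graph with a prescribed degree sequence.
degrees = [3] * 100                                       # 3-regular degree sequence (even sum)
cm = nx.configuration_model(degrees, seed=42)
cm = nx.Graph(cm)                                         # collapse multi-edges for simplicity
cm.remove_edges_from(list(nx.selfloop_edges(cm)))         # drop self-loops
print("CM graph:", cm.number_of_nodes(), "nodes,", cm.number_of_edges(), "edges")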
Stop. Take a moment to look around. What do you see? No matter where you are, you are likely perceiving a world consisting of things. Maybe you are reading this book in a coffee shop, and if so, you probably see people, cups, books, chairs, and so on. You see a world of objects with properties, yourself included: white cups are on wooden tables, people sitting in chairs are reading books and talking with one another. At the same time, you are a subject, responding to this world and actively bringing yourself and these objects into interrelation. And yet, the world of objects with properties that you are perceiving is but one slice of a complex reality.
This concise and self-contained introduction builds up the spectral theory of graphs from scratch, with linear algebra and the theory of polynomials developed in the later parts. The book focuses on properties and bounds for the eigenvalues of the adjacency, Laplacian and effective resistance matrices of a graph. The goal of the book is to collect spectral properties that may help to understand the behavior or main characteristics of real-world networks. The chapter on spectra of complex networks illustrates how the theory may be applied to deduce insights into real-world networks.
The second edition contains new chapters on topics in linear algebra and on the effective resistance matrix, and treats the pseudoinverse of the Laplacian. The latter two matrices and the Laplacian describe linear processes, such as the flow of current, on a graph. The concepts of spectral sparsification and graph neural networks are included.
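For reference, in one common notation (which may differ from the book’s), the Laplacian of a graph with adjacency matrix A and degree matrix D, and the effective resistance between nodes i and j expressed through the Moore–Penrose pseudoinverse Q⁺ of the Laplacian, are
\[
Q = D - A,
\qquad
\omega_{ij} = \big(Q^{+}\big)_{ii} + \big(Q^{+}\big)_{jj} - 2\,\big(Q^{+}\big)_{ij}.
\]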