How does a person curious about science, or wanting to do science, learn it? The thesis of this book is that learning science requires not only digging deeply into the subject but also learning how to learn. It is not enough to learn facts about the history of science or to study chemistry, biology, or physics per se. One must also have a sense of the roles of psychological and social factors in science. We can read about the development of Newtonian mechanics or quantum physics, but that does not tell us why Newton, Planck, Bohr, Born, or Heisenberg studied these subjects in the first place or what thought processes they used in their work. It does not tell us how their personalities and intuition drove their work. It also does not tell us why the work of these scientists was or was not accepted by the scientific community, rightly or wrongly. Most importantly, it does not tell us why we think we know what we do, or even whether what we study is real. Epistemology and metaphysics are required here. This chapter explains why such knowledge is obligatory if one is to do science well.
In this chapter we provide an overview of data modeling and describe the formulation of probabilistic models. We introduce random variables, their probability distributions, associated probability densities, examples of common densities, and the fundamental theorem of simulation, used to draw samples from discrete or continuous probability distributions. We then present the mathematical machinery required in describing and handling probabilistic models, including models with complex variable dependencies. In doing so, we introduce the concepts of joint, conditional, and marginal probability distributions, marginalization, and ancestral sampling.
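The fundamental theorem of simulation mentioned above can be sketched concretely. The snippet below is a minimal illustration (not taken from the chapter, and the function names are our own): if U is uniform on (0, 1), then applying the inverse cumulative distribution function F⁻¹ to U yields a sample from the distribution with CDF F, for both continuous and discrete cases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Inverse-CDF (fundamental theorem of simulation) for a continuous case:
# for Exponential(rate), F(x) = 1 - exp(-rate * x), so F^{-1}(u) = -ln(1 - u)/rate.
def sample_exponential(rate, size):
    u = rng.uniform(size=size)
    return -np.log(1.0 - u) / rate

# Discrete case: invert the cumulative sums of the probability vector.
def sample_discrete(probs, size):
    cdf = np.cumsum(probs)
    u = rng.uniform(size=size)
    return np.searchsorted(cdf, u)

samples = sample_exponential(2.0, 100_000)
print(samples.mean())  # close to 1/rate = 0.5
```

The same uniform-draw-plus-inversion idea underlies ancestral sampling: one draws each variable in turn from its conditional distribution given the already-sampled parents.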
This chapter introduces the quantum transfer matrix renormalization group (QTMRG), a method for studying the thermodynamics and correlation functions of one-dimensional quantum lattice models. The RG transformation matrices are determined using the criteria presented in the preceding chapter and used to update the transfer matrix and other physical quantities. The spin-1/2 and spin-1 antiferromagnetic Heisenberg models are used to demonstrate the accuracy and efficiency of the method.
This chapter introduces the density matrix renormalization group (DMRG) in real space. The infinite and finite lattice algorithms of DMRG, and the approaches for targeting more than one eigenstate and for implementing DMRG in two dimensions by mapping a two-dimensional lattice onto a one-dimensional one, are discussed. The one-dimensional antiferromagnetic Heisenberg model of both integer and half-integer spins is used to demonstrate the method.
In this chapter we formulate the general regression problem relevant to function estimation. We begin with simple frequentist methods and quickly move to regression within the Bayesian paradigm. We then present two complementary mathematical formulations: one that relies on Gaussian process priors, appropriate for the regression of continuous quantities, and one that relies on Beta–Bernoulli process priors, appropriate for the regression of discrete quantities. In the context of the Gaussian process, we discuss more advanced topics including various admissible kernel functions, inducing point methods, sampling methods for nonconjugate Gaussian process prior-likelihood pairs, and elliptical slice samplers. For Beta–Bernoulli processes, we address questions of posterior convergence in addition to applications. Taken together, both Gaussian processes and Beta–Bernoulli processes constitute our first foray into Bayesian nonparametrics. With end-of-chapter projects, we explore more advanced modeling questions relevant to optics and microscopy.
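The core Gaussian process regression step can be summarized in a few lines. The sketch below is our own minimal illustration (function names and the squared-exponential kernel choice are assumptions, not the chapter's code): under a zero-mean GP prior with an admissible kernel and Gaussian noise, the posterior mean and covariance at test points follow in closed form.

```python
import numpy as np

def rbf_kernel(x1, x2, length=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel, one admissible covariance choice."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and covariance of a zero-mean GP with Gaussian noise."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)        # train-test covariances
    K_ss = rbf_kernel(x_test, x_test)        # test-test covariances
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, cov

x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
mean, cov = gp_posterior(x, y, np.array([0.5, 1.5]))
```

For larger datasets, the cubic cost of the linear solves is exactly what the inducing point methods discussed in the chapter are designed to reduce.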
This chapter introduces two kinds of RG methods for solving the leading eigenvalue and eigenvectors of a transfer matrix: TMRG (transfer matrix renormalization group) and CTMRG (corner transfer matrix renormalization group). These methods are developed to study the thermodynamic properties of two-dimensional classical statistical models. Furthermore, in the framework of MPS, the fixed-point equations of these methods are derived, and the steps for efficiently solving these equations are outlined.
According to the Schrödinger equation, a particle with wave character and mass in the presence of a potential may be described by a state that is a function of space and time. Space and time are assumed to be smooth and continuous. The potential can localize the particle to one region of space, forming a bound state.
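A bound state of this kind can be found numerically. The sketch below is our own illustration, not the chapter's method: discretizing the one-dimensional time-independent Schrödinger equation on a grid (in assumed units with ħ = m = 1) turns the Hamiltonian into a tridiagonal matrix whose negative eigenvalues correspond to states localized by the potential well.

```python
import numpy as np

# 1D time-independent Schrodinger equation: -(1/2) psi'' + V(x) psi = E psi,
# discretized with a second-difference approximation of the derivative.
n, L = 1000, 10.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]

V = np.where(np.abs(x) < 1.0, -5.0, 0.0)   # finite square well of depth 5

# Tridiagonal Hamiltonian: diagonal 1/dx^2 + V, off-diagonals -1/(2 dx^2).
main = 1.0 / dx**2 + V
off = -0.5 / dx**2 * np.ones(n - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E, psi = np.linalg.eigh(H)
bound = E[E < 0]   # negative energies: states localized inside the well
print(bound)
```

Deepening or widening the well admits more bound states; a shallow enough well in one dimension still retains at least one.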
This chapter reformulates QTMRG using the language of MPS and introduces the concept of bicanonical MPS and the method of biorthonormalization. The fixed-point equations for determining the local tensors of MPS in a translation-invariant system with one or more sites in a unit cell are derived. The steps for solving these equations in the scheme of biorthonormalization are discussed.
This chapter starts with an introductory survey of the physical background and historical events that led to the emergence of the density matrix renormalization group (DMRG) and its tensor network generalization. We then briefly overview the major progress in renormalization group methods of tensor networks and their applications over the past three decades. Tensor network renormalization was initially developed to solve quantum many-body problems, but its field of application has grown steadily. It has now become an irreplaceable tool for investigating strongly correlated problems, statistical physics, quantum information, quantum chemistry, and artificial intelligence.
This chapter introduces the tensor network representation of physical operators, especially the matrix product representation of model Hamiltonians, called matrix product operators (MPOs), and the quantum transfer matrix representation of partition functions with different boundary conditions or with an impurity. The leading eigenvalue and eigenvectors of the quantum transfer matrix determine all thermodynamic quantities. This allows us to investigate thermodynamics without solving for the full energy spectrum of the Hamiltonian.
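As a concrete illustration of a matrix product operator, the sketch below (our own, not the chapter's code) builds the standard bond-dimension-5 MPO for the spin-1/2 Heisenberg chain H = J Σᵢ [(S⁺ᵢS⁻ᵢ₊₁ + S⁻ᵢS⁺ᵢ₊₁)/2 + Sᶻᵢ Sᶻᵢ₊₁] and contracts it back into a dense matrix to check it against the exact two-site Hamiltonian.

```python
import numpy as np

J = 1.0
Sz = np.array([[0.5, 0.0], [0.0, -0.5]])
Sp = np.array([[0.0, 1.0], [0.0, 0.0]])  # S+ raising operator
Sm = Sp.T                                # S- lowering operator
I2 = np.eye(2)

# Bulk MPO tensor: W[a, b] is a 2x2 operator; virtual bond dimension 5.
W = np.zeros((5, 5, 2, 2))
W[0, 0] = I2
W[1, 0] = Sp
W[2, 0] = Sm
W[3, 0] = Sz
W[4, 1] = J / 2 * Sm
W[4, 2] = J / 2 * Sp
W[4, 3] = J * Sz
W[4, 4] = I2

def mpo_to_matrix(n):
    """Contract the n-site MPO into a dense 2^n x 2^n Hamiltonian.

    Boundary vectors select row 4 on the left and column 0 on the right.
    """
    M = W[4]                               # operator-valued row vector, shape (5, 2, 2)
    for _ in range(n - 1):
        d = M.shape[1]
        Mn = np.zeros((5, d * 2, d * 2))
        for c in range(5):
            for b in range(5):             # M'[c] = sum_b kron(M[b], W[b, c])
                Mn[c] += np.kron(M[b], W[b, c])
        M = Mn
    return M[0]

H2 = mpo_to_matrix(2)   # two-site Heisenberg: singlet at -3J/4, triplet at J/4
```

The same fixed bond dimension encodes the Hamiltonian for any chain length, which is what makes MPOs efficient building blocks for transfer matrix constructions.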
Scattering experiments are among our most important tools for extracting information about the structure and interactions of microscopic systems. In these experiments, we prepare a beam of particles of a given type and direct it towards a target. The interaction of the particles in the beam with those of the target may lead to various phenomena: changes in the direction and energy of incoming particles, absorption of incoming particles, the appearance of new species of particles, and so on. The target is surrounded by particle detectors that identify the particles exiting the interaction region and measure their momenta.