These motions were such as to satisfy me, after frequently repeated observation, that they arose neither from currents in the fluid, nor from its gradual evaporation, but belonged to the particle itself.
“Microscopical Observations on the Particles Contained in the Pollen of Plants,” by Robert Brown, Philosophical Magazine, NS 4, 162–3, 1828.
Introduction
Partially hydrophobic small particles can act as stabilizing agents in many foaming processes, and in some ways they behave similarly to chemical surfactant molecules in that they can adsorb (attach) at the bubble interface. However, particles differ from chemical surfactants in several distinct respects. For example, macro-sized particles are considerably larger than the molecular dimensions of chemical surfactants. They also behave differently, in that particles do not aggregate at the interface, do not build up self-assembled structures and do not solubilize in the bulk solution. Unlike chemical surfactants, particles alone rarely generate bubbles or foams easily, but partially hydrophobic particles can be good foam stabilizers at moderate concentrations (about 1 wt%). In fact, if the particles exhibit moderate hydrophobicity, the foams can be extremely stable (with lifetimes of the order of years). In practice, however, it is generally more convenient to add other surface-active components to the particles, such as polymers, dispersants or chemical surfactants, to ensure a higher degree of foamability and foam stability. The different features exhibited by surfactant- and particle-stabilized systems are illustrated in Fig. 8.1.
Only a few foaming processes operate solely with particles, for example molten metal foams (where ceramic particles are used) and, occasionally, hydrophobic particles (such as graphite) in the froth flotation process. Before the past two decades, the literature on foams stabilized by particles was sparse, but attention has since revived, largely because of the success achieved with particle-stabilized (Pickering) emulsions. It has long been known that particles act as stabilizers in many established industrial foaming processes, such as froth flotation of mineral particles (1), deinking flotation (2) and food processing (3). However, little effort was made to understand the mechanisms until recent years; today considerable insight into the basics has been achieved.
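As a quantitative aside (not part of the original passage, but a standard estimate in the particle-stabilization literature), the strong attachment of partially hydrophobic particles can be illustrated by the energy required to detach a small spherical particle of radius r from an air–water interface of tension γ at contact angle θ:

\[
E_{\mathrm{detach}} \;=\; \pi r^{2} \gamma \,\bigl(1 - \lvert\cos\theta\rvert\bigr)^{2}.
\]

For particles larger than a few tens of nanometres with intermediate contact angles, this energy exceeds the thermal energy kT by several orders of magnitude, so adsorption is effectively irreversible; this is consistent with the extreme stability of foams stabilized by moderately hydrophobic particles noted above.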
Thoughts without content are empty, intuitions without concepts are blind.
Immanuel Kant, Critique of Pure Reason, B25, 1781.
Introduction
Foams can exist in the wet, dry or solid state and can be seen almost everywhere: in the home, in the surrounding natural environment and in numerous technological applications. In fact they are so prevalent that it is almost impossible to pass through an entire day without coming into contact with some type of liquid or solid foam. They have several interesting properties which enable them to fill an extremely wide range of uses; for example, they possess important mechanical, rheological and frictional characteristics which enable them to behave like solids, liquids or gases. Under low shear, wet (bubbly) foams exhibit elastic properties similar to those of solid bodies, but at high shear they flow and deform in a manner similar to liquids. On applying pressure or changing the temperature of a wet foam, its volume changes proportionately, and this behavior resembles that of gases. Interestingly, it is the elastic and frictional properties of wet foams which lead to their application in personal hygiene products such as body lotions, foaming creams and shaving foams. While shaving, foam is applied to the skin and the layer on the blade travels smoothly over the surface, reducing the possibility of nicks and scratches. Another example is their use as firefighting foams, where properties such as low density, reasonably good mechanical resistance and heat stability are required for effectiveness in extinguishing gasoline fires. Essentially, they act by covering the flames with a thick, semi-rigid foam blanket. The low density allows the foam to float on the burning oils, even though the water it contains is generally denser than they are. The chemical composition and mechanical properties of these types of foams can be varied to optimize their firefighting utility.
Foams are also found in many food items, either in finished products or incorporated at some stage of food processing. They primarily provide texture to cappuccino, bread, whipped cream, ice-cream toppings, cakes, aerated desserts, etc. Surprisingly, several novel types of food foams have recently been produced from cod, mushroom and potatoes, using specially designed whipping siphons powered by pressurized gas, with lecithin or gelatin as alternative foaming agents to replace egg and cream (1).
This indispensable guide will equip the reader with a thorough understanding of the field of foaming chemistry. Assuming only basic theoretical background, the book provides a straightforward introduction to the principles and properties of foams and foaming surfactants. It discusses the key ideas that underpin why foaming occurs, how it can be avoided and how different degrees of antifoaming can be achieved, and covers the latest test methods, including laboratory- and industrially developed techniques. Detailing a variety of different kinds of foams, from wet detergent and food foams to polymeric, material and metal foams, it connects theory to real-world applications and recent developments in foam research. Combining academic and industrial viewpoints, this book is the definitive stand-alone resource for researchers, students and industrialists working on foam technology and colloidal systems in the fields of chemical engineering, fluid mechanics, physical chemistry and applied physics.
Preparation, I have often said, is rightly two-thirds of any venture.
Amelia Earhart
Summary
Correlation functions provide a direct way to characterize and analyze many-body systems, both theoretically and experimentally. In this chapter we review the properties of one- and two-body correlation functions in quantum systems, with emphasis on several key quantities: static density correlations that determine the energy and thermodynamic potentials, dynamic correlation functions such as response functions that describe excitations of the system, and Green's functions that are basic tools in the theory of interacting many-body systems.
Correlation functions are central quantities in the description of interacting many-body systems, both in the theoretical formulation and in the analysis of experiments. In contrast to single numbers like the total energy, correlation functions reveal far more information about the electrons, how they arrange themselves, and the spectra of their excitations. In contrast to the many-body wavefunctions that contain all the information on the system, correlation functions extract the information most directly relevant to experimentally measurable properties. Dynamic current–current correlation functions are sufficient to determine the electrical and optical properties; one-body Green's functions describe the spectra of excitations when one electron is added to or removed from the system; static and dynamic correlations are measured using scattering techniques; and so forth. In this chapter we present the basic definitions and properties of correlation functions and Green's functions that are the basis for much of the development in the following chapters.
In general, a correlation function quantifies the correlation between two or more quantities at different points in space r, time t, or spin σ. Very often the correlation function can be specified as a function of the Fourier-transformed variables, momentum (wavevector) k, and frequency ω. It is useful to distinguish between a dynamic correlation function which describes the correlation between events at different times and a static or equal-time correlation function, by which we mean that of a property measured or computed with “snapshots” of the system. Also, the different correlation functions can be classified by the number of particles and/or fields involved.
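As a schematic illustration (generic notation, not reproducing a particular numbered equation of the chapter), a dynamic correlation function of two observables Â and B̂ and its frequency transform can be written

\[
C_{AB}(t-t') \;=\; \bigl\langle \hat{A}(t)\,\hat{B}(t') \bigr\rangle ,
\qquad
C_{AB}(\omega) \;=\; \int \! dt \; e^{\,i\omega t}\, C_{AB}(t),
\]

while the equal-time (static) correlation is simply ⟨ÂB̂⟩, the average over “snapshots” of the system. The one-body Green's function mentioned above is of the same general type; in its time-ordered form,

\[
G(\mathbf{r}t,\mathbf{r}'t') \;=\; -\,i\,\bigl\langle \mathcal{T}\,\hat{\psi}(\mathbf{r},t)\,\hat{\psi}^{\dagger}(\mathbf{r}',t') \bigr\rangle ,
\]

its poles give the energies for adding or removing one electron.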
… a general method, suitable for electronic computing machines, of calculating the properties of any substance which may be considered as composed of interacting individual molecules.
N. Metropolis et al., 1953
Summary
Quantum Monte Carlo methods have been very useful in providing exact results, or, at least, exact constraints on properties of electronic systems, in particular for the homogeneous electron gas. The results are, in many cases, more accurate than those from other quantum many-body methods, and provide unique capabilities and insights. In this chapter we introduce the general properties of stochastic methods and motivate their use on the quantum many-body problem. In particular, we discuss Markov chains and the computation of error estimates.
The methods that we introduce in the next four chapters are quite different from those in Parts II and III: stochastic, or quantum Monte Carlo (QMC), methods. In stochastic methods, instead of solving deterministically for properties of the quantum many-body system, one sets up a random walk that samples those properties. Historically, the most important role of QMC for the electronic structure field has been to provide input into other methods, most notably the QMC calculation of the HEG [109] used for the exchange–correlation functional in DFT. A second important role has been to provide benchmarks for other methods such as GW. There are systems for which QMC is uniquely suited, for example the Wigner transition in the low-density electron gas (see Sec. 3.1). In this chapter we introduce general properties of simulations, in particular Markov chains and error estimates. In the following chapters we apply this theory to three general classes of quantum Monte Carlo algorithms, namely variational (Ch. 23), projector (Ch. 24), and path-integral Monte Carlo (Ch. 25); Ch. 18 already introduced the QMC calculation of the impurity Green's function used in the dynamical mean-field method. We note that there are a variety of other QMC methods not covered in this book.
Simulations
First, let us define what we mean by a simulation, since the word has other meanings in applied science. The dimensionality of the phase space (or, for a quantum system, of the Hilbert space) is large or infinite. Even a classical system requires the positions and momenta of all particles; hence the phase space for N classical particles has dimensionality 6N.
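To make the notions of a Markov chain and of error estimation concrete, here is a minimal Python sketch (an illustration written for this summary, not code from the book): it samples a simple one-dimensional target distribution with the Metropolis rule and estimates the error of a sample average by blocking. The target density, step size, and block size are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

def log_target(x):
    # Hypothetical target density: a standard Gaussian, exp(-x^2/2), up to normalization.
    return -0.5 * x * x

def metropolis_chain(n_steps, step_size=1.0, x0=0.0):
    """Markov chain sampling the target distribution with the Metropolis rule."""
    samples = np.empty(n_steps)
    x = x0
    for i in range(n_steps):
        x_trial = x + step_size * rng.uniform(-1.0, 1.0)
        # Accept the move with probability min(1, p(x_trial)/p(x)).
        if np.log(rng.random()) < log_target(x_trial) - log_target(x):
            x = x_trial
        samples[i] = x
    return samples

def blocking_error(data, block_size):
    """Error of the mean from block averages; blocks longer than the
    autocorrelation time give a realistic error bar for correlated data."""
    n_blocks = len(data) // block_size
    blocks = data[: n_blocks * block_size].reshape(n_blocks, block_size).mean(axis=1)
    return blocks.std(ddof=1) / np.sqrt(n_blocks)

chain = metropolis_chain(100_000)
observable = chain**2                       # estimator of <x^2> (exact value 1 here)
mean = observable.mean()
error = blocking_error(observable, 500)     # error bar from blocks of 500 steps
print(f"<x^2> = {mean:.3f} +/- {error:.3f}")
```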
The art of being wise is the art of knowing what to overlook.
William James
Summary
This chapter is devoted to idealized models and theoretical concepts that underlie the topics in the rest of this book. Among the most dramatic effects are the Wigner and Mott transitions, exemplified by electrons in a homogeneous background of positive charge and by the Hubbard model of a crystal. Fermi liquid theory is the paradigm for understanding quasi-particles and collective excitations in metals, building on a continuous link between a non-interacting and an interacting system. The Luttinger theorem and Friedel sum rule are conservation laws for quantities that do not vary at all with the interaction. The Heisenberg and Ising models exemplify the properties of localized electronic states that act as spins. The Anderson impurity model is the paradigm for understanding local moment behavior and is used directly in dynamical mean-field theory.
The previous chapters discuss examples of experimental observations where effects of interactions can be appreciated with only basic knowledge of physics and chemistry. The purpose of this chapter is to give a concise discussion of models that illustrate major characteristics of interacting electrons. These are prototypes that bring out features that occur in real problems, such as the examples in the previous chapter. They are also pedagogical examples for the theoretical methods developed later, with references to specific sections.
The Wigner transition and the homogeneous electron system
The simplest model of interacting electrons in condensed matter is the homogeneous electron system, also called homogeneous electron gas (HEG), an infinite system of electrons with a uniform compensating positive charge background. It was originally introduced as a model for alkali metals. Now the HEG is a standard model system for the development of density functionals and a widely used test system for the many-body perturbation methods in Chs. 10–15. It is also an important model for quantum Monte Carlo calculations, described in Chs. 23–25.
To define the model, we take the hamiltonian in Eq. (1.1) and replace the nuclei by a rigid uniform positive charge with density equal to the electron charge density n.
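In schematic form (written here in Hartree atomic units as an illustration, not as a verbatim transcription of Eq. (1.1)), the resulting HEG Hamiltonian for N electrons in volume Ω is

\[
\hat{H} \;=\; -\tfrac{1}{2}\sum_{i=1}^{N} \nabla_i^{2}
\;+\; \tfrac{1}{2}\sum_{i\neq j} \frac{1}{\lvert \mathbf{r}_i - \mathbf{r}_j \rvert}
\;+\; \hat{H}_{e\text{-}b} \;+\; \hat{H}_{b\text{-}b},
\]

where the electron–background and background–background terms cancel the divergent long-wavelength (q = 0) part of the Coulomb interaction. The only remaining parameter is then the density n, conventionally expressed through the Wigner–Seitz radius r_s, defined by \( \tfrac{4}{3}\pi r_s^{3} = 1/n \) in units of the Bohr radius.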
Since the form of P_N is the same as that occurring in the statistical mechanics of the classical gas (replace f²(rij) by exp[−V(rij)/kT]), we can use the same integration techniques that are used in the classical problem.
W. L. McMillan, 1965
Summary
Building on the random walk methods developed in the previous chapter, we show how to compute properties of many-body trial wavefunctions using a random walk. This method, called variational Monte Carlo, is the simplest stochastic quantum many-body technique. Whereas mean-field methods are usually limited to single determinants, variational Monte Carlo can treat any correlated trial function, as long as its values are computable. We discuss how to optimize such trial wavefunctions, how to compute their momentum distribution, how to use non-local pseudopotentials, how to compute excited states, and how to correct for the finite size of the simulation cell.
Deterministic quantum methods have difficulties. For example, the Hartree–Fock method assumes the wavefunction is a single Slater determinant, neglecting correlation. If one expands the wavefunction as a sum of determinants, it is very difficult to keep the results size-consistent, since the number of determinants needed grows exponentially with the system size. As we have seen, the DMFT method introduced in Ch. 16 assumes locality. In Ch. 6 we discussed general properties of many-body wavefunctions. Using Monte Carlo methods, we can incorporate correlation directly into a wavefunction, without making any approximations beyond the form of the correlation factors. In many cases the energy and other properties are then very close to the exact results. Some of the usual restrictions on the form of the many-body wavefunction are not an issue in variational Monte Carlo. The most important generalization of the HF wavefunction is to put correlation directly into the wavefunction via the “Jastrow” factor. At the next order, one can use the “backflow” wavefunction, in which correlation is also built into the determinant.
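As an illustration of these forms (a generic sketch, not a particular numbered equation of the chapter), the Slater–Jastrow trial wavefunction multiplies up- and down-spin determinants of single-particle orbitals by a symmetric pair-correlation factor,

\[
\Psi_T(\mathbf{R}) \;=\; D^{\uparrow}\!\bigl[\phi_k(\mathbf{r}_i)\bigr]\; D^{\downarrow}\!\bigl[\phi_k(\mathbf{r}_j)\bigr]\;
\exp\!\Bigl[-\sum_{i<j} u(r_{ij})\Bigr],
\]

while backflow replaces the coordinates inside the determinants by quasi-particle coordinates \( \mathbf{x}_i = \mathbf{r}_i + \sum_{j\neq i} \eta(r_{ij})\,(\mathbf{r}_i - \mathbf{r}_j) \), so that correlation also enters the nodal structure of the wavefunction.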
The variational Monte Carlo method (VMC) was first used by McMillan [44] to calculate the ground-state properties of superfluid 4He. One of the key problems at that time was whether the observed superfluid properties were a consequence of Bose condensation.
Real knowledge is to know the extent of one's ignorance.
Confucius, 500 BCE
Summary
The topic of this chapter is a small selection of the vast array of experimentally observed phenomena chosen to exemplify crucial roles played by the electron– electron interaction. Examples in the present chapter bring out the effects of correlation in ground and excited states as well as in thermal equilibrium. These raise challenges for theory and quantitative many-body methods in treating interacting electrons, the topics of the following chapters.
The title of this book is Interacting Electrons. Of course, there are no non-interacting electrons: in any system with more than one electron, the electron–electron interaction affects the energy and leads to correlation between the electrons. All first-principles theories deal with the electron–electron interaction in some way, but often they treat the electrons as independent fermions in a static mean-field potential that contains interaction effects approximately. As described in Ch. 4, the Hartree–Fock method is a variational approximation with a wavefunction for fermions that are uncorrelated, except for the requirement of antisymmetry. The Kohn–Sham approach to DFT defines an auxiliary system of independent fermions that is chosen to reproduce the ground-state density. It is exact in principle and remarkably successful in practice. However, many properties such as excitation energies are not supposed to be taken directly from the Kohn–Sham equations, even in principle. Various other methods attempt to incorporate some effect of correlation in the choice of the potential.
This chapter is designed to highlight a few examples of experimentally observed phenomena that demonstrate qualitative consequences of electron–electron interactions beyond independent-particle approximations. Some examples illustrate effects that cannot be accounted for in any theory where electrons are considered as independent particles. Others are direct experimental measurements of correlation functions that would vanish if the electrons were independent. In yet other cases, a phenomenon can be explained in terms of independent particles in some effective potential, but it is deeply unsatisfying if one has to invent a different potential for every case, even for different properties in the same material. A satisfactory theory ultimately requires us to confront the problem of interacting, correlated electrons.
This appendix summarizes the arguments of the papers by Luttinger and Ward that derive the Luttinger theorem stated in Sec. 3.6. This is not a proof that the theorem applies to all possible states of a crystal; rather, it is a derivation of the arguments that it applies to all states that can be analytically continued from some non-interacting system, i.e., a “normal state of matter” as defined in Sec. 3.4. The derivation is an example of the use of the T ≠ 0 Green's functions of App. D and of the conclusions that follow for T = 0.
The Luttinger theorem is a cornerstone of the theory of condensed matter. As described qualitatively in Sec. 3.6, it requires that the volume enclosed by the Fermi surface be conserved independent of interactions, i.e., it is the same as for a system of non-interacting particles. Similarly, the Friedel sum rule is the requirement that the sum of phase shifts around an impurity is determined by charge neutrality, which was derived by Friedel [163] for non-interacting electrons. This section is devoted to a short summary of the original work of Luttinger and Ward and the extension of the arguments to the Friedel sum rule [166]. Here we explicitly indicate the chemical potential μ, since the variation from μ is essential to the arguments.
There are two key points: in the interacting system the wavevector k in the Brillouin zone is conserved, so that excitations can be labeled by k, and the self-energy Σ_k(ω) is purely real at the Fermi energy ω = μ at temperature T = 0. The latter point is an essential feature of a Fermi liquid or a “normal metal,” which is justified by the argument that the phase space for scattering at T = 0 vanishes as ω → μ (see Sec. 7.5). Thus, at the Fermi energy the Green's function as a function of k is the same as for an independent-particle problem with eigenvalues shifted by the (real) self-energy at μ. (Of course, at any other energy an interacting system cannot be described by independent particles.) In an independent-particle system at T = 0 the occupation numbers jump from 1 to 0 as a function of k at the Fermi surface, and in the interacting system there is still a discontinuity in n_k that defines the Fermi surface (Sec. 7.5).
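Stated compactly (in its standard form for a spin-degenerate three-dimensional system, not quoted verbatim from the appendix), the theorem fixes the k-space volume Ω_FS enclosed by this surface in terms of the electron density n, independent of the interaction:

\[
n \;=\; \frac{2\,\Omega_{\mathrm{FS}}}{(2\pi)^{3}},
\]

exactly the relation obeyed by non-interacting particles.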
The calculation of a wavefunction took about two afternoons, and five wavefunctions were calculated in the whole ….
Wigner and Seitz, 1933
Summary
In order to explain many important properties of materials and phenomena, it is necessary to go beyond independent-particle approximations and directly account for many-body effects that result from electronic interaction. The many-body problem is a major scientific challenge, but there has been great progress resulting from theoretical developments and advances in computation. This chapter is a short introduction to the interacting-electron problem, with some of the history that has led up to the concepts and methods described in this book.
The many-body interacting-electron problem ranks among the most fascinating and fruitful areas of research in physics and chemistry. It has a rich history, starting from the early days of quantum mechanics and continuing with new intellectual challenges and opportunities. The vitality of electronic structure theory arises in large part from the close connection with experiment and applications. It is spurred on by new discoveries and advances in techniques that probe the behavior of electrons in depth. In turn, theoretical concepts and calculations can now make predictions that suggest new experiments, as well as provide quantitative information that is difficult or not yet possible to measure experimentally.
This book is concerned with the effects of interactions between electrons beyond independent-particle approximations. Some phenomena cannot be explained by any independent-electron method, such as broadening and lifetime of excited states and two-particle bound states (excitons) that are crucial for optical properties of materials. There are many other examples, such as the van der Waals interaction between neutral molecules that arises from the dipole–induced dipole interaction. This force, which is entirely due to correlation between electrons, is an essential mechanism determining the functions of biological systems. Other properties, such as thermodynamically stable magnetic phases, would not exist if there were no interactions between electrons; even though mean-field approximations can describe average effects, they do not account for fluctuations around the average. Ground-state properties, such as the equilibrium structures of molecules and solids, can be described by density functional theory (DFT) and the Kohn–Sham independent-particle equations. However, present approximations are often not sufficient, and for many properties the equations, when used in a straightforward way, do not give a proper description, even in principle.
It is a characteristic of wisdom not to do desperate things.
Henry David Thoreau
Summary
The essence of a mean-field method is to replace an interacting many-body problem with a set of independent-particle problems having an effective potential. It can be chosen either as an approximation for effects of interactions in an average sense or as an auxiliary system that can reproduce selected properties of an interacting system. The effective potential can have an explicit dependence on an order parameter for a system with a broken symmetry such as a ferro- or antiferromagnet. Mean-field techniques are vital parts of many-body theory: the starting points for practical many-body calculations and often the basis for interpreting the results. This chapter provides a summary of the Hartree–Fock approximation, the Weiss mean field, and density functional theory that have significant roles in the methods described in this book.
Mean-field methods denote approaches in which the interacting many-body system is treated as a set of non-interacting particles in a self-consistent field that takes into account some effects of the interactions. In the literature such methods are often called “one-electron”; however, in this book we use “non-interacting” or “independent-particle” to refer to mean-field concepts and approaches. We reserve the terms “one-electron” or “one-body” to denote quantities that involve quantum mechanical operators acting independently on each body in a many-body system. Mean-field approaches are relevant for the study of interacting, correlated electrons because they lead to approximate formulations that can be solved more easily than more sophisticated approaches; when judiciously chosen, mean-field solutions can yield useful, physically meaningful results, and they can provide the basis and conceptual structure for investigating the effects of correlation. The particles that are the “bodies” in a many-body theory can be the original particles with their bare masses and interactions, or, most often, they may be the solutions of a set of mean-field equations chosen to facilitate the solution of the many-body problem. A large part of many-body theory in condensed matter involves the choice of the most appropriate independent particles. Hence, it is essential to define clearly the particles that are created and annihilated by the operators in terms of which the many-body theory is formulated.
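As a generic illustration of the structure shared by these approaches (a schematic form, not a particular equation of this chapter), each independent particle obeys a single-particle equation with an effective potential; in the Kohn–Sham case, in Hartree atomic units,

\[
\Bigl[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{ext}}(\mathbf{r}) + v_{\mathrm{H}}(\mathbf{r}) + v_{\mathrm{xc}}(\mathbf{r})\Bigr]\varphi_i(\mathbf{r}) \;=\; \varepsilon_i\,\varphi_i(\mathbf{r}),
\]

where the Hartree and exchange–correlation potentials carry the averaged effect of the electron–electron interaction. In Hartree–Fock the exchange term is instead a non-local operator, and in a Weiss-type treatment the role of the effective potential is played by the mean field acting on each spin.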
These two pages conclude a book of many chapters that span an arc from fundamental theory to applications, from concepts to computation. This reflects an approach that the book is meant to promote: not competition between research areas or methods, but the awareness that exchange and combination often lead to the most important advances.
As stated on the first pages of this book, the many-body interacting-electron problem ranks among the most fascinating and fruitful areas of research in science. It combines the intellectual challenges of quantum many-body physics with the opportunities to impact areas of science and technology, from engineering to biology, from archeology to astrophysics. There has been great progress that opens the door for the future, when all these disciplines can be greatly enhanced by quantitative calculations based on the fundamental laws of quantum mechanics.
To describe, understand, and predict the phenomena that are observed in the many-body world requires new concepts and ideas. At the same time, much of the progress in the past has been driven by advances in computers. Very little of what is described in this book concerning real materials could have been accomplished on the computers of the 1970s, when many of the methods were invented. We expect this trend toward greater processing speed and memory, increasingly through parallel processing, to continue. Of course, advances in algorithms and software have contributed to progress almost as much as advances in hardware. The field relies on a triangle formed by concepts, techniques, and tools: this triangle should be expanded in all directions in order to make progress.
The methods that translate concepts into feasible approaches and make best use of available hardware are at the heart of this book. We have described different ways to approach the problem. It is important to recognize that the various methods have different capabilities, so that a more complete picture of a given phenomenon or material can be obtained by using a variety of methods. The methods are complementary but not disjoint: there are many points of contact that invite us to strive for possible combinations. Such an attitude has a long tradition in the field: Kohn–Sham DFT is used to construct trial wavefunctions for QMC, and QMC results for the homogeneous electron gas are used as input to DFT.
Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. It is now possible to make quantitative calculations and provide novel understanding of natural and man-made materials and phenomena vital to physics, chemistry, materials science, as well as many other fields. Electronic structure is indeed an active, growing field with enormous impact, as illustrated by the more than 10,000 papers per year.
Much of our understanding is based on mean-field models of independent electrons, such as Hartree–Fock and other approximations, or density functional theory. The latter is designed to treat ground-state properties of the interacting-electron system, but it is often also used to describe excited states in an independent-electron interpretation. Such approaches can only go so far; many of the most interesting properties of materials are a result of interaction between electrons that cannot be explained by independent-electron descriptions. Calculations for interacting electrons are much more challenging than those of independent electrons. However, thanks to developments in theory and methods based on fundamental equations, and thanks to improved computational hardware, many-body methods are increasingly essential tools for a broad range of applications. With the present book, we aim to explain the many-body concepts and computational methods that are needed for the reader to enter the field, understand the methods, and gain a broad perspective that will enable him or her to participate in new developments.
What sets this book apart from others in the field? Which criteria determine the topics included? We want the description to be broad and general, in order to reflect the richness of the field, the generality of the underlying theories, and the wide range of potential applications. The aim is to describe matter all the way from isolated molecules to extended systems. The methods must be capable of computing a wide range of properties of diverse materials, and have promise for exciting future applications. Finally, practical computational methods are an important focus for this book.
Choices have to be made, since the number of different approaches, their variations, and their applications is immense, and the book is meant to be more than an overview. We therefore cannot focus on such important areas as quantum chemistry methods, e.g., coupled cluster theory and configuration interaction methods; nor do we cover all of the developments in lattice models or explore the vast field of superconductivity.