This chapter extends the treatment of the previous chapters to the case when the system is initially prepared in an ensemble average. This requires adding a “vertical track” to the oriented contour. Alternative formalisms are considered in this context, depending on the way the vertical track is dealt with. The relationship between transient phenomena and the adiabatic assumption is also considered.
This chapter considers the expectation value of an operator (or of products of two operators) over the ground state of the interacting system, when the time-dependent part of the Hamiltonian is switched off. These limitations restrict the treatment to systems in equilibrium at zero temperature, which include important cases like insulators and semiconductors as well as Fermi liquids, for which the energy gap and the Fermi energy, respectively, are much larger than the available thermal energy. The ensuing formalism for ground-state averages at zero temperature relies on an “adiabatic assumption,” which cannot be carried over as such when excited states are involved in the ensemble averages.
In the theory of the contour-ordered Green’s functions, one encounters convolutions and products. The task of this chapter is to obtain the corresponding expressions in terms of the real-time functions. This task is accomplished in terms of the so-called Langreth–Wilkins rules, which are here discussed in detail for convolutions as well as for particle–hole-type and particle–particle-type products. A preliminary introduction to what is referred to as the Keldysh space is also provided.
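As an illustration of the kind of result the chapter derives, the Langreth–Wilkins rules for a contour convolution take the following standard form (the notation here — superscripts R, A, and ≶ for the retarded, advanced, lesser, and greater components — is the common one and may differ in detail from the chapter's conventions):

```latex
% Langreth-Wilkins rules for the contour convolution
% C(z,z') = \int_C d\bar{z}\, A(z,\bar{z})\, B(\bar{z},z')
\begin{aligned}
C^{R}(t,t') &= \int dt_1\, A^{R}(t,t_1)\, B^{R}(t_1,t') , \\
C^{A}(t,t') &= \int dt_1\, A^{A}(t,t_1)\, B^{A}(t_1,t') , \\
C^{\lessgtr}(t,t') &= \int dt_1 \left[\, A^{R}(t,t_1)\, B^{\lessgtr}(t_1,t')
 + A^{\lessgtr}(t,t_1)\, B^{A}(t_1,t') \,\right] .
\end{aligned}
```

The lesser and greater components thus mix retarded and advanced functions, which is what makes a systematic set of rules necessary when converting contour expressions to real-time ones.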
In the Schrödinger and Heisenberg representations, the time-evolution operator depends on the full time-dependent Hamiltonian, which includes an external time-dependent potential. Out of this full-time dependence, it is useful to isolate the time dependences due to either the full system Hamiltonian or its noninteracting part. These two cases, referred to as the Heisenberg and interaction pictures, respectively, are considered separately.
The t-matrix approximation applies to a low-density (or dilute) Fermi gas with a short-range interparticle interaction, either attractive or repulsive. This chapter considers the nonequilibrium (time-dependent) version of the t-matrix approximation for fermions in the normal phase, in the perspective of applying it to the BCS–BEC crossover.
This chapter first recalls the time-independent Bogoliubov–de Gennes equations for the equilibrium case and shows their equivalence to the Gor’kov approach for inhomogeneous fermionic superfluidity at equilibrium. It then considers the extension of the Bogoliubov–de Gennes approach to the nonequilibrium case in the framework of the Kadanoff–Baym equations, once implemented at the mean-field level. Properties of the solutions are considered in detail.
This chapter introduces the contour Schwinger–Keldysh method for time-dependent averages, in light of its relevance to nonequilibrium processes. A key feature of this approach is that it leaves open the possibility that no state of a system in the future can be identified with any of its states in the past. This method is here illustrated in detail with reference to time-dependent quantum averages, whereby for definiteness the system is initially prepared at the reference time t₀ in a definite quantum state.
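For orientation, the central object of this approach is the contour-ordered Green's function. In a common notation (assumed here, possibly differing in detail from the chapter's), with z and z' running on the Schwinger–Keldysh contour C that starts at the reference time t₀:

```latex
% Contour-ordered single-particle Green's function,
% with \mathcal{T}_C the ordering operator along the contour C
G(z,z') = -\,i\, \big\langle\, \mathcal{T}_{C}\!\left[ \psi_{H}(z)\, \psi^{\dagger}_{H}(z') \right] \big\rangle ,
```

where the field operators are taken in the Heisenberg picture and the average is over the state in which the system is prepared at t₀.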
This chapter considers the closed-time-path Green’s functions approach when specified to equilibrium situations and shows that it offers an alternative to the more standard Matsubara plus analytic continuation procedure for obtaining physical quantities directly in real frequency. In this case, the number of independent components of the single-particle Green’s function (as well as of the related self-energy) reduces considerably, thereby making it easier to solve the Kadanoff–Baym equations. The fluctuation–dissipation theorem and the single-particle spectral function (with its related sum rule) are also considered.
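In the standard notation (assumed here, and possibly differing from the chapter's conventions), the equilibrium fluctuation–dissipation theorem for fermions and the sum rule for the single-particle spectral function read:

```latex
% Fluctuation-dissipation relations at equilibrium (fermions),
% with f(\omega) the Fermi function and A(\omega) the spectral function
\begin{aligned}
G^{<}(\omega) &= i\, f(\omega)\, A(\omega) , &
G^{>}(\omega) &= -\,i\, \left[\, 1 - f(\omega) \,\right] A(\omega) , \\
A(\omega) &= i \left[\, G^{R}(\omega) - G^{A}(\omega) \,\right] , &
\int \frac{d\omega}{2\pi}\, A(\omega) &= 1 .
\end{aligned}
```

These relations are what reduce the number of independent real-time components in equilibrium, as discussed in the abstract above.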
Similar to Chapter 20 of Part I and Chapter 31 of Part II, this chapter treats a few topics that are relevant to the general purposes of the book, but whose inclusion in previous chapters would have diverted the discussion from the main topics of interest therein. Specifically, it addresses a schematic derivation of the Lindblad master equation (aimed at helping the reader retrace and better identify the essential steps made and the approximations adopted in the more general derivation presented in Chapter 35), as well as the physical assumptions underlying the original Kadanoff–Baym ansatz.
Quantum many-body systems are a central feature of condensed matter physics, relevant to important, modern research areas such as ultrafast light–matter interactions and quantum information. This book offers detailed coverage of the contour Green's function formalism – an approach that can be successfully applied to solve the quantum many-body and time-dependent problems present within such systems. Divided into three parts, the text provides a structured overview of the relevant theoretical and practical tools, with specific focus on the Schwinger–Keldysh formalism. Part I introduces the mathematical frameworks that make use of Green's functions in normal phase states. Part II covers fermionic superfluid phases with discussion of topics such as the BCS–BEC crossover and superconducting systems. Part III deals with the application of the Schwinger–Keldysh formalism to various topics of experimental interest. Graduate students and researchers will benefit from the book's comprehensive treatment of the subject matter and its novel arrangement of topics.
The chapter begins with a discussion of intelligence in simple unicellular organisms, followed by that of animals with complex nervous systems. Surprisingly, even organisms that do not have a central brain can navigate their complex environments, forage, and learn. In organisms with a central nervous system, neurons and synapses in the brain provide the elementary basis of intelligence and memory. Neurons generate action potentials that represent information. Synapses store memory and control the signal transmission between neurons. A key feature of biological neural circuits is plasticity, that is, their ability to modify the circuit properties based on both stimuli and the time intervals between them. This represents one form of learning. The biological brain is not static but continuously evolves based on experience. The field of AI seeks to learn from biological neural circuitry, to emulate aspects of intelligence and learning, and to build physical devices and algorithms that can demonstrate features of animal intelligence. Neuromorphic computing therefore requires a paradigm shift in the design of semiconductors as well as in algorithmic foundations that are built not necessarily for perfection, but rather for learning.
This chapter offers an in-depth discussion of various nanoelectronic and nanoionic synapses along with their operational mechanisms, capabilities and limitations, and directions for further advancements in this field. We begin with overarching mechanisms for designing artificial synapses and learning characteristics for neuromorphic computing. Silicon-based synapses using digital CMOS platforms are described, followed by emerging device technologies. Filamentary synapses, which utilize nanoscale conducting pathways for forming and breaking current-shunting routes within two-terminal devices, are then discussed. This is followed by ferroelectric devices, wherein the polarization states of a switchable ferroelectric layer are responsible for synaptic plasticity and memory. Insulator–metal transition-based synapses are described, wherein a sharp change in the conductance of a layer due to an external stimulus offers a route for compact synapse design. Organic materials, 2D van der Waals materials, and layered semiconductors are discussed. Ionic liquids and solid gate dielectrics for multistate memory and learning are presented. Photonic and spintronic synapses are then discussed in detail.