This textbook for advanced undergraduate and beginning graduate students provides a systematic introduction to the fields of neuron modeling, neuronal dynamics, neural coding, and neural networks. It can be used as a text for introductory courses on Computational and Theoretical Neuroscience or as the main text for a more focused course on Neural Dynamics and Neural Modeling at the graduate level. The book is also a useful resource for researchers and students who want to learn how different models of neurons and descriptions of neural activity are related to each other.
All mathematical concepts are introduced the pedestrian way: step by step. All chapters are richly illustrated with figures and worked examples, and each chapter closes with a short summary and a series of mathematical exercises. Python source code for numerical simulations that illustrate the main ideas and models of each chapter is provided on the authors' webpage (http://lcn.epfl.ch/~gerstner/NeuronalDynamics.html).
The book is organized into four parts with a total of 20 chapters. Part I provides a general introduction to the foundations of computational neuroscience and its mathematical tools. It covers classic material such as the Hodgkin–Huxley model, ion channels and dendrites, and phase plane analysis of two-dimensional systems of differential equations. Special focus is placed on the firing threshold for the generation of action potentials, both in the Hodgkin–Huxley model and in reduced two-dimensional neuron models such as the Morris–Lecar model.
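To make the phase-plane material concrete, here is a minimal numerical sketch of the Morris–Lecar model, one of the reduced two-dimensional neuron models mentioned above. The parameter values are common textbook choices and are assumptions for illustration, not values taken from this text.

```python
import numpy as np

# Morris-Lecar: two coupled ODEs for membrane potential V and a slow
# recovery variable w. Parameter values are illustrative assumptions.
C, g_L, E_L = 20.0, 2.0, -60.0
g_Ca, E_Ca = 4.0, 120.0
g_K, E_K = 8.0, -84.0
V1, V2, V3, V4, phi = -1.2, 18.0, 12.0, 17.4, 0.04

m_inf = lambda V: 0.5 * (1 + np.tanh((V - V1) / V2))   # fast Ca activation
w_inf = lambda V: 0.5 * (1 + np.tanh((V - V3) / V4))   # K activation target
tau_w = lambda V: 1.0 / np.cosh((V - V3) / (2 * V4))   # K time constant

def integrate(I, V0=-60.0, w0=0.0, t_max=500.0, dt=0.05):
    """Forward-Euler integration of the two Morris-Lecar equations."""
    V, w = V0, w0
    for _ in range(int(t_max / dt)):
        dV = (I - g_L * (V - E_L)
                - g_Ca * m_inf(V) * (V - E_Ca)
                - g_K * w * (V - E_K)) / C
        dw = phi * (w_inf(V) - w) / tau_w(V)
        V += dt * dV
        w += dt * dw
    return V, w

# Without input current the trajectory settles at the stable fixed point,
# i.e., the intersection of the V- and w-nullclines in the phase plane.
V_rest, w_rest = integrate(I=0.0)
```

In the phase plane, the resting state found here is exactly the crossing point of the two nullclines; increasing the input current I moves the V-nullcline until this fixed point loses stability and repetitive firing sets in.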
Neurons have intricate morphologies: the central part of the cell is the soma, which contains the genetic information and a large fraction of the molecular machinery. At the soma originate long wire-like extensions which come in two different flavors. First, the dendrites form a multitude of smaller or larger branches on which synapses are located. The synapses are the contact points where information from other neurons (i.e., “presynaptic” cells) arrives. Second, also originating at the soma, is the axon, which the neuron uses to send action potentials to its target neurons. Traditionally, the transition region between soma and axon is thought to be the crucial region where the decision is taken whether a spike is sent out or not.
The Hodgkin–Huxley model, at least in the form presented in the previous chapter, disregards this spatial structure and reduces the neuron to a point-like spike generator – despite the fact that the precise spatial layout of a neuron could potentially be important for signal processing in the brain. In this chapter we will discuss how some of the spatial aspects can be taken into account by neuron models. In particular we focus on the properties of the synaptic contact points between neurons and on the electrical function of dendrites.
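One standard way to take spatial structure into account is to discretize a passive dendrite into a chain of coupled RC compartments. The following sketch, with invented parameter values, shows the characteristic result: a voltage injected at one end attenuates roughly exponentially with distance, as predicted by the continuous cable equation.

```python
import numpy as np

# Passive dendrite as a chain of N coupled RC compartments.
# All parameter values are illustrative assumptions.
N = 50            # number of compartments
tau = 10.0        # membrane time constant of each compartment (ms)
g_c = 5.0         # coupling strength between neighbours (dimensionless)
dt = 0.01         # integration time step (ms)
v = np.zeros(N)   # voltage deviation from rest in each compartment

for _ in range(int(500.0 / dt)):       # integrate to steady state
    coupling = np.zeros(N)
    coupling[:-1] += g_c * (v[1:] - v[:-1])   # current from right neighbour
    coupling[1:] += g_c * (v[:-1] - v[1:])    # current from left neighbour
    dv = (-v + coupling) / tau
    dv[0] += 1.0 / tau                 # constant current injected at one end
    v += dt * dv

# v now holds the steady-state profile: largest at the injection site,
# decaying with distance along the chain.
```

The decay length of the steady-state profile plays the role of the electrotonic length constant: the stronger the coupling g_c relative to the leak, the farther a synaptic signal spreads before it attenuates.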
The primary aim of this chapter is to introduce several elementary notions of neuroscience, in particular the concepts of action potentials, postsynaptic potentials, firing thresholds, refractoriness, and adaptation. Based on these notions, a preliminary model of neuronal dynamics is built. This simple model, the leaky integrate-and-fire model, will serve as a starting point and reference for the generalized integrate-and-fire models discussed in Parts II and III, which are the main topic of the book. Since the mathematics used for the simple model is essentially that of a one-dimensional linear differential equation, we take this first chapter as an opportunity to introduce some of the mathematical notation that will be used throughout the rest of the book.
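Since the leaky integrate-and-fire model is just a one-dimensional linear differential equation with a threshold and reset, it can be sketched in a few lines of Python. The parameter values below are illustrative assumptions, not values from the text.

```python
import numpy as np

# Leaky integrate-and-fire neuron, Euler integration.
# Parameter values are illustrative assumptions.
tau_m = 10.0      # membrane time constant (ms)
v_rest = -65.0    # resting potential (mV)
v_reset = -70.0   # reset potential after a spike (mV)
v_th = -50.0      # firing threshold (mV)
R = 10.0          # membrane resistance (MOhm)
dt = 0.1          # time step (ms)

def simulate_lif(i_ext, t_max=200.0):
    """Integrate dv/dt = (-(v - v_rest) + R * i_ext) / tau_m."""
    v = v_rest
    spike_times = []
    for k in range(int(t_max / dt)):
        v += dt * (-(v - v_rest) + R * i_ext) / tau_m
        if v >= v_th:                  # threshold crossing: emit a spike
            spike_times.append(k * dt)
            v = v_reset                # and reset the membrane potential
    return spike_times

# A constant suprathreshold current (here 2 nA) makes the neuron fire
# regularly; subthreshold currents produce no spikes at all.
spikes = simulate_lif(i_ext=2.0)
```

The threshold-and-reset rule is what replaces the detailed spike-generation mechanism of conductance-based models: between spikes the dynamics are purely linear, which is why so much of the model can be treated analytically.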
Owing to the limitations of space, we cannot – and do not want to – give a comprehensive introduction to such a complex field as neurobiology. The presentation of the biological background in this chapter is therefore highly selective and focuses on those aspects needed to appreciate the biological background of the theoretical work presented in this book. For an in-depth discussion of neurobiology we refer the reader to the literature mentioned at the end of this chapter.
After the review of neuronal properties in Sections 1.1 and 1.2 we will turn, in Section 1.3, to our first mathematical neuron model. The last two sections are devoted to a discussion of the strengths and limitations of simplified models.
It is helpful to break neural data analysis into two basic problems. The “encoding” problem concerns how information is encoded in neural spike trains: can we predict the spike trains of a neuron (or population of neurons), given an arbitrary synaptic input, current injection, or sensory stimulus? Conversely, the “decoding” problem concerns how much we can learn from the observation of a sequence of spikes: in particular, how well can we estimate the stimulus that gave rise to the spike train?
The problems of encoding and decoding are difficult both because neural responses are stochastic and because we want to identify these response properties given any possible stimulus in some very large set (e.g., all images that might occur in the world), and there are typically many more such stimuli than we can hope to sample by brute force. Thus the neural coding problem is fundamentally statistical: given a finite number of samples of noisy physiological data, how do we estimate, in a global sense, the neural codebook?
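A toy example makes the statistical framing concrete: suppose a neuron's spike count is Poisson-distributed around a stimulus-dependent mean rate (encoding), and a maximum-likelihood decoder tries to recover the stimulus from a single noisy count (decoding). The two rates and the stimulus set are invented for illustration.

```python
import numpy as np

# Encoding: spike counts are Poisson with a stimulus-dependent mean.
# Decoding: maximum likelihood over the (here, two) possible stimuli.
# Rates and stimulus labels are illustrative assumptions.
rng = np.random.default_rng(0)
rates = {0: 5.0, 1: 20.0}   # mean spike count for stimulus 0 and 1

def decode(count):
    """Return the stimulus whose Poisson likelihood of `count` is highest."""
    def log_lik(lam):
        return count * np.log(lam) - lam   # Poisson log-likelihood, up to count!
    return max(rates, key=lambda s: log_lik(rates[s]))

stimuli = rng.integers(0, 2, size=1000)              # random stimulus sequence
counts = rng.poisson([rates[s] for s in stimuli])    # noisy neural responses
decoded = np.array([decode(c) for c in counts])
accuracy = np.mean(decoded == stimuli)
```

Even this two-stimulus caricature shows why the problem is statistical: decoding errors arise purely from response stochasticity, and estimating the rates themselves from finite data is the (here trivial, in general hard) model-fitting step.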
This basic question has taken on a new urgency as neurophysiological recordings allow us to peer into the brain with ever greater facility: with the development of fast computers, inexpensive memory, and large-scale multineuronal recording and high-resolution imaging techniques, it has become feasible to directly observe and analyze neural activity at a level of detail that was impossible in the twentieth century.
In this final chapter, we combine the dynamics of single neurons (Parts I and II) and networks (Part III) with synaptic plasticity (Chapter 19) and illustrate their interaction in a few applications.
In Section 20.1 on “reservoir computing” we show that the network dynamics in random networks of excitatory and inhibitory neurons is sufficiently rich to serve as a computing device that buffers past inputs and computes on present ones. In Section 20.2 we study oscillations that arise in networks of spiking neurons and outline how synaptic plasticity interacts with oscillations. Finally, in Section 20.3, we illustrate why the study of neuronal dynamics is not just an intellectual exercise, but might, one day, become useful for applications or, eventually, benefit human patients.
Reservoir computing
One of the reasons the dynamics of neuronal networks are rich is that networks have a nontrivial connectivity structure linking different neuron types in an intricate interaction pattern. Moreover, network dynamics are rich because they span many time scales. The fastest time scale is set by the duration of an action potential, i.e., a few milliseconds. Synaptic facilitation and depression (Chapter 3) or adaptation (Chapter 6) occur on time scales from a few hundred milliseconds to seconds. Finally, long-lasting changes of synapses can be induced in a few seconds, but last from hours to days (Chapter 19).
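The idea that a random recurrent network buffers past inputs can be illustrated with a small rate-based echo state network, a common formalization of reservoir computing: the recurrent weights stay fixed and random, and only a linear readout is trained, here to recall the input from a few time steps earlier. Network size, scaling factors, and the delay are illustrative assumptions.

```python
import numpy as np

# Echo state network sketch: fixed random reservoir, trained linear readout.
# Sizes, scalings, and the recall delay are illustrative assumptions.
rng = np.random.default_rng(1)
N, T, delay = 200, 2000, 3

W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
w_in = rng.normal(0.0, 0.5, N)                    # input weights

u = rng.uniform(-1, 1, T)                 # random input sequence
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])      # reservoir update
    states[t] = x

# Ridge-regression readout: predict the input from `delay` steps ago
# using only the current reservoir state.
X, y = states[delay:], u[:-delay]
w_out = np.linalg.solve(X.T @ X + 1e-4 * np.eye(N), X.T @ y)
pred = X @ w_out
r2 = 1 - np.sum((pred - y) ** 2) / np.sum((y - y.mean()) ** 2)
```

The point of the construction is that the reservoir itself is never trained: the fading memory of the recurrent dynamics keeps past inputs linearly recoverable, so a simple readout suffices. (The fit here is evaluated in-sample, as befits a sketch.)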
In the previous chapter it was shown that an approach based on membrane potential densities can be used to analyze the dynamics of networks of integrate-and-fire neurons. For neuron models that include biophysical phenomena such as refractoriness and adaptation on multiple time scales, however, the resulting system of partial differential equations lives in more than two dimensions and is therefore difficult to solve analytically; even the numerical integration of partial differential equations in high dimensions is slow. To cope with these difficulties, we now introduce an alternative approach to describing the population activity in networks of model neurons. The central concept is an integral equation for the population activity.
The advantage of the integral equation approach is four-fold. First, the approach works for a broad spectrum of neuron models, such as the Spike Response Model with escape noise and other Generalized Linear Models (see Chapter 9) for which parameters can be directly extracted from experiments (see Chapter 11). Second, it is easy to assign an intuitive interpretation to the quantities that appear in the integral equation. For example, the interspike interval distribution plays a central role. Third, an approximate mathematical treatment of adaptation is possible not only for the stationary population activity, but also for the case of arbitrary time-dependent solutions. Fourth, the integral equations provide a natural basis for the transition to classical "rate equations," which will be discussed in Chapter 15.
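The interspike interval distribution mentioned above has a transparent numerical form for escape-noise models: given a hazard (escape rate) rho(t) as a function of the time since the last spike, the ISI density is the hazard times the survivor function. The particular hazard chosen below, a refractory ramp toward a constant rate, is an illustrative assumption.

```python
import numpy as np

# ISI density from an escape-rate (hazard) function:
#   P(t) = rho(t) * exp(-integral_0^t rho(s) ds).
# The hazard shape and its parameters are illustrative assumptions.
dt = 0.1
t = np.arange(0, 200, dt)                 # time since last spike (ms)
rho = 0.05 * (1 - np.exp(-t / 20.0))      # hazard suppressed just after a spike

survivor = np.exp(-np.cumsum(rho) * dt)   # probability of no spike up to t
isi_density = rho * survivor              # interspike interval density

norm = np.sum(isi_density) * dt           # should be close to 1
mean_isi = np.sum(t * isi_density) * dt   # mean interspike interval (ms)
```

This is exactly the quantity with the "intuitive interpretation" referred to above: the survivor function says whether the neuron has remained silent, and the hazard converts the momentary escape rate into a normalized distribution of interval lengths.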