In what follows we shall review the hormonal control of water and sodium appetite. The greatest emphasis will be on sodium ingestion, for it reflects my own research interests. The first major section is on angiotensin-induced water and sodium appetite, with discussion of the sites of action. The second major section discusses corticosteroid-induced sodium appetite and the sites of action. The third major section deals with stress-induced (or corticotropin-releasing-hormone- and corticosterone-induced) sodium intake and the sites of action. The fourth major section concerns angiotensin- and corticosteroid-induced water and sodium appetite and the sites of action. The fifth section deals with atrial natriuretic peptide and inhibition of water and sodium ingestion. The final section depicts an anatomic circuit that may underlie the cravings for water and sodium.
Sodium Hunger
Sodium appetite, in addition to thirst, provides the backbone of extracellular fluid regulation at a behavioral level of analysis. It has been known since the turn of the century that loss of extracellular fluid can generate thirst; it occurs when one bleeds or sweats under natural circumstances (Fitzsimons, 1979; Denton, McKinley, and Weisinger, 1996). Later it was discovered that sodium hunger is also expressed under these conditions (Wolf, 1969b; Denton, 1982). The hormones that regulate extracellular fluid balance have been preserved across evolution in terrestrial vertebrates (Denton, 1965). Sodium ingestion is linked to extracellular fluid balance. Thus the hunger for sodium provides a window into extracellular fluid regulation (Denton, 1982).
A set of core concepts has organized the contents of this book, providing a perspective on the science of hormones, brain, and behavior. One of these is the notion that underlying many behaviors carried out by animals, including ourselves, are central motive states that are influenced by hormones acting on the brain (Lashley, 1938; Beach, 1942; Stellar, 1954). That is, hormones induce and sustain central states that prepare an animal to perceive the world in characteristic ways and then act accordingly.
Recall the Introduction: Hormonally facilitated central motive states can be divided into two phases. The first is the appetitive phase – searching for what is desired (e.g., sodium) and avoiding what is aversive (e.g., a predator). The second is the consummatory phase, in which the animal satisfies its desire (e.g., by ingesting sodium or by feeling safe). The organization of the behaviors is reflected in the functional roles that hormones play in the brain to generate behaviors that will maintain the internal milieu and respond to problems in the external world. Steroid and peptide hormones activate and sustain central motive states.
Herbert (1993) has produced a beautiful monograph depicting different contexts in which steroids and peptides interact to regulate behavior (see also Hoebel, 1988). Most of these have been discussed in this book. They range from ingesting food, water, and sodium to maternal behavior, fear, and aggression. Being economical, nature uses the same hormones to generate a variety of different central states.
One of evolution's most successful adaptations is the self-generated rhythmicity (endogenous rhythmicity) with which organisms adjust to periodic influences in their environments, including diurnal, lunar, seasonal, and tidal cycles. By measuring the passage of time, endogenous biological clocks regulate both behavior and physiology, with one complementing the other. There are several kinds of biological clocks, and their anatomic locations and functions vary across species.
In most species, a behavior alternates between an active phase and a restful phase. Recall that during active periods the concentrations of hormones such as cortisol are elevated. The active phase reflects energy use, and the rest phase energy conservation. However, hormones such as melatonin, which is secreted at night, can be elevated during the active phase or the rest phase, depending upon the presence of light/darkness. In rodents, two oscillators have been hypothesized: One oscillator is synchronized to dusk, and the other to dawn, and the two are entrained to one another (e.g., Pittendrigh and Daan, 1976).
In most animal species, sleep is regulated both by circadian (daily) mechanisms and by mechanisms that maintain homeostasis. One mechanism monitors time, while the other monitors the need for sleep. In some cases the circadian clock may facilitate wakefulness against the opposing homeostatic need to remain asleep (Edgar et al., 1992). Sleep deprivation acts on the homeostatic mechanisms for sleep; within limits, the more sleep an animal loses, the more it will need to make up.
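The interaction of a time-keeping mechanism with a homeostatic one can be sketched as a toy simulation in the spirit of the classic two-process account of sleep regulation. All parameter values below are illustrative assumptions, not fitted data; the function names are chosen for this sketch only.

```python
import math

# Toy two-process sketch of sleep regulation (illustrative parameters only).
# Process S: homeostatic sleep pressure, rising during wake, decaying in sleep.
# Process C: circadian drive for wakefulness, idealized here as a cosine.

TAU_RISE, TAU_FALL = 18.0, 4.0   # time constants in hours (assumed values)

def step(S, awake, dt=1.0):
    """Advance sleep pressure S by dt hours."""
    if awake:
        return 1 - (1 - S) * math.exp(-dt / TAU_RISE)  # saturating rise toward 1
    return S * math.exp(-dt / TAU_FALL)                # exponential decay in sleep

def circadian(t):
    """Circadian wake drive (t in hours), peaking mid-afternoon by assumption."""
    return 0.5 + 0.5 * math.cos(2 * math.pi * (t - 14) / 24)

# Sleep deprivation acts on the homeostatic term: the longer wakefulness is
# enforced, the higher S climbs, and the more recovery sleep is needed
# before S decays back to baseline.
S = 0.2
for _ in range(24):              # 24 hours of enforced wakefulness
    S = step(S, awake=True, dt=1.0)
print(round(S, 2))               # elevated sleep pressure after deprivation
```

The opposition described by Edgar et al. (1992) corresponds here to comparing the circadian drive against the accumulated pressure S at a given hour; the model is a caricature, but it makes the two-mechanism logic explicit.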
A schizophrenic man is convinced that the CIA performed cardiac surgery on him in the army and gave him a lizard's heart; now his blood pumps through only three chambers, which means any medication he takes is going to be very dangerous for him.
A lonely erotomanic woman tells an elaborate story of how the California politician Jerry Brown secretly fell in love with her when she wrote him a fan letter 20 years ago; Brown still signals his passion to her whenever he is photographed in profile.
Most of us would agree that these are examples of delusional thinking, one of ‘the main psychiatric phenomena’ encountered by clinicians in everyday practice (Brockington, 1991). In keeping with Jaspers' classic description of half a century ago (1946), these examples show the hallmark characteristics of subjective certainty, incorrigibility, and impossible (or improbable) content. Until now, clinical psychiatry has been preoccupied with only one of these features, namely the content – and thus we focus on whether a patient's delusion is bizarre, on whether it is well systematized, on whether it is paranoid or somatic or grandiose. Yet any experienced clinician also knows two other perplexing things about delusions. The first is that ‘however sharply they may be defined logically, they are not sharply demarcated from normal thinking’ (Brockington, 1991, p. 42). The second is that delusional thinking is more than the sum of its contents.
The recent shift in psychiatry from a predominantly psychodynamic model towards a neurobiological paradigm has led to important advances in our understanding and management of many mental disorders. At the same time, this shift has been characterized as a move from a brainless psychiatry to a mindless one (Lipowski, 1989). Certainly, the continued existence of different psychiatric schools with widely divergent approaches to psychopathology and its treatment suggests that psychiatry continues to lack an adequate theoretical underpinning.
During the same time that psychiatry has undergone a paradigm shift, academic psychology has also experienced a revolution – the so-called cognitive revolution against behaviorism (Gardner, 1985). Cognitive science, a multidisciplinary arena encompassing cognitive psychology, artificial intelligence, neuroscience, linguistics, anthropology, and philosophy, and based on computational models of the mind, is now a predominant approach. Not surprisingly, clinicians have asked whether the constructs and methods of cognitive science are also applicable to psychopathology.
Indeed, a promising dialogue between clinical and cognitive science has emerged (Stein and Young, 1992). Both cognitive-behavioral therapists and psychodynamic researchers have increasingly drawn on cognitivist work in their theoretical and empirical studies of psychopathology and psychotherapy. Schema theory, for example, has been applied to a range of clinical phenomena (Stein, 1992). Such cognitivist work is often immediately attractive to the clinician insofar as it incorporates a range of theoretical disciplines and insofar as it is based on hard empirical studies. One of the most important developments in modern cognitive science has been connectionism, the field concerned with neural network models (Rumelhart et al., 1986a).
For more than 300 years, delusions have been defined as ‘pathological beliefs’ (Berrios, 1996; Spitzer, 1990; see Table 7.1 for clinical examples). During the first half of the nineteenth century, Baillarger reinforced this view by suggesting that ‘form’ be distinguished from ‘content’ (Berrios, 1994). Analysis of content (i.e. of the semantics of ‘belief’) has since then generated clinical subtypes (e.g. depressive versus schizophrenic delusions) (Sérieux and Capgras, 1909; Jaspers, 1963; Moor and Tucker, 1979; Sims, 1988), supported psychoanalytical interest in symbols, and (more recently) encouraged correlational research, for example with biographical data, particularly amongst those interested in attribution theory (Bentall, 1994). The ‘pathological belief’ view has, in general, been less useful to the neurobiological study of delusions (Berrios, 1991; Fuentenebro and Berrios, 1995). Analysis of the form of delusions has also been useful in some cases. For example, it has led to stable diagnostic categories (Schneider, 1959; Jaspers, 1963), and to multidimensional approaches, whose main consequence has been the erosion of the old, categorical, ‘all-or-none’ view. Whether from the perspective of content or of form, most research has been cross-sectional and hence uninformative on how delusions actually change with time. Notable exceptions to this approach have been the work of Kendler, Glazer and Morgenstern (1983) and Garety and Hemsley (1994).
The multidimensional approach is not free from problems. For example, some dimensions of delusions such as ‘bizarreness’ remain ambiguous (Monti and Stanghellini, 1993). Borrowed from Kurt Schneider (Goldman et al., 1992), the term originally meant ‘absurd’, ‘impossible’ or ‘contrary to common knowledge’ (e.g. as in DSM-III-R: APA, 1987).
Psychiatric diagnosis has been conceptualized as either a ‘one-off’ (‘recognition’) type of cognitive act or as a ‘recursive (constructional) process’. History teaches us that scientists choose their models not on the basis of some ‘crucial empirical test’ (such tests do not exist at this level of abstraction) but on the more humdrum (but rarely owned up to) dictate of fashion. For example, during the eighteenth century, when the so-called ‘ontological’ notions of disease (then based on the more botanico tradition) (Berg, 1956; López-Piñero, 1983) reigned supreme, there was little problem in accepting the view that the diagnosis (recognition) of disease happened at one fell (cognitive) swoop. This was because the Platonic (ontological) assumption lurking behind such a notion amply justified the belief that disease was ‘fully bounded and out there’ and, furthermore, that inklings of its existence had been planted at birth (like everything else) in the mind of the diagnostician. The a priori privileging of some features of a disease (the successful strategy that Linné had already tried on plants), and the view that such features actually ‘signified’ the disease, was just one version of the ontological approach. Indeed, a century earlier, a similar view had governed the study of linguistics (Aarsleff, 1982). That it was fashion and Zeitgeist that sustained the popularity of the ontological view is illustrated by the fact that a rival approach put forward at the time by Adanson was given short shrift (Stevens, 1984).
The use of neural networks for the study of psychopathological phenomena is not new, but rather has a rich historical tradition. One striking feature of this tradition is that from the very inception of the idea of the neuron, psychiatrists have used the notion of networks and their pathology to account for psychopathological phenomena. Moreover, many advances in neural network research were either made by psychiatrists or were put forward in relation to psychopathology. In other words, neural network studies of psychopathological phenomena are by no means a ‘late add on’ to the mainstream of neural network research, but have always been at the heart of the matter. Neural networks were drawn by Freud and Exner to explain disorders of cognition, affect and consciousness. Carl Wernicke (1906) coined the term ‘Sejunktion’ to denote the functional decoupling of cortical areas, which he suggested was the cause of psychotic symptoms such as hallucinations and delusions. Emil Kraepelin (1892) and his group of experimentally oriented psychiatrists reasoned about associative networks and psychosis. Not least, Eugen Bleuler (1911/1950) – inspired by the experimental work carried out by his resident Carl-Gustav Jung (1906/1979) – saw the disruption of these networks as the hallmark of schizophrenia.
Various developments in many fields have contributed to the present surge of neural network research. The historical material discussed in this chapter is organized by time and by topic.
Present-day connectionism takes it for granted that there are neurons, summing up action potentials coming via the connections among them, the synapses. These fundamental ideas, however, have their history, and are the starting point of this chapter.
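The basic picture taken for granted here, a unit summing the inputs arriving over its synaptic connections and firing when a threshold is crossed, can be stated in a few lines. This McCulloch–Pitts-style unit is a textbook idealization, and the weights and threshold below are arbitrary illustrative values.

```python
def neuron(inputs, weights, threshold):
    """A McCulloch–Pitts-style unit: fire (1) if the weighted sum of
    synaptic inputs reaches the threshold, stay silent (0) otherwise."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Two excitatory synapses and one inhibitory synapse (illustrative weights).
weights = [0.6, 0.6, -1.0]

print(neuron([1, 1, 0], weights, threshold=1.0))  # both excitatory inputs active -> 1
print(neuron([1, 1, 1], weights, threshold=1.0))  # inhibitory input vetoes firing -> 0
```

Everything in present-day connectionism, from associative memory to the network pathologies discussed in this chapter, is built from variations on this summation-and-threshold idea.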
According to Aristotle, ‘to be learning something is the greatest of pleasures not only to the philosopher but also to the rest of mankind.’ But even as he affirms the unbounded human capacity for integrating new experience with existing knowledge, he alludes to a significant exception: ‘The sight of certain things gives us pain, but we enjoy looking at the most exact images of them, whether the forms of animals which we greatly despise or of corpses.’ Our capacity for learning is happily engaged in viewing representations of painful objects, but not, it seems, in viewing the objects themselves. When an experience is intensely painful, what then is a rational animal to do? We can neither disable our learning process, nor erase its traces. In the face of intense pain, horror, or terror, learning and remembrance cause no pleasure but rather persistent psychological pain and disruption. The memorious mind reverberates with trauma.
The traumatized mind responds in diverse ways to the recurrent crises of reminiscence, responses which lead at the extreme to the symptoms of various disorders. These reactions fall into two broad categories: the associative and the dissociative. The first is exemplified by some (but not all) of the symptoms of post-traumatic stress disorder, in cases in which even a trivial element associated with the painful event becomes an evocative cue for reliving the experience. In contrast, dissociation is characterized by subjective distancing from the initial pain and its remembrance, often with secondary effects. In dissociative amnesia, for example, subjects fail to recall critical spans of their lives, often seeming to obliterate the traumatic memory.
Goldilocks … dipped a spoon into Father Bear's bowl, but the porridge in it was too hot… Then she tried some from Mother Bear's bowl, but that was too cold. The porridge in Baby Bear's bowl was just right…
Robert Southey, Goldilocks and the Three Bears
Even though autism is a relatively infrequent disorder, occurring in about 1.5 to 2 cases/1000 in the population (Sugiyama and Abe, 1989), it has attracted the attention of many researchers since the time of Kanner's (1943) initial description of the syndrome. This curiosity reflects, in part, the fact that the bizarre and puzzling behaviors shown by individuals with autism present a challenge to theorists. More urgently, the fact that age-appropriate learning and social–communicative behavior is not present in these children has a devastating impact on their families and on the children's later social, emotional and cognitive development. Understanding the biological mechanisms responsible for autism may help to shed light on the best way to treat this syndrome.
Autism has an age of onset that is, typically, between 12 and 30 months of age, although some mothers report noticing abnormalities in their child's behavior from birth. The first behavioral disturbances noted include lack of response to the child's name being called, acting as if deaf, despite other evidence of apparent normal hearing (e.g., dashing to the kitchen from another room when a candy bar is unwrapped); failure to anticipate being picked up; failure to cuddle when held; poor eye contact and lack of interest in social interaction; failure to use normal gestures such as pointing to communicate (instead pulling others to desired objects);
This volume of essays on neural networks and psychopathology is aimed at an unusually diverse audience. On the one hand, we hope that the volume will be read by psychiatrists, psychologists, and other clinicians and researchers interested in psychopathology and its treatment. On the other hand, we hope that it will be read by those who work in the fields of cognitive science and artificial intelligence, and particularly those interested in neural network or connectionist models.
We believe that it is timely for clinicians and computational modellers to be in closer contact. While recent decades have seen dramatic advances in pharmacological and psychological treatments of psychiatric disorders, clinical science often lacks an adequate theoretical framework for integrating neurobiological and psychological data. Conversely, while neural networks have been tremendously successful in modelling a range of important psychological phenomena and in analysing data from a wide range of other sciences, less work has focused on connectionist models of psychopathology.
Neural network models of psychopathology have immediate theoretical and empirical appeal. They are theoretically interesting because they seem to incorporate neurobiological and psychological data in a seamless model of the way in which representational processes emerge from assemblies of neuron-like processing elements. They are empirically useful because they have allowed rigorous and elegant simulations of such uniquely human phenomena as pattern recognition, categorization, and learning; simulations that have in turn led to new insights into the phenomena under study.
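As a minimal illustration of the kind of learning such simulations rest on, a single-layer perceptron can acquire a simple two-category discrimination from examples. The task (logical AND), learning rate, and epoch count below are illustrative assumptions, not drawn from any particular study.

```python
# A minimal perceptron learning a two-category discrimination.

def predict(w, b, x):
    """Categorize input x with weights w and bias b (1 or 0)."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def train(data, epochs=20, lr=0.1):
    """Perceptron learning rule: nudge weights toward correct categorization."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            error = target - predict(w, b, x)
            w[0] += lr * error * x[0]
            w[1] += lr * error * x[1]
            b += lr * error
    return w, b

# Training examples: the category is logical AND of the two inputs.
data = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0), ((1, 1), 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])  # -> [0, 0, 0, 1]
```

The point of the sketch is only that categorization here is not programmed in but emerges from repeated weight adjustments, which is the property that makes such models attractive for thinking about acquired and disordered cognition.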
In recent years there has been a dramatic revolution in our conceptualization of obsessive–compulsive disorder (OCD). OCD has long been considered a prototypical psychogenic condition, one that allowed an important window onto the workings of the unconscious mind. The disorder was thought to be relatively uncommon and refractory to treatment. In the last decade or so, however, advances in the neurobiology of OCD have led to a view that this disorder is best understood as one of the neuropsychiatric disorders, with specific brain dysfunction underlying complex behavioural symptoms. Furthermore, OCD is now recognized to be one of the most common psychiatric disorders (Karno et al., 1984; Weissman et al., 1994), and the introduction of novel pharmacotherapeutic and psychotherapeutic interventions has significantly improved its outcome (Baer and Minichiello, 1990; Jenike, 1992).
One of the most interesting aspects of current research on OCD is the new perspective that is being brought to questions about brain–behaviour relationships. Clearly, patients with OCD suffer from psychological symptoms, with anxiety-provoking intrusive thoughts (obsessions) leading to repetitive and ritualistic responses (compulsions). Functional imaging studies, however, demonstrate that these symptoms are mediated by specific dysfunctional brain circuits. Of significant interest is that both medication and psychotherapy lead to normalization of these circuits. Thus, while OCD may involve brain dysfunction, a comprehensive understanding of the condition also requires attention to brain-based emergent psychological structures and processes (Stein and Hollander, 1992).
In order to think about and further study this kind of integration of biological and behavioural data, clinicians and researchers may find it useful to draw on the theoretical constructs and empirical methods of cognitive science.
Psychopharmacological models have been developed from the two traditions now known as artificial neural networks and computational neuroscience. Artificial neural networks are based on primitive computing elements that are arranged to provide a brain-like architecture for information processing that contrasts with symbolic accounts of mental function. Computational neuroscience developed from mathematical models of phenomena at the level of the single neuron. Psychopharmacological models lie on a spectrum between these two approaches, both of which have potential weaknesses. Artificial neural network models may include too many simplifying assumptions to reflect pharmacological effects accurately. Conversely, a model that incorporates too much cellular detail will be too complex to be useful in providing an explanation of network behaviour. This is reflected in the functions of these two types of model. Detailed models generally aim to replicate the causal mechanisms of a network and seek explanatory status through simplification. Artificial neural networks are used in a more limited fashion as hypothesis-generating tools. Available computing power leads to a trade-off between the size of a network and the amount of detail included. However, increasing power is leading to a convergence in the modelling process. The simplifications involved in model abstraction can be increasingly assessed against the behaviour of networks of much more detailed and biologically realistic neurons.
Psychopharmacology lacks a theoretical framework relating events at the level of the neuron to those at higher levels of central nervous system organization. Despite a wealth of detail on the cellular and behavioural effects of psychotropic drugs, the relation between the two remains obscure.
This volume has provided many examples of how connectionist models may allow clinicians to replace vague and nonquantitative approaches to psychopathology with a more sophisticated and quantitative paradigm. Nevertheless, several challenges remain for clinicians and researchers interested in consolidating the intersection between connectionism and psychiatry. In this closing contribution, a number of these challenges are discussed.
The challenge of education
The first challenge for neural network modelers is to bring their work into the mainstream of general psychiatry. It may be argued that neural networks look more mathematical than they are on a practical level. Nevertheless, neural networks may involve more mathematics than many psychiatrists are willing to countenance. At least some preparation is required for comprehension.
However, there is reason to be optimistic that the challenge of education will be met. When the author first presented a grand rounds on neural networks in 1990, he found few psychiatric residents had any computer preparation. Since then, the wave of children who have grown up with computers has reached our residencies, and most are now willing to consider these logicomathematical structures. The author is confident that in the future a working knowledge of neurocomputation will be pushed earlier and earlier in general education, partly because its applications will be everywhere.
We can help by translating neural network models into verbal structures and by beginning to speak during our clinical rounds in the metaphors of neurocomputing. As discussed below, these metaphors are most compatible with a dynamic psychiatry that is cognitively informed. And as their currency grows, computers offer metaphors for many processes of thought.