By
Hugh T. Blair, Department of Psychology University of California 1285 Franz Hall Box 951563 Los Angeles, CA 90095-1563,
Karim Nader, Department of Psychology McGill University Canada Stewart Biological Sciences Building Room N8/8, 398-3511 1205 Dr Penfield Avenue Montreal, Quebec, H3A 1B1,
Glenn E. Schafe, Department of Psychology and Interdisciplinary Neuroscience Program Yale University 2 Hillhouse Avenue New Haven, Connecticut 06511-6814,
Elizabeth P. Bauer, W. M. Keck Foundation Laboratory of Neurobiology Center for Neural Science 6 Washington Place, Room 276 New York University New York, NY 10003,
Sarina M. Rodrigues, W. M. Keck Foundation Laboratory of Neurobiology Center for Neural Science New York University New York, New York 10003,
Joseph E. LeDoux, University Professor; Professor of Neural Science and Psychology Center for Neural Science New York University 4 Washington Place, Room 809 New York, NY 10003
Classical fear conditioning is a form of associative learning in which subjects are trained to express fear responses to a neutral conditioned stimulus (CS) that is paired with an aversive unconditioned stimulus (US). As a result of such pairing, the CS comes to elicit behavioral, autonomic, and endocrine responses that are characteristically expressed in the presence of danger (Blanchard & Blanchard, 1969; Bolles & Fanselow, 1980; Smith et al., 1980). Fear conditioning has emerged as an especially useful behavioral model for investigating the neurobiological mechanisms of learning and memory, because fear memories are rapidly acquired and long-lasting, involve well-defined stimuli and responses, and depend upon similar neural circuits in different vertebrate species (see Davis & Lee, 1998; LeDoux, 2000; Maren, 1999; Rogan et al., 2001).
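The associative logic of CS-US pairing can be illustrated with the classic Rescorla-Wagner learning rule, in which a prediction error drives changes in associative strength. This is a minimal sketch for illustration only; the learning rate, asymptote, and trial structure are assumed values, not parameters from the studies reviewed here.

```python
# Toy Rescorla-Wagner model of CS-US association.
# alpha (learning rate), lam (asymptote), and the trial counts are
# illustrative assumptions, not values from the chapter.

def rescorla_wagner(trials, alpha=0.3, lam=1.0, v0=0.0):
    """Update associative strength V after each trial.

    trials: list of booleans, True if the US follows the CS on that trial.
    alpha:  learning rate (combined salience of CS and US).
    lam:    asymptote of learning the US can support.
    v0:     associative strength at the start of training.
    """
    v = v0
    history = []
    for us_present in trials:
        target = lam if us_present else 0.0
        v += alpha * (target - v)  # delta rule: prediction error drives learning
        history.append(v)
    return history

# Ten paired trials: associative strength climbs toward the asymptote.
acquisition = rescorla_wagner([True] * 10)
# Ten CS-alone trials afterwards: extinction back toward zero.
extinction = rescorla_wagner([False] * 10, v0=acquisition[-1])
```

Under these assumptions, associative strength rises rapidly over paired trials and decays when the CS is presented alone, mirroring the acquisition and extinction of conditioned fear responses.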
In this chapter, we review studies that have investigated the role of the amygdala in fear learning. We argue that neural plasticity in the lateral amygdala is critical for storing memories of the association between the CS and US during fear conditioning, and discuss how learning and memory are achieved at the cellular or molecular level. Alternative views of amygdala contributions to fear conditioning are also considered.
The amygdala and fear conditioning
Fear learning depends critically upon the amygdala (Davis & Shi, 2000; Fendt & Fanselow, 1999; LeDoux, 1996, 2000), a cluster of nuclei in the brain's temporal lobe that plays a key role in regulating emotions (Kluver & Bucy, 1939; LeDoux, 1996).
By
Jacques Mehler, Director Language, Cognition and Development Lab International School of Advanced Studies SISSA/ISAS CNS (ORO, rm 13) Via Beirut 4 34014 Trieste, Italy,
Marina Nespor, University of Milan Bicocca Psychology Department Edificio U6 Piazza dell'Ateneo Nuovo 1-20126 Milano,
Marcela Peña, Cognitive Neuroscience Sector SISSA/ISAS Via Beirut 4 34014 Trieste, Italy
Linguists, psychologists, and neuroscientists have studied language acquisition with the tools and models available to their respective fields. Linguists elaborated some of the most sophisticated theories to account for how this unique human competence arises in the infants' brains. Chomsky (1980) formulated the parameter setting theory (hereafter, PS) to account for how infants, on the basis of partial and noisy language input, acquire grammar. PS assumes that infants are born with “knowledge” of Universal Grammar (UG). This includes both genetically determined universal principles and binary parameters. Universal principles describe the properties common to all natural languages. Binary parameters capture the grammatical properties on which natural languages differ from one another. The linguistic input determines the particular value of a parameter. PS postulates that exposure to the surrounding language determines how the parameters of UG are set.
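One can picture the parameter-setting idea as a set of binary switches that triggering input flips. The sketch below is a toy illustration only; the parameter names and trigger patterns are invented for exposition and do not correspond to the theory's actual inventory of principles and parameters.

```python
# Toy illustration of parameter setting (PS): binary grammar parameters,
# initially unset, are fixed by triggering evidence in the input.
# Parameter names and trigger patterns are hypothetical.

TRIGGERS = {
    # observed input pattern -> (parameter, value)
    "verb before object": ("head_initial", True),
    "object before verb": ("head_initial", False),
    "subject omitted":    ("pro_drop", True),
}

def set_parameters(input_patterns):
    """Return the parameter values fixed by the observed input."""
    grammar = {}
    for pattern in input_patterns:
        if pattern in TRIGGERS:
            param, value = TRIGGERS[pattern]
            grammar.setdefault(param, value)  # first trigger wins in this toy
    return grammar

# English-like input sets the switches one way...
english_like = set_parameters(["verb before object"])
# ...Japanese-like input sets them the other way.
japanese_like = set_parameters(["object before verb", "subject omitted"])
```

The point of the sketch is structural: the learner brings the switches; the environment only supplies the settings, which is how PS reconciles a rich innate endowment with cross-linguistic variation.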
We acknowledge that PS has many virtues. It addresses the problem of language acquisition without making unjustified but common simplifications, for example, that imitation is the privileged mechanism responsible for the emergence of linguistic competence. The theory, furthermore, is quite appealing because it assumes, realistically, a biological perspective, namely, that the child is equipped with a species-specific mechanism to acquire natural language. Moreover, the PS theory has been formulated with sufficient detail and precision as to make it easy to falsify. In contrast, proposals that assume that language is acquired by means of a general learning device appear more difficult to support.
By
Lisa D. Sanders, Department of Psychology University of Massachusetts at Amherst Tobin Hall, 135 Hicks Way Amherst, MA 01003,
Christine M. Weber-Fox, Speech, Language, and Hearing Sciences Purdue University West Lafayette, IN 47907,
Helen J. Neville, Director Brain Development Lab; Professor Psychology and Neuroscience University of Oregon Eugene, Oregon 97403-1227
There are periods in development during which experience plays its largest role in shaping the eventual structure and function of mature language-processing systems. These spans of peak cortical plasticity have been called “sensitive periods.” Here, we describe a series of studies investigating the effects of delays in second language (L2) acquisition on different subsystems within language. First, we review the effects of the altered language experience of congenitally deaf subjects on cerebral systems important for processing written English and American Sign Language (ASL). Second, we present behavioral and electrophysiological studies of L2 semantic and syntactic processing in Chinese-English bilinguals who acquired their second language over a wide range of ages. Third, we review semantic, syntactic, and prosodic processing in native Spanish and native Japanese late-learners of English. These approaches have provided converging evidence indicating that delays in language acquisition have minimal effects on some aspects of semantic processing. In contrast, delays of even a few years result in deficits in some types of syntactic processing and differences in the organization of cortical systems used to process syntactic information. The different subsystems of language, including semantic, syntactic, phonological, and prosodic processing, rely on different cortical areas and may have different developmental time courses that in part determine the different sensitive period effects observed.
Humans, in comparison to other animals, go through a protracted period of post-natal development that lasts at least 15 years (Chugani & Phelps, 1986; Huttenlocher, 1990).
Memory is a large topic, built on the fundamental idea that the experiences one has can change the nervous system, so that behavior and mental activity can later be different as a result of what came before. Yet, memory is more than a record of personal experience. Humans can learn and then teach what they have learned to others, thereby making it possible to transmit information from one generation to another.
In the twentieth century the study of memory became part of the domains of both biological and psychological science. Work has proceeded at several levels of analysis – from questions about the cellular and molecular events that underlie synaptic change to questions about complex behavior. Between these poles are other important questions, such as what brain systems are important for memory and how they operate to support memory. As we enter the new millennium, biology and psychology have converged on a number of fundamental questions about memory. Is memory one thing or many? If there are different kinds of memory, what are their operating characteristics? Where in the brain do the important events occur? Where is memory stored? What happens at the level of individual cells and synapses?
The modern era of memory research can be said to have begun in 1957 when the effects on memory of medial temporal lobe resection were described in a patient who became known as HM. HM exhibited profound forgetfulness against a background of largely intact intellectual and perceptual functions.
By
Seth J. Ramus, Department of Psychology and Program in Neuroscience Bowdoin College Brunswick, ME 04011,
Howard B. Eichenbaum, Director Cognitive Neurobiology Laboratory; Director Center for Memory and Brain; University Professor and Chairman Department of Psychology Boston University Center for Memory and Brain 2 Cummington Street Boston, MA 02215
Our understanding of the brain system that mediates memory began in the 1950s with the landmark case study of patient HM (Scoville & Milner, 1957). To relieve epilepsy that was intractable to pharmacological intervention, surgeons removed a large part of this patient's temporal lobes, including the amygdala, part of the hippocampus, and the cortex immediately surrounding the hippocampus and amygdala. Following surgery, HM exhibited a severe amnesia that left nonmemory aspects of intelligence and cognition intact. This observation demonstrated that memory could be separated from other cognitive functions and that structures of the medial temporal lobe are critical to memory.
While the early neuropsychological reports clearly pointed to the importance of the temporal lobes in memory, there was debate over precisely which temporal structures were important. Because the available clinical cases did not provide highly specific anatomical resolution, efforts were made to develop animal models in which experimental brain lesions could be performed with the necessary anatomical specificity. However, the early efforts to model amnesia in monkeys and rats did not yield a consistent pattern of severe and selective amnesia, precluding useful insights into the anatomical identification of the memory system. With hindsight, it is now clear that the difficulty in characterizing the brain system responsible for memory arose for two reasons (Eichenbaum et al., 2000). First, while the memory deficit following medial temporal damage was initially thought to be global in nature, it is now understood that damage to the medial temporal region causes amnesia that is limited to a specific domain of memory, and that other brain systems mediate other types of memory.
By
James R. Pomerantz, Professor/Director of Neuroscience Psychology Department (MS-25) Rice University 6100 Main Street PO Box 1892 Houston, TX 77005-1892 Office: 429 A Sewall Hall
Understanding of sensory processes has been an ongoing concern for centuries. Consider the study of vision, for example. The early Greeks knew that seeing began in the eye and that the eye was filled with fluid – the humors. But they had no good sense of the eye's optics or of the retina's role; indeed, they thought the retina's job was to provide nutrients for the vitreous. Aristotle believed the humors were photoreceptive. The Pythagoreans believed that rays emanated outward from the eye to the external environment, leading us to wonder today as to what explanation they had in mind for why the world gets dark at night.
In 1604, one hundred years before the publication of Newton's Opticks, the German astronomer Johannes Kepler launched a new era in vision when he noted that the eye worked as an optical instrument focusing an image sharply on the retina, observed that the retinal image is in fact oriented upside down and backwards, determined that the lens refracted light, and pinpointed the cause of myopia (before Kepler, people used spectacles to improve their vision but had no idea why curved glass sharpened their sight). There was plenty even Kepler failed to grasp, however, including the actual workings of the retina. The discovery of photoreceptors – rods and cones – was over a century away, and when Treviranus officially discovered them in 1834, he misread the orientation of the retina, believing it was installed backwards in the eye (surely the light-sensitive photoreceptors could not be pointing away from the source of light and toward the dark interior of the eye socket!). This echoes the belief, apparently held by some in eras long past, that there exists a second lens in the eye that rectifies the visual image – for as everyone knows, we do not see our world upside down and backwards.
By
Kazu Nakazawa, National Institutes of Health Genetics of Cognition and Behavior Unit, NIMH Porter Neuroscience Research Center Building 34, Room IC-915 35 Convent Drive, MSC 3710 Bethesda, MD 20892-3710,
Matthew A. Wilson, Center for Learning and Memory RIKEN-MIT Neuroscience Research Center Department of Brain & Cognitive Science and Biology Massachusetts Institute of Technology (46-5233) 77 Massachusetts Avenue Cambridge, MA 02139-4307,
Susumu Tonegawa, Director Picower Center for Learning and Memory Massachusetts Institute of Technology 77 Massachusetts Avenue Building E17, Room 353 Cambridge, MA 02139-4307
A full understanding of the mammalian brain mechanisms underlying a higher cognitive phenomenon like learning and memory requires identification of relevant events or processes occurring at multiple levels of complexity, from molecular, synaptic, and cellular levels to neuronal ensemble and brain systems levels. This is an enormous challenge for brain researchers because cognitive phenomena can be monitored only at the level of a live animal's behavior, while many of the analytical methods for the underlying mechanisms are carried out using in vitro preparations, and effective in vivo methods are limited. How can we be sure that the events or processes identified by in vitro methods or by even some in vivo studies are causally related to the animals' behavioral phenotype? For simpler invertebrate systems, molecular genetics has been effective for this purpose. Organisms harboring a mutation in a specific gene can be subjected to a variety of in vitro and in vivo analyses including behavioral tests, and deficits or impairments detected at different levels of complexity can potentially be bound together using the mutation as a connecting thread.
Background
Experimental strategy
For the analysis of more complex mammalian systems, however, additional tricks are necessary. One significant trick would be to restrict the mutation spatially and temporally. For instance, if one can restrict deletion (i.e., null mutation) of a specific gene to a particular type of neuron present in a particular area of the brain and only to a late phase of the animal's life, one can expect that the resulting deficits or impairments would be much more specific.
By
Michael I. Posner, Professor Emeritus Psychology Department University of Oregon Eugene, Oregon 97403-1227,
Jin Fan, Department of Psychiatry Icahn Medical Institute 1425 Madison Avenue, Room 20-82 Mount Sinai School of Medicine One Gustave L. Levy Place, Box 1228 New York, NY 10029
Attention is relatively easy to define subjectively as in the classical definition of William James (1890) who said: “Everyone knows what attention is. It is the taking possession of the mind in clear and vivid form of one out of what seem several simultaneous objects or trains of thought.”
However, this subjective definition does not provide hints that might lead to an understanding of attentional development or pathologies. The theme of our chapter is that it is now possible to view attention much more concretely as an organ system. We follow Webster's dictionary definition of an organ system: “An organ system may be defined as differentiated structures in animals and plants made up of various cells and tissues and adapted for the performance of some specific function and grouped with other structures into a system.”
We believe that viewing attention as an organ system aids in answering many perplexing issues raised in cognitive psychology, psychiatry, and neurology. Neuroimaging studies have systematically shown that a wide variety of cognitive tasks can be seen as activating a distributed set of neural areas, each of which can be identified with specific mental operations (Posner & Raichle, 1994, 1998). Perhaps the areas of activation have been more consistent for the study of attention than for any other cognitive system. We can view attention as involving specialized networks to carry out functions such as achieving and maintaining the alert state, orienting to sensory events, and controlling thoughts and feelings.
The field of neuroscience is progressing so rapidly that even expressions such as “by leaps and bounds” fail to capture the pace of its growth. Questions that at one time were thought to be unanswerable – perhaps even unaskable – have now been asked and in some cases answered, and new questions once unthinkable are now asked matter-of-factly. Much of this acceleration is due to the maturing of the field – advances in techniques as well as in theory – fueled by an infusion of research support during the 1990s “Decade of the Brain” effort.
It is impossible to capture fully the sweep of discoveries and advances that emerged from that decade within the covers of a single volume. It is possible, however, to provide a sample of the best of that work, both as recognition of what has been accomplished during that period of time and since, and as a harbinger of what is surely to come as the pace of neuroscience shows no hint of slowing down.
Our goal in the present volume is to provide that sample through carefully chosen topics and even more carefully chosen researchers in those fields. Singling out the four most important problems in neuroscience is probably an unwise goal and is a surefire way to start an argument. That said, however, few would argue that the four featured here are anything less than powerful candidates for that inner circle: higher order perception; language; memory systems; and sensory processes.
The three chapters that follow this introduction all deal with aspects of visual perception related to the processing of scenes and the recognition of objects. There was a time when it was clear that higher order visual perception meant processing that took place in brain areas beyond the primary visual cortex. The primary visual cortex was thought to perform simple computations, each covering a small separate part of the visual world (receptive field) and hard wired in the sense that little could be done by learning or attention to modify them. This view stressed hierarchical processing among visual areas, particularly those from primary visual cortex V1 to the anterior temporal areas. Evidence for the hierarchical view is thoroughly summarized in the chapter by Kastner, De Weerd, and Ungerleider. However, all three chapters deal in rather different ways with qualifications to the hierarchical view of visual areas driven passively from the bottom up, based upon the influence of context, attention, and task demands.
In his chapter, Charles Gilbert describes the research work of his group, which has changed the view of how the primary visual cortex works. The older view gave rise to the hope that studies of primary visual cortex might provide the basic immutable building blocks from which it might be possible to launch an analysis of the remaining functions grouped under the title of higher perception.
The primary visual cortex is the first cortical stage at which the visual world is analyzed. It has classically been thought to be a passive filter, only deriving information about local contrast and orientation, and passing that on to later cortical stages for the more complex task of object recognition. But a very different view is now emerging, showing that V1 plays a central role in much more complex processes involving intermediate level vision, integrating contours and parsing the visual world into surfaces belonging to objects and their backgrounds. The higher order properties of cortical neurons are reflected in the dependence of their responses on the context within which features of the visual stimulus are embedded. In addition, the properties of neurons in V1 reflect an ongoing process of experience-dependent modification, known as “perceptual learning.” This process begins early in life, incorporating the structural properties of the visual world into the functional properties of neurons. It continues throughout adulthood, encoding information about different shapes with which individuals become familiarized. Superimposed upon the influence of context and experience is a powerful top-down modification of neuronal function, such that the properties exhibited by neurons change according to attentional state, expectation, and perceptual task.
The receptive field and cortical circuitry: contextual influences
The central functional element of sensory systems is the receptive field, the portion of the sensory surface (retina) or environment (visual field) within which a stimulus will cause a cell to fire.
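A standard computational idealization of such a receptive field, in the classical feedforward spirit described above, is an oriented Gabor filter whose linear response is simply the stimulus weighted by the filter. The sketch below is illustrative only; the patch size and tuning parameters are assumptions, not measurements from the work described in this chapter.

```python
# Sketch of an orientation-tuned receptive field modeled as a Gabor filter.
# All parameter values (size, sigma, wavelength) are illustrative assumptions.
import numpy as np

def gabor(size=21, theta=0.0, sigma=4.0, wavelength=8.0):
    """Oriented Gabor patch, a common idealization of a V1 simple-cell RF."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    rotated = x * np.cos(theta) + y * np.sin(theta)       # rotate carrier axis
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))  # Gaussian envelope
    carrier = np.cos(2.0 * np.pi * rotated / wavelength)  # sinusoidal carrier
    return envelope * carrier

def linear_response(stimulus, receptive_field):
    """Drive of a linear cell: stimulus weighted by its receptive field."""
    return float(np.sum(stimulus * receptive_field))

rf = gabor(theta=0.0)
matched = linear_response(gabor(theta=0.0), rf)           # preferred orientation
orthogonal = linear_response(gabor(theta=np.pi / 2), rf)  # rotated 90 degrees
# The model cell responds strongly at its preferred orientation and only
# weakly to the orthogonal one.
```

The contextual and top-down effects discussed in this part are precisely the phenomena this fixed linear picture fails to capture, which is what makes it a useful baseline.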
By
Larry R. Squire, Professor of Psychiatry Neurosciences, and Psychology University of California 3350 La Jolla Village Drive San Diego, CA 92161,
Craig E. L. Stark, Assistant Professor Department of Psychological and Brain Sciences The Johns Hopkins University 204 Ames Hall 3400 N. Charles Street Baltimore, MD 21218
For all its diversity, one can view neuroscience as being concerned with two central issues – the hard wiring of the brain and the brain's capacity for plasticity. The former refers to how connections develop between cells, how cells function and communicate, and how an organism's inborn functions are organized (e.g., its sleep–wake cycles, hunger and thirst, and the ability to perceive the world). The nervous system has inherited such adaptations through evolution, because these are functions too important to be left to the vagaries of individual experience. In contrast, the capacity for plasticity refers to the fact that nervous systems can adapt or change as the result of experiences that occur during an individual lifetime. Experience can modify the nervous system, and as a result, organisms can learn and remember. Learning is the process by which new information is acquired about the world, and memory is the process by which this information can persist across time.
The scientific study of memory has reached a particularly fruitful stage. Memory is being studied at many levels of analysis – from questions about the cellular and molecular events that underlie synaptic change to questions about the organization of behavioral memory. This chapter considers memory from the perspective of brain systems and behavior and focuses on three topics (for recent reviews, see Squire & Bayley, 2007; Squire et al., 2004).
By
Michele M. Solis, 5733 26th Ave NE Seattle, WA 98105,
Neal A. Hessler, Keck Center for Integrative Neuroscience Department of Physiology Box 0444 University of California San Francisco, CA 94143-0444,
Charlotte A. Boettiger, Department of Psychology University of California 3210 Tolman Hall #1650 Berkeley, CA 94720-1650,
Allison J. Doupe, University of California UCSF, 513 Parnassus (HSE-818) Box 0444 San Francisco, CA 94143-0444
Birdsong, like human speech, is a learned vocal behavior that requires auditory feedback. Both as juveniles, while they learn to sing, and as adults, songbirds use auditory feedback to compare their own vocalizations with an internal model of a memorized target song. Here we describe experiments that explore the properties of the songbird anterior forebrain pathway (AFP), a basal ganglia–forebrain circuit known to be critical for normal song learning and for adult modification of vocal output, but not for normal adult singing. First, neural recordings in anesthetized, juvenile birds show that single AFP neurons become specialized to process the song stimuli that are compared during sensorimotor learning. AFP neurons develop tuning to the bird's own song, and in many cases to the tutor song as well, even when these stimuli are manipulated to be very different from each other. Second, neural recordings from adult, singing birds reveal robust singing-related activity in the AFP, which is present even in deaf birds. This activity is likely to originate from premotor areas, and could represent an efference copy of motor commands for song, predicting the sensory consequences of motor commands. Finally, in vitro studies of the AFP show that recurrent synapses between neurons in the AFP outflow nucleus can undergo activity-dependent and timing-sensitive strengthening that appears to be restricted to young birds.
Much of human social life depends on the notion that agents have control over their actions and are responsible for their choices. In daily life it is commonly assumed that it is fair to punish and reward behavior so long as the person is in control and makes choices knowingly and intentionally. Without the assumptions of agent control and responsibility, human social commerce is hardly conceivable. As members of a social species, we recognize co-operation, loyalty, honesty, and helping as prominent features of the social environment. We react with hostility when group members disappoint certain socially significant expectations. Inflicting disutilities (e.g., shunning, pinching) on the socially erring and rewarding civic virtue help restore the standards.
In other social species too, social unreliability, such as a failure to reciprocate grooming or food-sharing, provokes a reaction likely to cost the erring agent, sooner or later. In social mammals at least, mechanisms for learning and keeping the social order seem to be part of what evolution has bequeathed to our brain circuitry. Given that the stability of the social-expectation baseline is sufficiently important for survival, individuals are prepared to incur some cost in enforcing those expectations. Just as anubis baboons learn that tasty scorpions are to be found under rocks but cannot just be picked up, so they learn that failure to reciprocate grooming when it is duly expected may incur a slap.
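The reciprocity logic sketched above, returning cooperation for cooperation and answering a failure to reciprocate with a one-off cost, resembles the well-known tit-for-tat strategy from the iterated prisoner's dilemma. The following is a toy sketch of that strategy for illustration, not a model proposed by the authors.

```python
# Toy tit-for-tat: cooperate first, then mirror the partner's previous move.
# "C" = cooperate (e.g., groom), "D" = defect (e.g., fail to reciprocate).

def tit_for_tat(opponent_moves):
    """Return this agent's move for each round of an iterated exchange."""
    moves = []
    last = "C"                 # open with cooperation
    for opp in opponent_moves:
        moves.append(last)     # our move this round mirrors their last move
        last = opp
    return moves

# A partner who fails to reciprocate once incurs exactly one retaliatory
# cost, after which cooperation resumes.
partner = ["C", "C", "D", "C", "C"]
print(tit_for_tat(partner))  # ['C', 'C', 'C', 'D', 'C']
```

The sketch captures the point in the text: enforcement is cheap, targeted at the erring agent, and restores the cooperative baseline rather than destroying it.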
By
Karalyn Patterson, Senior Scientist MRC Cognition and Brain Science Unit University of Cambridge 15 Chaucer Road Cambridge CB2 2EF UK,
Naida L. Graham, MRC Cognition & Brain Science Unit 15 Chaucer Road Cambridge CB2 2EF UK,
Matthew A. Lambon Ralph, The University of Manchester Oxford Road Manchester M13 9PL UK,
John R. Hodges, MRC Cognition & Brain Sciences Unit University of Cambridge 15 Chaucer Road Cambridge CB2 2EF UK
The human faculty of language is a breathtaking skill. It allows us to communicate observations, thoughts, wishes, intentions, emotions, etc., to another person in the same room (by speaking), to a person in the next room (by shouting), to someone thousands of kilometers away (by speaking on the telephone or sending a fax), and even to future generations (by writing stories, poems, books, or scientific articles). Language is characterized by almost infinite variation and creativity. Every person alive today (with the exception of pre-verbal infants and people with severely impaired language skills) probably utters a number of sentences every day that he or she has never produced before. What other form of behavior could compete with this for degree of novelty and originality?
Language is typically considered to involve a set of interacting, but somewhat separate, domains of ability or knowledge. These include the sound structure of the language (phonology); word meanings (semantics); the ways in which individual morphemes combine to create complex words (morphology); the ways in which morphologically simple or complex words combine to create phrases and sentences (syntax); and finally, at least in the relatively brief time since a substantial proportion of the world's population has become literate, knowledge of how words are written in the speaker's language (orthography).
How and where does the brain represent and process this complex set of abilities? Because language is unique to humans, we can only learn about this topic by studying humans.
The last decade of the twentieth century was unprecedented in its progress toward discoveries linking the anatomical structures and physiological systems of the brain to the human mind. This enterprise is possible now, both because of a large body of behavioral data characterizing the operations and subsystems within different domains of cognitive processing and because of great advances in the methods and techniques available to noninvasively image the structure and the physiology of the functioning human brain. The focus of this part is to consider different perspectives and approaches to the study of the brain systems important in language processing and in the development and differentiation of the language systems of the brain.
The study of language is particularly well poised to benefit from knowledge about underlying neural mechanisms. It has been recognized since the 1950s that the study of language is a model case for understanding the species-specific capacities of human learners and the brain mechanisms in human adults and infants that permit them. Language in humans is an extraordinary ability, showing many properties without parallel in other species; understanding the mechanisms underlying human language will therefore shed special light on human cognition. At the same time the lack of animal models that have made such powerful contributions to the characterization of nonlinguistic cognitive systems underscores the importance of the new noninvasive techniques for imaging the language systems of the human brain.