Published online by Cambridge University Press: 07 January 2010
Some of our clearest insights into how the mind constructs language come from the investigation of challenges to the sensory, motor, and/or neural mechanisms of the human brain. The study of stroke and diseases of the central nervous system has led to enormous, but still incomplete, knowledge about the neural architecture of human language. Investigation of the sign languages that spontaneously arise among individuals who are deaf has revolutionized psycholinguistic theory by demonstrating that human language capacity transcends sensory and motor modality. The studies we describe here follow in this long research tradition.
We have been investigating the gesture–speech relationship in individuals with chronic stuttering in order to gain insight into how the two systems are related in spontaneous expression. Stuttering, the involuntary and excessive repetition of sounds and syllables, and the prolongation of sounds, while speaking, is highly disruptive to the production of speech. This gives us an opportunity to observe what happens to the temporal patterning of gesture against the backdrop of a fractionated speech stream. Our studies have garnered striking evidence that gesture production is, and moreover must be, integrated with speech production at a deep, neuromotor planning level prior to message execution. The expressive harmony of gesture patterning relative to speech patterning is so tightly maintained throughout the frequent and often lengthy speech disruptions caused by stuttering that it suggests a principle of co-expression governing gesture–speech execution (Mayberry, Jaques & Shenker 1999).