The temporal structure of speech provides crucial information to listeners for comprehension: In particular, the slow modulations in the amplitude envelope constitute important landmarks for discretizing the continuous signal into linguistic units. Contemporary models of speech perception attribute a major functional role in this process to brain rhythmic activity: By aligning their phase to the quasi-periodic patterns in speech, neural oscillations would facilitate speech decoding. Here we review evidence from EEG/MEG studies showing neural theta-range (~4–8 Hz) tracking of syllabic rhythm, with a special interest in speech rate variations. We also discuss to what extent neural oscillatory coupling contributes to, and is in turn modulated by, speech intelligibility, namely whether it is only acoustically or also linguistically guided. We finally review findings showing that, in addition to auditory cortex, motor regions play an active role in the oscillatory dynamics underlying speech processing.
Recent studies have shown that neural activity tracks the syntactic structure of phrases and sentences in connected speech. This work has sparked intense debate, with some researchers aiming to account for the effect in terms of the overt or imposed prosodic properties of the speech signal. In this chapter, we present four types of arguments against attempts to explain putatively syntactic tracking effects in prosodic terms. The most important limitation of such prosodic accounts is that they are architecturally incomplete, as prosodic information does not arise in speech autonomously. Prosodic and syntactic structure are interrelated, so prosodic cues are informative about the intended syntactic analysis, and syntactic information can be used to aid speech perception. Rather than trying to attribute neural tracking effects exclusively to one linguistic component, we consider it more fruitful to think about ways in which the interaction between the components drives the neural signal.