
Multisensory Interactions in the Real World

Published online by Cambridge University Press: 08 August 2019

Salvador Soto-Faraco, Universitat Pompeu Fabra, Barcelona
Daria Kvasova, Universitat Pompeu Fabra, Barcelona
Emmanuel Biau, University of Birmingham
Nara Ikumi, Universitat Pompeu Fabra, Barcelona
Manuela Ruzzoli, Universitat Pompeu Fabra, Barcelona
Luis Morís-Fernández, Universitat Pompeu Fabra, Barcelona
Mireia Torralba, Universitat Pompeu Fabra, Barcelona

Summary

Interactions between the senses are essential for cognitive functions such as perception, attention, and action planning. Past research has advanced our understanding of multisensory processes in the laboratory, yet efforts to extrapolate these findings to the real world remain scarce. Such extrapolation matters for both practical and theoretical reasons: multisensory phenomena may be expressed differently in real-world settings than in simpler laboratory situations. Some effects might become stronger, others may disappear, and new outcomes could be discovered. This Element discusses research that uncovers multisensory interactions in complex environments, with an emphasis on the interplay between multisensory mechanisms and other processes.
Type: Element
Information
Online ISBN: 9781108578738
Publisher: Cambridge University Press
Print publication: 22 August 2019


Kvasova, D., Garcia-Vernet, L., & Soto-Faraco, S. (2019). Characteristic sounds facilitate object search in real-life scenes. bioRxiv, 563080. doi:https://doi.org/10.1101/563080Google Scholar
Ladavas, E., & Moscovitch, M. (1984). Must egocentric and environmental frames of reference be aligned to produce spatial S-R compatibility effects? Journal of Experimental Psychology: Human Perception and Performance, 10(2). https://doi.org/10.1037/0096-1523.10.2.205
Lakatos, P., Chen, C.-M., O’Connell, M. N., Mills, A., & Schroeder, C. E. (2007). Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron, 53(2), 279–92. https://doi.org/10.1016/J.NEURON.2006.12.011
Lakatos, P., Karmos, G., Mehta, A. D., Ulbert, I., & Schroeder, C. E. (2008). Entrainment of neuronal oscillations as a mechanism of attentional selection. Science, 320(5872), 110–13.
Lamberts, K., Tavernier, G., & d’Ydewalle, G. (1992). Effects of multiple reference points in spatial stimulus-response compatibility. Acta Psychologica, 79(2), 115–30.
Laurienti, P., Kraft, R., Maldjian, J., Burdette, J., & Wallace, M. (2004). Semantic congruence is a critical factor in multisensory behavioral performance. Experimental Brain Research, 158(4), 405–14. https://doi.org/10.1007/s00221-004-1913-2
Leone, L. M., & McCourt, M. E. (2013). The roles of physical and physiological simultaneity in audiovisual multisensory facilitation. I-Perception, 4(4), 213–28. https://doi.org/10.1068/i0532
Lewald, J., & Guski, R. (2003). Cross-modal perceptual integration of spatially and temporally disparate auditory and visual stimuli. Cognitive Brain Research, 16(3), 468–78. https://doi.org/10.1016/S0926-6410(03)00074-0
Lippert, M., Logothetis, N. K., & Kayser, C. (2007). Improvement of visual contrast detection by a simultaneous sound. Brain Research, 1173, 102–9. https://doi.org/10.1016/J.BRAINRES.2007.07.050
Lunghi, C., & Alais, D. (2013). Touch interacts with vision during binocular rivalry with a tight orientation tuning. PLoS ONE, 8(3), e58754. https://doi.org/10.1371/journal.pone.0058754
Lunn, J., Sjoblom, A., Ward, J., Soto-Faraco, S., & Forster, S. (2019). Multisensory enhancement of attention depends on whether you are already paying attention. Cognition, 187, 38–49. https://doi.org/10.1016/J.COGNITION.2019.02.008
Macaluso, E., & Doricchi, F. (2013). Attention and predictions: control of spatial attention beyond the endogenous-exogenous dichotomy. Frontiers in Human Neuroscience, 7, 685. https://doi.org/10.3389/fnhum.2013.00685
Macaluso, E., & Driver, J. (2005). Multisensory spatial interactions: a window onto functional integration in the human brain. Trends in Neurosciences, 28(5), 264–71. https://doi.org/10.1016/J.TINS.2005.03.008
Macaluso, E., Frith, C. D., & Driver, J. (2000). Modulation of human visual cortex by crossmodal spatial attention. Science (New York, N.Y.), 289(5482), 1206–8. https://doi.org/10.1126/science.289.5482.1206
MacLeod, C. M. (1991). Half a century of research on the Stroop effect: An integrative review. Psychological Bulletin, 109(2), 163–203. https://doi.org/10.1037/0033-2909.109.2.163
Maddox, R. K., Atilgan, H., Bizley, J. K., & Lee, A. K. (2015). Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners. eLife, 4, e04995.
Maguire, E. A. (2012). Studying the freely-behaving brain with fMRI. NeuroImage, 62(2), 1170–6. https://doi.org/10.1016/J.NEUROIMAGE.2012.01.009
Maidenbaum, S., & Abboud, S. (2014). Sensory substitution: closing the gap between basic research and widespread practical visual rehabilitation. Neuroscience & Biobehavioral Reviews, 41, 3–15. https://doi.org/10.1016/J.NEUBIOREV.2013.11.007
Malfait, N., Fonlupt, P., Centelles, L., Nazarian, B., Brown, L. E., & Caclin, A. (2014). Different neural networks are involved in audiovisual speech perception depending on the context. Journal of Cognitive Neuroscience, 26(7), 1572–86.
Martolini, C., Cuppone, A. V., Cappagli, G., Finocchietti, S., Maviglia, A., & Gori, M. (2018). ABBI-K: a novel tool for evaluating spatial and motor abilities in visually impaired children. In 2018 IEEE International Symposium on Medical Measurements and Applications (MeMeA) (pp. 1–6). IEEE. https://doi.org/10.1109/MeMeA.2018.8438671
Mast, F., Frings, C., & Spence, C. (2017). Crossmodal attentional control sets between vision and audition. Acta Psychologica, 178, 41–7. https://doi.org/10.1016/J.ACTPSY.2017.05.011
Mastroberardino, S., Santangelo, V., & Macaluso, E. (2015). Crossmodal semantic congruence can affect visuo-spatial processing and activity of the fronto-parietal attention networks. Frontiers in Integrative Neuroscience, 9, 45. https://doi.org/10.3389/fnint.2015.00045
Matchin, W., Groulx, K., & Hickok, G. (2014). Audiovisual speech integration does not rely on the motor system: evidence from articulatory suppression, the McGurk effect, and fMRI. Journal of Cognitive Neuroscience, 26(3), 606–20.
Matusz, P. J., Broadbent, H., Ferrari, J., Forrest, B., Merkley, R., & Scerif, G. (2015). Multi-modal distraction: insights from children’s limited attention. Cognition, 136, 156–65. https://doi.org/10.1016/J.COGNITION.2014.11.031
Matusz, P. J., Dikker, S., Huth, A. G., & Perrodin, C. (2018). Are we ready for real-world neuroscience? Journal of Cognitive Neuroscience, 1–12. https://doi.org/10.1162/jocn_e_01276
Matusz, P. J., & Eimer, M. (2011). Multisensory enhancement of attentional capture in visual search. Psychonomic Bulletin & Review, 18(5), 904–9. https://doi.org/10.3758/s13423-011-0131-8
Matusz, P. J., Turoman, N., Tivadar, R. I., Retsa, C., & Murray, M. M. (2019). Brain and cognitive mechanisms of top–down attentional control in a multisensory world: benefits of electrical neuroimaging. Journal of Cognitive Neuroscience, 31(3), 412–30. https://doi.org/10.1162/jocn_a_01360
McDonald, J. J., Teder-Sälejärvi, W. A., & Ward, L. M. (2001). Multisensory integration and crossmodal attention effects in the human brain. Science, 292(5523), 1791.
McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264(5588), 746–8.
McNeill, D. (1992). Hand and mind: what gestures reveal about thought. Chicago: University of Chicago Press.
Medina, J., McCloskey, M., Coslett, H., & Rapp, B. (2014). Somatotopic representation of location: evidence from the Simon effect. Journal of Experimental Psychology: Human Perception and Performance, 40(6), 2131–42.
Miller, J. (1982). Divided attention: evidence for coactivation with redundant signals. Cognitive Psychology, 14(2), 247–79. https://doi.org/10.1016/0010-0285(82)90010-X
Miller, J. (1986). Timecourse of coactivation in bimodal divided attention. Perception & Psychophysics, 40(5), 331–43. https://doi.org/10.3758/BF03203025
Miller, L. M., & D’Esposito, M. (2005). Perceptual fusion and stimulus coincidence in the cross-modal integration of speech. Journal of Neuroscience, 25(25), 5884–93.
Milton, A., & Pleydell-Pearce, C. W. (2016). The phase of pre-stimulus alpha oscillations influences the visual perception of stimulus timing. NeuroImage, 133, 53–61. https://doi.org/10.1016/j.neuroimage.2016.02.065
Molholm, S., Ritter, W., Javitt, D. C., & Foxe, J. J. (2004). Multisensory visual–auditory object recognition in humans: a high-density electrical mapping study. Cerebral Cortex, 14(4), 452–65.
Molholm, S., Ritter, W., Murray, M. M., Javitt, D. C., Schroeder, C. E., & Foxe, J. J. (2002). Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Cognitive Brain Research, 14(1), 115–28. https://doi.org/10.1016/S0926-6410(02)00066-6
Morein-Zamir, S., Soto-Faraco, S., & Kingstone, A. (2003). Auditory capture of vision: examining temporal ventriloquism. Cognitive Brain Research, 17(1), 154–63.
Morís Fernández, L., Macaluso, E., & Soto-Faraco, S. (2017). Audiovisual integration as conflict resolution: the conflict of the McGurk illusion. Human Brain Mapping, 38(11). https://doi.org/10.1002/hbm.23758
Morís Fernández, L., Torralba, M., & Soto-Faraco, S. (2018). Theta oscillations reflect conflict processing in the perception of the McGurk illusion. European Journal of Neuroscience. https://doi.org/10.1111/ejn.13804
Morís Fernández, L., Visser, M., Ventura-Campos, N., Ávila, C., & Soto-Faraco, S. (2015). Top-down attention regulates the neural expression of audiovisual integration. NeuroImage, 119. https://doi.org/10.1016/j.neuroimage.2015.06.052
Murray, M. M., Molholm, S., Michel, C. M., Heslenfeld, D. J., Ritter, W., Javitt, D. C., … Foxe, J. J. (2005). Grabbing your ear: rapid auditory–somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment. Cerebral Cortex, 15(7), 963–74. https://doi.org/10.1093/cercor/bhh197
Nahorna, O., Berthommier, F., & Schwartz, J.-L. (2012). Binding and unbinding the auditory and visual streams in the McGurk effect. The Journal of the Acoustical Society of America, 132(2), 1061–77. https://doi.org/10.1121/1.4728187
Nardo, D., Console, P., Reverberi, C., & Macaluso, E. (2016). Competition between visual events modulates the influence of salience during free-viewing of naturalistic videos. Frontiers in Human Neuroscience, 10, 320. https://doi.org/10.3389/fnhum.2016.00320
Nardo, D., Santangelo, V., & Macaluso, E. (2011). Stimulus-driven orienting of visuo-spatial attention in complex dynamic environments. Neuron, 69(5), 1015–28. https://doi.org/10.1016/J.NEURON.2011.02.020
Nardo, D., Santangelo, V., & Macaluso, E. (2014). Spatial orienting in complex audiovisual environments. Human Brain Mapping, 35(4), 1597–614. https://doi.org/10.1002/hbm.22276
Navarra, J., Alsius, A., Soto-Faraco, S., & Spence, C. (2010). Assessing the role of attention in the audiovisual integration of speech. Information Fusion, 11(1), 4–11. https://doi.org/10.1016/j.inffus.2009.04.001
Navarra, J., Vatakis, A., Zampini, M., Soto-Faraco, S., Humphreys, W., & Spence, C. (2005). Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration. Cognitive Brain Research, 25(2), 499–507. https://doi.org/10.1016/J.COGBRAINRES.2005.07.009
Neisser, U. (1976). Cognition and reality: principles and implications of cognitive psychology. San Francisco: WH Freeman and Company.
Neisser, U. (1982). Memory: what are the important questions? In U. Neisser & I. E. Hyman (eds.), Memory observed (pp. 3–18). New York: Worth.
Nelken, I., Bizley, J., Shamma, S. A., & Wang, X. (2014). Auditory cortical processing in real-world listening: the auditory system going real. The Journal of Neuroscience: The Official Journal of the Society for Neuroscience, 34(46), 15135–8. https://doi.org/10.1523/JNEUROSCI.2989-14.2014
Noesselt, T., Tyll, S., Boehler, C. N., Budinger, E., Heinze, H.-J., & Driver, J. (2010). Sound-induced enhancement of low-intensity vision: multisensory influences on human sensory-specific cortices and thalamic bodies relate to perceptual enhancement of visual detection sensitivity. The Journal of Neuroscience, 30(41), 13609–23.
Noppeney, U., Josephs, O., Hocking, J., Price, C. J., & Friston, K. J. (2008). The effect of prior visual information on recognition of speech and sounds. Cerebral Cortex, 18(3), 598–609. https://doi.org/10.1093/cercor/bhm091
Noppeney, U., & Lee, H. L. (2018). Causal inference and temporal predictions in audiovisual perception of speech and music. Annals of the New York Academy of Sciences, 1423(1), 102–16. https://doi.org/10.1111/nyas.13615
O’Regan, J. K. (1992). Solving the ‘real’ mysteries of visual perception: The world as an outside memory. Canadian Journal of Psychology/Revue Canadienne de Psychologie, 46(3), 461–88. https://doi.org/10.1037/h0084327
Odgaard, E. C., Arieh, Y., & Marks, L. E. (2003). Cross-modal enhancement of perceived brightness: sensory interaction versus response bias. Perception & Psychophysics, 65(1), 123–32. https://doi.org/10.3758/BF03194789
Ojanen, V., Möttönen, R., Pekkola, J., Jääskeläinen, I. P., Joensuu, R., Autti, T., & Sams, M. (2005). Processing of audiovisual speech in Broca’s area. Neuroimage, 25(2), 333–8.
Otto, T. U., & Mamassian, P. (2012). Noise and correlations in parallel perceptual decision making. Current Biology, 22(15), 1391–6. https://doi.org/10.1016/J.CUB.2012.05.031
Pannunzi, M., Pérez-Bellido, A., Pereda-Baños, A., López-Moliner, J., Deco, G., & Soto-Faraco, S. (2015). Deconstructing multisensory enhancement in detection. Journal of Neurophysiology, 113(6). https://doi.org/10.1152/jn.00341.2014
Pápai, M. S., & Soto-Faraco, S. (2017). Sounds can boost the awareness of visual events through attention without cross-modal integration. Scientific Reports, 7. https://doi.org/10.1038/srep41684
Papeo, L., Goupil, N., & Soto-Faraco, S. (2019, June 18). Visual search for people among people. PsyArXiv. https://doi.org/10.31234/osf.io/fupes
Parise, C. V., Knorre, K., & Ernst, M. O. (2014). Natural auditory scene statistics shapes human spatial hearing. Proceedings of the National Academy of Sciences of the United States of America, 111(16), 6104–8. https://doi.org/10.1073/pnas.1322705111
Parise, C. V., & Spence, C. (2009). ‘When birds of a feather flock together’: synesthetic correspondences modulate audiovisual integration in non-synesthetes. PLoS ONE, 4(5), e5664. https://doi.org/10.1371/journal.pone.0005664
Parise, C. V., Spence, C., & Ernst, M. O. (2012). When correlation implies causation in multisensory integration. Current Biology, 22(1), 46–9. https://doi.org/10.1016/J.CUB.2011.11.039
Peelen, M. V., & Kastner, S. (2014). Attention in the real world: toward understanding its neural basis. Trends in Cognitive Sciences, 18(5), 242–50. https://doi.org/10.1016/J.TICS.2014.02.004
Pekkola, J., Laasonen, M., Ojanen, V., Autti, T., Jääskeläinen, I. P., Kujala, T., & Sams, M. (2006). Perception of matching and conflicting audiovisual speech in dyslexic and fluent readers: an fMRI study at 3 T. NeuroImage, 29(3), 797–807. https://doi.org/10.1016/J.NEUROIMAGE.2005.09.069
Pérez-Bellido, A., Soto-Faraco, S., & López-Moliner, J. (2013). Sound-driven enhancement of vision: disentangling detection-level from decision-level contributions. Journal of Neurophysiology, 109(4). https://doi.org/10.1152/jn.00226.2012
Pesquita, A., Brennan, A., Enns, J. T., & Soto-Faraco, S. (2013). Isolating shape from semantics in haptic-visual priming. Experimental Brain Research, 227(3), 311–22. https://doi.org/10.1007/s00221-013-3489-1
Proctor, R. W., & Lu, C.-H. (1999). Processing irrelevant location information: practice and transfer effects in choice-reaction tasks. Memory & Cognition, 27(1), 63–77.
Puigcerver, L., Gonzalez-Contijoch, A., Nannen, P., Termes, M., Correa, G., Egea-Castillo, N., … Navarra, J. (2018). Testing new strategies to reduce malnutrition in child and adolescent cancer patients under chemotherapy treatment. In Joint Congress of SEPEX-SEPNECA-AIP, Madrid, 3–6 July (p. 121).
Quick, R. F. (1974). A vector-magnitude model of contrast detection. Kybernetik, 16(2), 65–7. https://doi.org/10.1007/BF00271628
Reales, J. M., & Ballesteros, S. (1999). Implicit and explicit memory for visual and haptic objects: Cross-modal priming depends on structural descriptions. Journal of Experimental Psychology: Learning, Memory, and Cognition, 25(3), 644.
Riggio, L., de Gonzaga Gawryszewski, L., & Umilta, C. (1986). What is crossed in crossed-hand effects? Acta Psychologica, 62(1), 89–100.
Risko, E. F., Laidlaw, K., Freeth, M., Foulsham, T., & Kingstone, A. (2012). Social attention with real versus reel stimuli: toward an empirical approach to concerns about ecological validity. Frontiers in Human Neuroscience, 6, 143. https://doi.org/10.3389/fnhum.2012.00143
Roa Romero, Y., Senkowski, D., & Keil, J. (2015). Early and late beta-band power reflect audiovisual perception in the McGurk illusion. Journal of Neurophysiology, 113(7), 2342–50. https://doi.org/10.1152/jn.00783.2014
Roberts, K. L., & Hall, D. A. (2008). Examining a supramodal network for conflict processing: a systematic review and novel functional magnetic resonance imaging data for related visual and auditory Stroop tasks. Journal of Cognitive Neuroscience, 20(6), 1063–78.
Rockland, K. S., & Ojima, H. (2003). Multisensory convergence in calcarine visual areas in macaque monkey. International Journal of Psychophysiology, 50(1–2), 19–26. https://doi.org/10.1016/S0167-8760(03)00121-1
Röder, B., Kusmierek, A., Spence, C., & Schicke, T. (2007). Developmental vision determines the reference frame for the multisensory control of action. Proceedings of the National Academy of Sciences, 104(11), 4753–8.
Rohe, T., & Noppeney, U. (2015). Cortical hierarchies perform Bayesian causal inference in multisensory perception. PLOS Biology, 13(2), e1002073. https://doi.org/10.1371/journal.pbio.1002073
Romei, V., Gross, J., & Thut, G. (2010). On the role of prestimulus alpha rhythms over occipito-parietal areas in visual input regulation: correlation or causation? The Journal of Neuroscience, 30(25), 8692–7.
Romei, V., Gross, J., & Thut, G. (2012). Sounds reset rhythms of visual cortex and corresponding human visual perception. Current Biology, 22(9), 807–13. https://doi.org/10.1016/J.CUB.2012.03.025
Romei, V., Murray, M. M., Cappe, C., & Thut, G. (2009). Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Current Biology, 19(21), 1799–805. https://doi.org/10.1016/J.CUB.2009.09.027
Roseboom, W., Kawabe, T., & Nishida, S. (2013). The cross-modal double flash illusion depends on featural similarity between cross-modal inducers. Scientific Reports, 3(1), 3437. https://doi.org/10.1038/srep03437
Roseboom, W., Nishida, S., Fujisaki, W., & Arnold, D. H. (2011). Audio-visual speech timing sensitivity is enhanced in cluttered conditions. PLoS ONE, 6(4), e18309. https://doi.org/10.1371/journal.pone.0018309
Ross, L. A., Saint-Amour, D., Leavitt, V. M., Javitt, D. C., & Foxe, J. J. (2006). Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments. Cerebral Cortex, 17(5), 1147–53. https://doi.org/10.1093/cercor/bhl024
Roswarski, T. E., & Proctor, R. W. (2000). Auditory stimulus-response compatibility: Is there a contribution of stimulus–hand correspondence? Psychological Research, 63(2), 148–58.
Rummukainen, O., Radun, J., Virtanen, T., & Pulkki, V. (2014). Categorization of natural dynamic audiovisual scenes. PLoS ONE, 9(5), e95848. https://doi.org/10.1371/journal.pone.0095848
Ruzzoli, M., & Soto-Faraco, S. (2014). Alpha stimulation of the human parietal cortex attunes tactile perception to external space. Current Biology, 24(3). https://doi.org/10.1016/j.cub.2013.12.029
Ruzzoli, M., & Soto-Faraco, S. (2017). Modality-switching in the Simon task: the clash of reference frames. Journal of Experimental Psychology: General, 146(10). https://doi.org/10.1037/xge0000342
Sams, M., Tiippana, K., Puharinen, H., & Möttönen, R. (2011). Sound location can influence audiovisual speech perception when spatial attention is manipulated. Seeing and Perceiving, 24(1), 67–90. https://doi.org/10.1163/187847511X557308
Sánchez-García, C., Enns, J. T., & Soto-Faraco, S. (2013). Cross-modal prediction in speech depends on prior linguistic experience. Experimental Brain Research, 225(4). https://doi.org/10.1007/s00221-012-3390-3
Sánchez-García, C., Kandel, S., Savariaux, C., & Soto-Faraco, S. (2018). The time course of audio-visual phoneme identification: a high temporal resolution study. Multisensory Research, 31(1–2), 57–78. https://doi.org/10.1163/22134808-00002560
Santangelo, V., Di Francesco, S. A., Mastroberardino, S., & Macaluso, E. (2015). Parietal cortex integrates contextual and saliency signals during the encoding of natural scenes in working memory. Human Brain Mapping, 36(12), 5003–17. https://doi.org/10.1002/hbm.22984
Santangelo, V., & Spence, C. (2007). Multisensory cues capture spatial attention regardless of perceptual load. Journal of Experimental Psychology: Human Perception and Performance, 33(6), 1311–21. https://doi.org/10.1037/0096-1523.33.6.1311
Schneider, T. R., Engel, A. K., & Debener, S. (2008). Multisensory identification of natural objects in a two-way crossmodal priming paradigm. Experimental Psychology, 55(2), 121–32. https://doi.org/10.1027/1618-3169.55.2.121
Schroeder, C. E., & Lakatos, P. (2009). Low-frequency neuronal oscillations as instruments of sensory selection. Trends in Neurosciences, 32(1), 9–18. https://doi.org/10.1016/J.TINS.2008.09.012
Schroeder, C. E., Lakatos, P., Kajikawa, Y., Partan, S., & Puce, A. (2008). Neuronal oscillations and visual amplification of speech. Trends in Cognitive Sciences, 12(3), 106–13. https://doi.org/10.1016/J.TICS.2008.01.002
Scott, J. J., & Gray, R. (2008). A comparison of tactile, visual, and auditory warnings for rear-end collision prevention in simulated driving. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50(2), 264–75. https://doi.org/10.1518/001872008X250674
Senkowski, D., Saint-Amour, D., Gruber, T., & Foxe, J. J. (2008). Look who’s talking: the deployment of visuo-spatial attention during multisensory speech processing under noisy environmental conditions. NeuroImage, 43(2), 379–87. https://doi.org/10.1016/J.NEUROIMAGE.2008.06.046
Senkowski, D., Talsma, D., Herrmann, C. S., & Woldorff, M. G. (2005). Multisensory processing and oscillatory gamma responses: effects of spatial selective attention. Experimental Brain Research, 166(3–4), 411–26. https://doi.org/10.1007/s00221-005-2381-z
Shams, L., & Kim, R. (2010). Crossmodal influences on visual perception. Physics of Life Reviews, 7(3), 269–84. https://doi.org/10.1016/J.PLREV.2010.04.006
Shams, L., & Seitz, A. R. (2008). Benefits of multisensory learning. Trends in Cognitive Sciences, 12(11), 411–17. https://doi.org/10.1016/J.TICS.2008.07.006
Shenhav, A., Botvinick, M. M., & Cohen, J. D. (2013). The expected value of control: an integrative theory of anterior cingulate cortex function. Neuron, 79(2), 217–40. https://doi.org/10.1016/J.NEURON.2013.07.007
Simon, J. R., Hinrichs, J. V., & Craft, J. L. (1970). Auditory S-R compatibility: reaction time as a function of ear–hand correspondence and ear–response–location correspondence. Journal of Experimental Psychology, 86(1), 97–102.
Simon, J. R., & Small, A. M. Jr (1969). Processing auditory information: interference from an irrelevant cue. Journal of Applied Psychology, 53(5), 433.
Skipper, J. I. (2014). Echoes of the spoken past: how auditory cortex hears context during speech perception. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, 369(1651), 20130297. https://doi.org/10.1098/rstb.2013.0297
Skipper, J. I., Goldin-Meadow, S., Nusbaum, H. C., & Small, S. L. (2009). Gestures orchestrate brain networks for language understanding. Current Biology, 19(8), 661–7. https://doi.org/10.1016/J.CUB.2009.02.051
Skipper, J. I., Van Wassenhove, V., Nusbaum, H. C., & Small, S. L. (2007). Hearing lips and seeing voices: how cortical areas supporting speech production mediate audiovisual speech perception. Cerebral Cortex, 17(10), 2387–99.
Smilek, D., Birmingham, E., Cameron, D., Bischof, W., & Kingstone, A. (2006). Cognitive ethology and exploring attention in real-world scenes. Brain Research, 1080(1), 101–19. https://doi.org/10.1016/J.BRAINRES.2005.12.090
Smilek, D., Eastwood, J. D., Reynolds, M. G., & Kingstone, A. (2007). Metacognitive errors in change detection: missing the gap between lab and life. Consciousness and Cognition, 16(1), 52–7. https://doi.org/10.1016/J.CONCOG.2006.04.001
Smith, R. E., MacKenzie, S. B., Yang, X., Buchholz, L. M., & Darley, W. K. (2007). Modeling the determinants and effects of creativity in advertising. Marketing Science, 26(6), 819–33.
So, W. C., Sim Chen-Hui, C., & Low Wei-Shan, J. (2012). Mnemonic effect of iconic gesture and beat gesture in adults and children: is meaning in gesture important for memory recall? Language and Cognitive Processes, 27(5), 665–81. https://doi.org/10.1080/01690965.2011.573220
Soto-Faraco, S., Morein-Zamir, S., & Kingstone, A. (2005). On audiovisual spatial synergy: the fragility of the phenomenon. Perception & Psychophysics, 67(3), 444–57. https://doi.org/10.3758/BF03193323
Soto-Faraco, S., Navarra, J., & Alsius, A. (2004). Assessing automaticity in audiovisual speech integration: evidence from the speeded classification task. Cognition, 92(3), B13–B23. https://doi.org/10.1016/j.cognition.2003.10.005
Spence, C. (2011). Crossmodal correspondences: A tutorial review. Attention, Perception, & Psychophysics, 73(4), 971–95. https://doi.org/10.3758/s13414-010-0073-7
Spence, C. (2016). Multisensory packaging design: color, shape, texture, sound, and smell. In Burgess, P. (ed.), Integrating the packaging and product experience in food and beverages: a road-map to consumer satisfaction (pp. 1–22). Woodhead Publishing. https://doi.org/10.1016/B978-0-08-100356-5.00001-2
Spence, C. (2018). Multisensory perception. In Wixted, J. (Ed.-in-chief) & Serences, J. (Vol. ed.), Stevens’ handbook of experimental psychology and cognitive neuroscience (4th ed., pp. 1–56). Hoboken, NJ: John Wiley & Sons, Inc. https://doi.org/10.1002/9781119170174.epcn214
Spence, C., & Driver, J. (2004). Crossmodal space and crossmodal attention. Oxford: Oxford University Press.
Spence, C., & Ho, C. (2008). Tactile and multisensory spatial warning signals for drivers. IEEE Transactions on Haptics, 1(2), 121–9. https://doi.org/10.1109/TOH.2008.14
Spence, C., & Ho, C. (2015a). Crossmodal attention: from the laboratory to the real world (and back again). In Fawcett, J. M., Risko, E. F., & Kingstone, A. (eds.), The handbook of attention (pp. 119–38). Cambridge, MA: Massachusetts Institute of Technology Press.
Spence, C., & Ho, C. (2015b). Multisensory information processing. In Boehm-Davis, D. A., Durso, F. T., & Lee, J. D. (eds.), APA handbook of human systems integration (pp. 435–48). Washington: American Psychological Association. https://doi.org/10.1037/14528-027
Spence, C., & Santangelo, V. (2009). Capturing spatial attention with multisensory cues: a review. Hearing Research, 258(1–2), 134–42. https://doi.org/10.1016/j.heares.2009.04.015
Spence, C., & Soto-Faraco, S. (in press). Crossmodal attention applied: lessons for/from driving. In Chun, M. (ed.), Cambridge elements of attention. Cambridge, UK: Cambridge University Press.
Spence, C., & Squire, S. (2003). Multisensory integration: maintaining the perception of synchrony. Current Biology, 13(13), R519–R521. https://doi.org/10.1016/S0960-9822(03)00445-7
Spiers, H. J., & Maguire, E. A. (2006). Thoughts, behaviour, and brain dynamics during navigation in the real world. NeuroImage, 31(4), 1826–40. https://doi.org/10.1016/J.NEUROIMAGE.2006.01.037
Stein, B. E. (ed.). (2012). The new handbook of multisensory processes. Cambridge, MA: Massachusetts Institute of Technology Press.
Stein, B. E., London, N., Wilkinson, L. K., & Price, D. D. (1996). Enhancement of perceived visual intensity by auditory stimuli: a psychophysical analysis. Journal of Cognitive Neuroscience, 8(6), 497–506. https://doi.org/10.1162/jocn.1996.8.6.497
Stein, B. E., & Meredith, M. A. (1993). The merging of the senses. Cambridge, MA: Massachusetts Institute of Technology Press.
Stein, B. E., & Stanford, T. R. (2008). Multisensory integration: current issues from the perspective of the single neuron. Nature Reviews Neuroscience, 9(4), 255–66. https://doi.org/10.1038/nrn2331
Stekelenburg, J. J., & Vroomen, J. (2007). Neural correlates of multisensory integration of ecologically valid audiovisual events. Journal of Cognitive Neuroscience, 19(12), 1964–73. https://doi.org/10.1162/jocn.2007.19.12.1964
Stekelenburg, J. J., & Vroomen, J. (2012). Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events. Frontiers in Integrative Neuroscience, 6, 26.
Stoffer, T. H. (1991). Attentional focussing and spatial stimulus-response compatibility. Psychological Research, 53(2), 127–35.
Stroop, J. R. (1935). Studies of interference in serial verbal reactions. Journal of Experimental Psychology, 18(6), 643–62. https://doi.org/10.1037/h0054651
Suied, C., Bonneel, N., & Viaud-Delmon, I. (2009). Integration of auditory and visual information in the recognition of realistic objects. Experimental Brain Research, 194(1), 91–102. https://doi.org/10.1007/s00221-008-1672-6
Sumby, W. H., & Pollack, I. (1954). Visual contribution to speech intelligibility in noise. The Journal of the Acoustical Society of America, 26(2), 212–15. https://doi.org/10.1121/1.1907309
Szycik, G. R., Jansma, H., & Münte, T. F. (2009). Audiovisual integration during speech comprehension: an fMRI study comparing ROI-based and whole brain analyses. Human Brain Mapping, 30(7), 1990–9.
Talsma, D. (2015). Predictive coding and multisensory integration: an attentional account of the multisensory mind. Frontiers in Integrative Neuroscience, 9, 19. https://doi.org/10.3389/fnint.2015.00019
Talsma, D., Doty, T. J., & Woldorff, M. G. (2006). Selective attention and audiovisual integration: is attending to both modalities a prerequisite for early integration? Cerebral Cortex, 17(3), 679–90. https://doi.org/10.1093/cercor/bhk016
Talsma, D., Senkowski, D., Soto-Faraco, S., & Woldorff, M. G. (2010). The multifaceted interplay between attention and multisensory integration. Trends in Cognitive Sciences, 14(9), 400–10. https://doi.org/10.1016/j.tics.2010.06.008
Talsma, D., & Woldorff, M. G. (2005). Selective attention and multisensory integration: multiple phases of effects on the evoked brain activity. Journal of Cognitive Neuroscience, 17(7), 1098–114. https://doi.org/10.1162/0898929054475172
ten Oever, S., Romei, V., van Atteveldt, N., Soto-Faraco, S., Murray, M. M., & Matusz, P. J. (2016). The COGs (context, object, and goals) in multisensory processing. Experimental Brain Research, 234(5). https://doi.org/10.1007/s00221-016-4590-z
Thorne, J. D., De Vos, M., Viola, F. C., & Debener, S. (2011). Cross-modal phase reset predicts auditory task performance in humans. The Journal of Neuroscience: The Official Journal of the Society for Neuroscience, 31(10), 38533861. https://doi.org/10.1523/JNEUROSCI.6176–10.2011Google Scholar
Thut, G., Veniero, D., Romei, V., Miniussi, C., Schyns, P., & Gross, J. (2011). Rhythmic TMS causes local entrainment of natural oscillatory signatures. Current Biology, 21(14), 1176–85. https://doi.org/10.1016/J.CUB.2011.05.049Google Scholar
Tiippana, K. (2014). What is the McGurk effect? Frontiers in Psychology, 5, 725.Google Scholar
Tiippana, K., Andersen, T. S., & Sams, M. (2004). Visual attention modulates audiovisual speech perception. European Journal of Cognitive Psychology, 16(3), 457–72. https://doi.org/10.1080/09541440340000268Google Scholar
Umiltà, C., & Liotti, M. (1987). Egocentric and relative spatial codes in SR compatibility. Psychological Research, 49(2–3), 8190.Google Scholar
Urbantschitsch, V. (1880). Beobachtungen über centrale Acusticusaffectionen. Archiv Für Ohrenheilkunde, 16(3), 171–87.Google Scholar
van Atteveldt, N., Murray, M. M., Thut, G., & Schroeder, C. E. (2014). Multisensory integration: flexible use of general operations. Neuron, 81(6), 1240–53. https://doi.org/10.1016/J.NEURON.2014.02.044Google Scholar
van de Par, S., & Kohlrausch, A. (2004). Visual and auditory object selection based on temporal correlations between auditory and visual cues. In 18th Int. Congress on Acoustics (pp. 49). Kyoto.Google Scholar
Van Der Burg, E., Olivers, C. N. L., Bronkhorst, A. W., & Theeuwes, J. (2008). Pip and pop: nonspatial auditory signals improve spatial visual search. Journal of Experimental Psychology: Human Perception and Performance, 34(5), 1053–65. https://doi.org/10.1037/0096–1523.34.5.1053Google Scholar
Van der Burg, E., Talsma, D., Olivers, C. N. L., Hickey, C., & Theeuwes, J. (2011). Early multisensory interactions affect the competition among multiple visual objects. NeuroImage, 55(3), 1208–18. https://doi.org/10.1016/J.NEUROIMAGE.2010.12.068Google Scholar
van Wassenhove, V., Grant, K. W., & Poeppel, D. (2005). Visual speech speeds up the neural processing of auditory speech. Proceedings of the National Academy of Sciences of the United States of America, 102(4), 1181–6. https://doi.org/10.1073/pnas.0408949102Google Scholar
van Wassenhove, V., Grant, K. W., & Poeppel, D. (2007). Temporal window of integration in auditory-visual speech perception. Neuropsychologia, 45(3), 598607. https://doi.org/10.1016/J.NEUROPSYCHOLOGIA.2006.01.001Google Scholar
Varela, F., Toro, A., John, E., & Schwartz, E. (1981). Perceptual framing and cortical alpha rhythm. Neuropsychologia, 19(5).
Vatakis, A., & Spence, C. (2006). Audiovisual synchrony perception for music, speech, and object actions. Brain Research, 1111(1), 134–42. https://doi.org/10.1016/j.brainres.2006.05.078
Vatakis, A., & Spence, C. (2007). Crossmodal binding: evaluating the ‘unity assumption’ using audiovisual speech stimuli. Perception & Psychophysics, 69(5), 744–56. https://doi.org/10.3758/BF03193776
Vercillo, T., Tonelli, A., Goodale, M., & Gori, M. (2017). Restoring an allocentric reference frame in blind individuals through echolocation. The Journal of the Acoustical Society of America, 141(5), 3453. https://doi.org/10.1121/1.4987160
Vroomen, J., Bertelson, P., & de Gelder, B. (2001). Directing spatial attention towards the illusory location of a ventriloquized sound. Acta Psychologica, 108(1), 21–33. https://doi.org/10.1016/S0001-6918(00)00068-8
Vroomen, J., & de Gelder, B. (2000). Sound enhances visual perception: cross-modal effects of auditory organization on vision. Journal of Experimental Psychology: Human Perception and Performance, 26(5), 1583–90. https://doi.org/10.1037/0096-1523.26.5.1583
Vroomen, J., & Keetels, M. (2010). Perception of intersensory synchrony: a tutorial review. Attention, Perception, & Psychophysics, 72(4), 871–84. https://doi.org/10.3758/APP.72.4.871
Wallace, R. J. (1972). Spatial S–R compatibility effects involving kinesthetic cues. Journal of Experimental Psychology, 93(1), 163–8.
Ward, L. M. (2002). Dynamical cognitive science. Cambridge, MA: Massachusetts Institute of Technology Press.
Weissman, D. H., Giesbrecht, B., Song, A. W., Mangun, G. R., & Woldorff, M. G. (2003). Conflict monitoring in the human anterior cingulate cortex during selective attention to global and local object features. NeuroImage, 19(4), 1361–8.
Welch, R. B. (1999). Chapter 15: meaning, attention, and the ‘unity assumption’ in the intersensory bias of spatial and temporal perceptions. Advances in Psychology, 129, 371–87. https://doi.org/10.1016/S0166-4115(99)80036-3
Welch, R. B., DuttonHurt, L. D., & Warren, D. H. (1986). Contributions of audition and vision to temporal rate perception. Perception & Psychophysics, 39(4), 294–300. https://doi.org/10.3758/BF03204939
Wiggs, C. L., & Martin, A. (1998). Properties and mechanisms of perceptual priming. Current Opinion in Neurobiology, 8, 227–33.
Wolfe, J. M., Horowitz, T. S., & Kenner, N. M. (2005). Rare items often missed in visual searches. Nature, 435(7041), 439–40. https://doi.org/10.1038/435439a
Wu, C.-C., Wick, F. A., & Pomplun, M. (2014). Guidance of visual attention by semantic information in real-world scenes. Frontiers in Psychology, 5, 54. https://doi.org/10.3389/fpsyg.2014.00054
Yamamoto, S., & Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nature Neuroscience, 4, 759–65.
Yarrow, K., Roseboom, W., & Arnold, D. H. (2011). Spatial grouping resolves ambiguity to drive temporal recalibration. Journal of Experimental Psychology: Human Perception and Performance, 37(5), 1657–61. https://doi.org/10.1037/a0024235
Yau, J. M., Olenczak, J. B., Dammann, J. F., & Bensmaia, S. J. (2009). Temporal frequency channels are linked across audition and touch. Current Biology, 19(7), 561–6. https://doi.org/10.1016/j.cub.2009.02.013
Zion Golumbic, E. M., Ding, N., Bickel, S., Lakatos, P., Schevon, C. A., McKhann, G. M., … Schroeder, C. E. (2013). Mechanisms underlying selective neuronal tracking of attended speech at a ‘cocktail party’. Neuron, 77(5), 980–91. https://doi.org/10.1016/j.neuron.2012.12.037