
Crossmodal Attention Applied

Lessons for and from Driving

Published online by Cambridge University Press: 17 August 2020

Charles Spence, University of Oxford
Salvador Soto-Faraco, ICREA and Universitat Pompeu Fabra

Summary

Cognitive neuroscientists have started to uncover the neural substrates, systems, and mechanisms that enable us to prioritize the processing of certain sensory information over other, currently less relevant, inputs. However, there is still a large gap between the knowledge generated in the laboratory and its application to real-life problems of attention, such as when interface operators are multitasking. In this Element, laboratory studies on crossmodal attention (both behavioural/psychophysical and cognitive neuroscience) are situated within the applied context of driving. We contrast the often idiosyncratic conditions favoured by much of the laboratory research, which typically relies on a few popular paradigms and simplified experimental conditions, with the noisy, multisensory, real-world environments filled with complex, intrinsically meaningful stimuli. By drawing attention to the differences between basic and applied studies in the context of driving, we highlight a number of important issues and neglected areas of research in the study of crossmodal attention.
Type: Element

Information
Online ISBN: 9781108919951
Publisher: Cambridge University Press
Print publication: 10 September 2020


Mohebbi, R., Gray, R., & Tan, H. Z. (2009). Driver reaction time to tactile and auditory rear-end collision warnings while talking on a cell phone. Human Factors, 51, 102–10.Google Scholar
Molloy, K, Griffiths, T. D., Chait, M., & Lavie, N. (2015). Inattentional deafness: Visual load leads to time-specific suppression of auditory evoked responses. Journal of Neuroscience, 35, 16046–54. http://doi.org/10.1523/JNEUROSCI.2931-15.2015Google Scholar
Mondor, T. A., & Amirault, K. J. (1998). Effect of same- and different-modality spatial cues on auditory and visual target identification. Journal of Experimental Psychology: Human Perception & Performance, 24, 745–55.Google Scholar
Mondor, T. A., & Zatorre, R. J. (1995). Shifting and focusing auditory spatial attention. Journal of Experimental Psychology: Human Perception and Performance, 21, 387409.Google ScholarPubMed
Montagne, C., & Zhou, Y. (2018). Audiovisual interactions in front and rear space. Frontiers in Psychology, 9, 713. http://doi.org/10.3389/fpsyg.2018.00713Google Scholar
Moore, T., Armstrong, K. M., & Fallah, M. (2003). Visuomotor origins of covert spatial attention. Neuron, 40, 671–83.Google Scholar
Moris-Fernández, L., Visser, M., Ventura-Campos, N., Ávila, C., & Soto-Faraco, S. (2015). Top-down attention regulates the neural expression of audiovisual integration. NeuroImage, 119, 272–85.Google Scholar
Morrell, F. (1972). Visual system’s view of acoustic space. Nature, 238, 44–6.Google Scholar
Morrell, F. (1973). A reply to ‘Comments on “Visual system’s view of acoustic space” by Pöppel’. Nature, 243, 231.Google Scholar
Moseley, G. L., Gallace, A., & Spence, C. (2012). Bodily illusions in health and disease: Physiological and clinical perspectives and the concept of a cortical ‘body matrix’. Neuroscience & Biobehavioural Reviews, 36, 3446.Google Scholar
Mozolic, J. L., Joyner, D., Hugenschmidt, C. E., Peiffer, A. M., Kraft, R. A., Maldjian, J. A., & Laurienti, P. J. (2008). Cross-modal deactivations during modality-specific selective attention. BMC Neurology, 8, 35.Google Scholar
Mueller, H. J., & Rabbitt, P. M. (1989). Reflexive and voluntary orienting of visual attention: Time course of activation and resistance to interruption. Journal of Experimental Psychology: Human Perception and Performance, 15, 315–30. http://doi.org/10.1037/0096-1523.15.2.315Google Scholar
Mühlberg, S., Oriolo, G., & Soto‐Faraco, S. (2014). Cross‐modal decoupling in temporal attention. European Journal of Neuroscience, 39, 2089–97.Google Scholar
Mühlberg, S., & Soto-Faraco, S. (2018). Cross-modal decoupling in temporal attention between audition and touch. Psychological Research, 83, 1626–39. http://doi.org/10.1007/s00426-018-1023-6Google Scholar
Murphy, G., & Greene, C. M. (2017). Load theory behind the wheel; perceptual and cognitive load effects. Canadian Journal of Experimental Psychology, 71, 191202.Google Scholar
Murphy, S., & Dalton, P. (2016). Out of touch? Visual load induces inattentional numbness. Journal of Experimental Psychology: Human Perception and Performance, 42, 761–5.Google Scholar
Murphy, S., Dalton, P., & Spence, C. (2017). Selective attention in vision, audition, and touch. In Menzel, R. (Ed.), Learning theory and behavior, Vol. 1 of Learning and memory: A comprehensive reference, 2nd ed., J. Byrne (Series Ed.) (pp. 155–70). Oxford, UK: Academic Press.Google Scholar
Murphy, S., Spence, C., & Dalton, P. (2017). Auditory perceptual load: A critical review. Hearing Research, 352, 40–8.Google Scholar
Nagel, S. K., Carl, C., Kringe, T., Märtin, R., & König, P. (2005). Beyond sensory substitution-learning the sixth sense. Journal of Neural Engineering, 2, R13R26.Google Scholar
Navarra, J., Alsius, A., Soto-Faraco, S., & Spence, C. (2009). Assessing the role of attention in the audiovisual integration of speech. Information Fusion, 11, 411.Google Scholar
Naveteur, J., Honore, J., & Michael, G. A. (2005). How to detect an electrocutaneous shock which was not delivered? Overt spatial attention influences decision. Behavioural Brain Research, 165, 254–61.Google Scholar
Ngo, M. K., Pierce, R., & Spence, C. (2012). Utilizing multisensory cues to facilitate air traffic management. Human Factors, 54, 1093–103.Google Scholar
Nicholls, M. E. R., Roden, S., Thomas, N. A., Loetscher, T., Spence, C., & Forte, J. (2014). Close to me: The effect of asymmetrical environments on spatial attention. Ergonomics, 57, 876–85.Google Scholar
Noel, J.-P., Grivaz, P., Marmaroli, P., Lissek, H., Blanke, O., & Serino, A. (2015). Full body action remapping of peripersonal space: The case of walking. Neuropsychologia, 70, 375–84.Google Scholar
Noel, J.-P., Lukowska, M., Wallace, M., & Serino, A. (2016) Multisensory simultaneity judgment and proximity to the body. Journal of Vision, 16, 21. http://doi.org/10.1167/16.3.21Google Scholar
Noel, J.-P., Modi, K., Wallace, M. T., & Van der Stoep, N. (2018). Audiovisual integration in depth: Multisensory binding and gain as a function of distance. Experimental Brain Research, 236, 1939 –51.Google Scholar
Nuku, P. & Bekkering, H. (2010). When one sees what the other hears: Crossmodal attentional modulation for gazed and non-gazed upon auditory targets. Consciousness and Cognition, 19, 135–43.Google Scholar
Occelli, V., Hartcher-O’Brien, J., Spence, C., & Zampini, M. (2010). Assessing the audiotactile Colavita effect in near and rear space. Experimental Brain Research, 203, 517–32.Google Scholar
Occelli, V., Spence, C., & Zampini, M. (2011). Audiotactile interactions in front and rear space. Neuroscience & Biobehavioral Reviews, 35, 589–98.Google Scholar
Odegaard, B., Wozny, D. R., & Shams, L. (2016). The effects of selective and divided attention on sensory precision and integration. Neuroscience Letters, 614, 24–8.Google Scholar
Ora, H., Wada, M., Salat, D., & Kansaku, K. (2016). Arm crossing updates brain functional connectivity of the left posterior parietal cortex. Scientific Reports, 6:28105. http://doi.org/10.1038/srep28105.Google Scholar
Oray, S., Lu, Z. L., & Dawson, M. E. (2002). Modification of sudden onset auditory ERP by involuntary attention to visual stimuli. International Journal of Psychophysiology, 43, 213–24.Google Scholar
Orchard-Mills, E., Alais, D., & Van der Burg, E. (2013a). Amplitude-modulated auditory stimuli influence selection of visual spatial frequencies. Journal of Vision, 13, 6, 117.Google Scholar
Orchard-Mills, E, Alais, D., & Van der Burg, E. (2013b). Cross-modal associations between vision, touch and audition influence visual search through top-down attention, not bottom-up capture. Attention, Perception & Psychophysics, 75, 1892–1905.Google Scholar
Orchard-Mills, E., Van der Burg, E., & Alais, D. (2016). Crossmodal correspondence between auditory pitch and visual elevation affects temporal ventriloquism. Perception, 45, 409–24.Google Scholar
Orioli, G., Bremner, A. J., & Farroni, T. (2018). Multisensory perception of looming and receding objects in human newborns. Current Biology, 28, R1283R1295.Google Scholar
Oskarsson, P.-A., Eriksson, L., & Carlander, O. (2012). Enhanced perception and performance by multimodal threat cueing in simulated combat vehicle. Human Factors, 54, 122–37.Google Scholar
Otten, L. J., Alain, C., & Picton, T. W. (2000). Effects of visual attentional load on auditory processing. NeuroReport, 11, 875–80.Google Scholar
Otto, T. U., & Mamassian, P. (2012). Noise and correlations in parallel perceptual decision making. Current Biology, 22, 1391–6.Google Scholar
Overvliet, K. O., Azañón, E., & Soto-Faraco, S. (2011). Somatosensory saccades reveal the timing of tactile spatial remapping. Neuropsychologia, 49, 3046–52.Google Scholar
Oving, A. B., Veltmann, J. A., & Bronkhorst, A. W. (2004). Effectiveness of 3-D audio for warnings in the cockpit. International Journal of Aviation Psychology, 14, 257–76.Google Scholar
Oyer, J., & Hardick, J. (1963). Response of population to optimum warning signal. Office of Civil Defence, Final Report No. SHSLR163. Contract No. OCK-OS-62–182, September.Google Scholar
Palmiero, M., Piccardi, L., Boccia, M., Baralla, F., Cordellieri, P., Sgalla, R., Guidoni, U., & Giannini, A. M. (2019). Neural correlates of simulated driving while performing a secondary task: A review. Frontiers in Psychology, 10, 1045. http://doi.org/10.3389/fpsyg.2019.01045Google Scholar
Pápai, M. S. (2017). Behavioral and electrophysiological correlates of cross-modal enhancement for unaware visual events (Doctoral dissertation, Universitat Pompeu Fabra). https://www.tdx.cat/handle/10803/664283Google Scholar
Pápai, M. S., & Soto-Faraco, S. (2017). Sounds can boost the awareness of visual events through attention without cross-modal integration. Scientific Reports, 7, 41684.Google Scholar
Parks, N. A., Hilimire, M. R., & Corballis, P. M. (2009). Visual perceptual load modulates an auditory microreflex. Psychophysiology, 46, 498501.Google Scholar
Pashler, H. (1992). Attentional limitations in doing two tasks at the same time. Current Directions in Psychological Science, 1, 44–8.Google Scholar
Pashler, H. (1994). Dual-task interference in simple tasks: Data and theory. Psychological Bulletin, 116, 220–44.CrossRefGoogle ScholarPubMed
Pashler, H., Johnston, J. C., & Ruthruff, E. (2001). Attention and performance. Annual Review of Psychology, 52, 629–51.Google Scholar
Patten, C. J. D., Kircher, A., Ostlund, J., & Nilsson, L. (2004). Using mobile telephones: Cognitive workload and attention resource allocation. Accident Analysis & Prevention, 36, 341–50.Google Scholar
Peelen, M. V., & Kastner, S. (2014). Attention in the real world: Toward understanding its neural basis. Trends in Cognitive Sciences, 18, 242–50.Google Scholar
Perrott, D. R., Cisneros, J., McKinley, R. L., & D’Angelo, W. (1996). Aurally aided visual search under virtual and free-field listening conditions. Human Factors, 38, 702–15.Google Scholar
Perrott, D. R., Saberi, K., Brown, K., & Strybel, T. Z. (1990). Auditory psychomotor coordination and visual search performance. Perception & Psychophysics, 48, 214–26.Google Scholar
Perrott, D. R., Sadralodabai, T., Saberi, K., & Strybel, T. Z. (1991). Aurally aided visual search in the central visual field: Effects of visual load and visual enhancement of the target. Human Factors, 33, 389400.Google Scholar
Pjetermeijer, S., Bazilinskyy, P., Bengler, K., & de Winter, J. (2017). Take-over again: Investigating multimodal and directional TORs to get the driver back into the loop. Applied Ergonomics, 62, 204–15.Google Scholar
Poliakoff, E., Ashworth, S., Lowe, C., & Spence, C. (2006). Vision and touch in ageing: Crossmodal selective attention and visuotactile spatial interactions. Neuropsychologia, 44, 507–17.Google Scholar
Poliakoff, E., Miles, E., Li, X., & Blanchette, I. (2007). The effect of visual threat on spatial attention to touch. Cognition, 102, 405–14.Google Scholar
Pomper, U., Keil, J., Foxe, J. J., & Senkowski, D. (2015). Intersensory selective attention and temporal orienting operate in parallel and are instantiated in spatially distinct sensory and motor cortices. Human Brain Mapping, 36, 3246–59.Google Scholar
Pöppel, E. (1973). Comments on ‘Visual system’s view of acoustic space’. Nature, 243, 231.Google Scholar
Populin, L. C., & Yin, T. C. T. (1998). Sensitivity of auditory cells in the superior colliculus to eye position in the behaving cat. In Palmer, A. R., Rees, A., Summerfield, Q., & Meddis, R. (Eds.), Psychophysical and physiological advances in hearing (pp. 441–8). London, UK: Whurr.Google Scholar
Populin, L. C., & Yin, T. C. T. (2002). Bimodal interactions in the superior colliculus of the behaving cat. Journal of Neuroscience, 22, 2826–34.Google Scholar
Posner, M. I. (1978). Chronometric explorations of mind. Hillsdale, NJ: Erlbaum.Google Scholar
Posner, M. I. (1990). Hierarchical distributed networks in the neuropsychology of selective attention. In Caramazza, A. (Ed.), Cognitive neuropsychology and neurolinguistics: Advances in models of cognitive function and impairment (pp. 187210). Hillsdale, NJ: Erlbaum.Google Scholar
Posner, M. I., & Cohen, Y. (1984). Components of visual orienting. In Bouma, H. & Bouwhuis, D. G. (Eds.), Attention and performance: Control of language processes (Vol. 10, pp. 531–56). Hillsdale, NJ: Erlbaum.Google Scholar
Potter, M. C., Chun, M. M., Banks, B. S., & Muckenhoupt, M. (1998). Two attentional deficits in serial target search: The visual attentional blink and an amodal task-switch deficit. Journal of Experimental Psychology: Learning, Memory, & Cognition, 24, 979–92.Google Scholar
Pouget, A., Deneve, S., & Duhamel, J.-R. (2002). A computational perspective on the neural basis of multisensory spatial representations. Nature Reviews Neuroscience, 3, 741–7.Google Scholar
Pouget, A., Deneve, S., & Duhamel, J.-R. (2004). A computational neural theory of multisensory spatial representations. In Spence, C. & Driver, J. (Eds.), Crossmodal space and crossmodal attention (pp. 123–40). Oxford, UK: Oxford University Press.Google Scholar
Pratt, J., & Abrams, R. A. (1999). Inhibition of return in discrimination tasks. Journal of Experimental Psychology: Human Perception and Performance, 25, 229–42.Google Scholar
Previc, F. H. (1998). The neuropsychology of 3-D space. Psychological Bulletin, 124, 123–64.Google Scholar
Previc, F. H. (2000). Neuropsychological guidelines for aircraft control stations. IEEE Engineering in Medicine and Biology Magazine, 19, 81–8.Google Scholar
Prime, D. J., McDonald, J. J., Green, J., & Ward, L. M. (2008). When crossmodal attention fails: A controversy resolved? Canadian Journal of Experimental Psychology, 62, 192–7.Google Scholar
Prinzmetal, W., McCool, C., & Park, S. (2005a). Attention: Reaction time and accuracy reveal different mechanisms. Journal of Experimental Psychology: General, 134, 7392.Google Scholar
Prinzmetal, W., Park, S., & Garrett, R. (2005b). Involuntary attention and identification accuracy. Perception & Psychophysics, 67, 1344–53.Google Scholar
Proctor, R. W., & Vu, K.-P. L. (2016). Principles for designing interfaces compatible with human information processing. International Journal of Human Computer Interaction, 32, 222.Google Scholar
Ramachandran, V. S., Altschuler, E. L., & Hillyer, S. (1997). Mirror agnosia. Proceedings of the Royal Society London B, 264, 645–7.Google Scholar
Raymond, J. E., Shapiro, K. L., & Arnell, K. M. (1992). Temporary suppression of visual processing in an RSVP task: An attentional blink? Journal of Experimental Psychology: Human Perception and Performance, 18, 849–60.Google Scholar
Redelmeier, D. A., & Tibshirani, R. J. (1997). Association between cellular-telephone calls and motor vehicle collisions. New England Journal of Medicine, 336, 453–8.Google Scholar
Reed, C. L., Grubb, J. D., & Steele, C. (2006). Hands up: Attentional prioritization of space near the hand. Journal of Experimental Psychology: Human Perception & Performance, 32, 166–77.Google Scholar
Rees, G., Frith, C., & Lavie, N. (2001). Processing of irrelevant visual motion during performance of an auditory attention task. Neuropsychologia, 39, 937–49.Google Scholar
Reisberg, D. (1978). Looking where you listen: Visual cues and auditory attention. Acta Psychologica, 42, 331–41.Google Scholar
Ribot, T. (1898). The psychology of attention. Chicago, IL: Open Court Publishing Company.Google Scholar
Richard, C. M., Wright, R. D., Ee, C., Prime, S. L., Shimizu, Y., & Vavrik, J. (2002). Effect of a concurrent auditory task on visual search performance in a driving-related image-flicker task. Human Factors, 44, 108–19.Google Scholar
Risko, E. F., & Kingstone, A. (2011). Eyes wide shut: Implied social presence, eye tracking and attention. Attention, Perception, & Psychophysics, 73, 291–6.Google Scholar
Risko, E. F., Richardson, D. C., & Kingstone, A. (2016). Breaking the fourth wall of cognitive science: Real-world social attention and the dual function of gaze. Current Directions in Psychological Science, 25, 70–4.Google Scholar
Röder, B., & Büchel, C. (2009). Multisensory interactions within and outside the focus of visual spatial attention (commentary on Fairhall & Macaluso). European Journal of Neuroscience, 29, 1245–6.Google Scholar
Röder, B., Rösler, F., & Spence, C. (2004). Early vision impairs tactile perception in the blind. Current Biology, 14, 121–4.Google Scholar
Romei, V., Gross, J., & Thut, G. (2012). Sounds reset rhythms of visual cortex and corresponding human visual perception. Current Biology, 22, 807–13. http://doi.org/10.1016/j.cub.2012.03.025Google Scholar
Rorden, C., & Driver, J. (1999). Does auditory attention shift in the direction of an upcoming saccade? Neuropsychologia, 37, 357–77.Google Scholar
Rozin, P. (2006). Domain denigration and process preference in academic psychology. Perspectives in Psychological Science, 1, 365–76.Google Scholar
Rudmann, D. S., & Strybel, T. Z. (1999). Auditory spatial facilitation of visual search performance: Effect of cue precision and distractor density. Human Factors, 41, 146–60.Google Scholar
Ruzzoli, M., & Soto-Faraco, S. (2017). Modality-switching in the Simon task: The clash of reference frames. Journal of Experimental Psychology: General, 146, 1478–97.Google Scholar
Sambo, C. F., & Iannetti, G. D. (2013). Better safe than sorry? The safety margin surrounding the body is increased by anxiety. Journal of Neuroscience, 33, 14225–30. http://doi.org/10.1523/JNEUROSCI.0706–13.2013Google Scholar
Sanabria, D., Soto-Faraco, S., & Spence, C. (2007). Spatial attention modulates audiovisual interactions in apparent motion. Journal of Experimental Psychology: Human Perception and Performance, 33, 927–37.Google Scholar
Sandhu, R., & Dyson, B. J. (2016). Cross-modal perceptual load: The impact of modality and individual differences. Experimental Brain Research, 234, 1279–91.Google Scholar
Santangelo, V., Belardinelli, M. O., & Spence, C. (2007). The suppression of reflexive visual and auditory orienting when attention is otherwise engaged. Journal of Experimental Psychology: Human Perception & Performance, 33, 137–48.Google Scholar
Santangelo, V., Finoia, P., Raffone, A., Olivetti Belardinelli, M., & Spence, C. (2008a). Perceptual load affects exogenous spatial orienting while working memory load does not. Experimental Brain Research, 184, 371–82.Google Scholar
Santangelo, V., Ho, C., & Spence, C. (2008b). Capturing spatial attention with multisensory cues. Psychonomic Bulletin & Review, 15, 398403.Google Scholar
Santangelo, V., Olivetti Belardinelli, M., Spence, C., & Macaluso, E. (2009). Multisensory interactions between voluntary and stimulus-driven spatial attention mechanisms across sensory modalities. Journal of Cognitive Neuroscience, 21, 2384–97.Google Scholar
Santangelo, V., & Spence, C. (2007a). Multisensory cues capture spatial attention regardless of perceptual load. Journal of Experimental Psychology: Human Perception & Performance, 33, 1311–21.Google Scholar
Santangelo, V., & Spence, C. (2007b). Assessing the automaticity of the exogenous orienting of tactile attention. Perception, 36, 1497–505.Google Scholar
Santangelo, V., & Spence, C. (2007c). Assessing the effect of verbal working memory load on visuo-spatial exogenous orienting. Neuroscience Letters, 413, 105–9.Google Scholar
Santangelo, V., & Spence, C. (2008a). Is the exogenous orienting of spatial attention truly automatic? Evidence from unimodal and multisensory studies. Consciousness and Cognition, 17, 9891015.Google Scholar
Santangelo, V., & Spence, C. (2008b). Crossmodal attentional capture in an unspeeded simultaneity judgement task. Visual Cognition, 16, 155–65.Google Scholar
Sarter, N. B. (2000). The need for multisensory interfaces in support of effective attention allocation in highly dynamic event-driven domains: The case of cockpit automation. International Journal of Aviation Psychology, 10, 231–45.Google Scholar
Sarter, N. B. (2002). Multimodal information presentation in support of human-automation communication and coordination. In Salas, E. (Ed.), Advances in human performance and cognitive engineering research (pp. 1336). New York, NY: JAI Press.Google Scholar
Sarter, N. B. (2006). Multimodal human-machine interfaces: Design guidance and research challenges. International Journal of Industrial Ergonomics, 36, 439–45.Google Scholar
Sarter, N. B. (2007). Multiple-resource theory as a basis for multimodal interface design: Success stories, qualifications, and research needs. In Kramer, A. F., Wiegmann, D. A., & Kirlik, A. (Eds.), Attention: From theory to practice (pp. 187–95). Oxford, UK: Oxford University Press.Google Scholar
Scerra, V., & Brill, J. C. (2012). Effect of task modality on dual-task performance, response time, and ratings of operator workload. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 56, 1456–60.Google Scholar
Scharf, B. (1998). Auditory attention: The psychoacoustical approach. In Pashler, H. (Ed.), Attention (pp. 75117). London, UK: Psychology Press.Google Scholar
Schicke, T., & Röder, B. (2006). Spatial remapping of touch: Confusion of perceived stimulus order across hand and foot. Proceedings of the National Academy of Sciences of the United States of America, 103, 11808–13. http://doi.org/10.1073/pnas.0601486103Google Scholar
Schicke, T., Bauer, F., & Röder, B. (2009). Interactions of different body parts in peripersonal space: How vision of the foot influences tactile perception at the hand. Experimental Brain Research, 192, 703–15.Google Scholar
Schmitt, M., Postma, A., & de Haan, E. (2000). Interactions between exogenous auditory and visual spatial attention. Quarterly Journal of Experimental Psychology, 53A, 105–30.Google Scholar
Schmitt, M., Postma, A., & de Haan, E. (2001). Cross-modal exogenous attention and distance effects in vision and hearing. European Journal of Cognitive Psychology, 13, 343–68.Google Scholar
Schneider, K. A., & Bavelier, D. (2003). Components of visual prior entry. Cognitive Psychology, 47, 333–66.Google Scholar
Schreiber, T., & White, T. L. (2013). Detect, reject, focus: The role of satiation and odor relevance in cross-modal attention. Chemosensory Perception, 6, 170–8.Google Scholar
Schroeder, C. E., & Lakatos, P. (2009). Low-frequency neural oscillations as instruments of sensory selection. Trends in Neurosciences, 32, 918.Google Scholar
Schumacher, E. H., Seymour, T. L., Glass, J. M., Fencsik, D. E., Lauber, E. J., Kieras, D. E., & Meyer, D. E. (2001). Virtually perfect time sharing in dual-task performance: Uncorking the central cognitive bottleneck. Psychological Science, 12, 101–8.Google Scholar
Seigneuric, A., Durand, K., Jiang, T., Baudouin, J.-Y., & Schaal, B. (2010). The nose tells it to the eyes: Crossmodal associations between olfaction and vision. Perception, 39, 1541–54.Google Scholar
Senders, J. W., Kristofferson, A. B., Levison, W. H., Dietrich, C. W., & Ward, J. L. (1967). The attentional demand of automobile driving. Highway Research Record, 195, 1533.Google Scholar
Seo, H.-S., Roidl, E., Müller, F., & Negoias, S. (2010). Odors enhance visual attention to congruent objects. Appetite, 54, 544–9.Google Scholar
Serences, J. T., Shomstein, S., Leber, A. B., Golav, X., Egeth, H. E., & Yantis, S. (2005). Coordination of voluntary and stimulus-driven attentional control in human cortex. Psychological Science, 16, 114–22.Google Scholar
Serino, A., Annella, L., & Avenanti, A. (2009). Motor properties of peripersonal space in humans. PLoS One, 4, e6582.Google Scholar
Shiffrin, R. M., & Grantham, D. W. (1974). Can attention be allocated to sensory modalities? Perception & Psychophysics, 15, 460–74.Google Scholar
Shomstein, S., & Yantis, S. (2004). Control of attention shifts between vision and audition in human cortex. Journal of Neuroscience, 24, 10702–6.Google Scholar
Shore, D. I., Barnes, M. E., & Spence, C. (2006). The temporal evolution of the crossmodal congruency effect. Neuroscience Letters, 392, 96100.Google Scholar
Shore, D. I., Spence, C., & Klein, R. M. (2001). Visual prior entry. Psychological Science, 12, 205–12.Google Scholar
Shore, D. I., Spry, E., & Spence, C. (2002). Confusing the mind by crossing the hands. Cognitive Brain Research, 14, 153–63.Google Scholar
Simon, J. R. (1990). The effects of an irrelevant directional cue on human information processing. In Proctor, R. W. & Reeve, T. G. (Eds.), Stimulus-response compatibility (pp. 3186). Amsterdam, NL: Elsevier Science.Google Scholar
Simon, J. R., & Craft, J. L. (1970). Effects of an irrelevant auditory stimulus on visual choice reaction time. Journal of Experimental Psychology, 86, 272–4.Google Scholar
Sinnett, S., Costa, A., & Soto-Faraco, S. (2006). Manipulating inattentional blindness within and across sensory modalities. Quarterly Journal of Experimental Psychology, 59, 1425–42.CrossRefGoogle ScholarPubMed
Sivak, M. (1996). The information that drivers use: Is it indeed 90% visual? Perception, 25, 1081–9.Google Scholar
Soret, R., Hurter, C., & Peysakhovich, V. (2019). Attentional orienting in real and virtual 360-degree environments: Applications to aeronautics. Paper presented at The 11th ACM Symposium Conference, June. http://doi.org/10.1145/3314111.3322871Google Scholar
Soto-Faraco, S., Biau, E., Moris-Fernandez, L., Ikumi, N., Kvasova, D., Ruzzoli, M., & Torralba, M. (2019). Multisensory integration in the real world. Cambridge elements of perception. Cambridge, UK: Cambridge University Press.Google Scholar
Soto-Faraco, S., Morein-Zamir, S., & Kingstone, A. (2005). On audiovisual spatial synergy: The fragility of the phenomenon. Perception & Psychophysics, 67, 444–57.Google Scholar
Soto-Faraco, S., Navarra, J., & Alsius, A. (2004). Assessing automaticity in audiovisual speech integration: Evidence from the speeded classification task. Cognition, 92, B13B23.Google Scholar
Soto-Faraco, S., & Spence, C. (2002). Modality-specific auditory and visual temporal processing deficits. Quarterly Journal of Experimental Psychology (A), 55, 2340.Google Scholar
Soto-Faraco, S., Spence, C., Fairbank, K., Kingstone, A., Hillstrom, A. P., & Shapiro, K. (2002). A crossmodal attentional blink between vision and touch. Psychonomic Bulletin & Review, 9, 731–8.Google Scholar
Spence, C. (2008). Searching for the bottleneck in the brain. Current Biology, 18, R965R968.Google Scholar
Spence, C. (2010a). Crossmodal attention. Scholarpedia, 5, 6309. http://doi.org/10.4249/scholarpedia.6309Google Scholar
Spence, C. (2010b). Crossmodal spatial attention. Annals of the New York Academy of Sciences (The Year in Cognitive Neuroscience), 1191, 182200.Google Scholar
Spence, C. (2011a). Crossmodal correspondences: A tutorial review. Attention, Perception, & Psychophysics, 73, 971–95.Google Scholar
Spence, C. (2011b). Assessing the consequences of tool-use for the representation of peripersonal space in humans. In McCormack, T., Hoerl, C., & Butterfill, S. (Eds.), Tool use and causal cognition (pp. 220–47). Oxford, UK: Oxford University Press.Google Scholar
Spence, C. (2012). Drive safely with neuroergonomics. The Psychologist, 25, 664–7.Google Scholar
Spence, C. (2014). Orienting attention: A crossmodal perspective. In Nobre, A. C. & Kastner, S. (Eds.), The Oxford handbook of attention (pp. 446–71). Oxford, UK: Oxford University Press.Google Scholar
Spence, C. (2018). Multisensory perception. In Wixted, J. (Ed.-in-Chief), J. Serences (Vol. Ed.), The Stevens’ handbook of experimental psychology and cognitive neuroscience (4th ed., Vol. 2, pp. 156). Hoboken, NJ: John Wiley & Sons.Google Scholar
Spence, C. (2019a). On the relative nature of (pitch-based) crossmodal correspondences. Multisensory Research, 32, 235–65.Google Scholar
Spence, C. (2019b). Attending to the chemical senses. Multisensory Research, 32, 635–64.Google Scholar
Spence, C., & Deroy, O. (2013b). How automatic are crossmodal correspondences? Consciousness and Cognition, 22, 245–60. http://doi.org/10.1016/j.concog.2012.12.006Google Scholar
Spence, C. [J.], & Driver, J. (1994). Covert spatial orienting in audition: Exogenous and endogenous mechanisms. Journal of Experimental Psychology: Human Perception and Performance, 20, 555–74.Google Scholar
Spence, C., & Driver, J. (1996). Audiovisual links in endogenous covert spatial attention. Journal of Experimental Psychology: Human Perception and Performance, 22, 1005–30.Google Scholar
Spence, C., & Driver, J. (1997a). Audiovisual links in exogenous covert spatial orienting. Perception & Psychophysics, 59, 122.Google Scholar
Spence, C., & Driver, J. (1997b). Cross-modal links in attention between audition, vision, and touch: Implications for interface design. International Journal of Cognitive Ergonomics, 1, 351–73.Google Scholar
Spence, C., & Driver, J. (1999). A new approach to the design of multimodal warning signals. In Harris, D. (Ed.), Engineering psychology and cognitive ergonomics, Vol. 4: Job design, product design and human-computer interaction (pp. 455–61). Hampshire: Ashgate Publishing.Google Scholar
Spence, C., & Driver, J. (2000). Attracting attention to the illusory location of a sound: Reflexive crossmodal orienting and ventriloquism. NeuroReport, 11, 2057–61.Google Scholar
Spence, C., & Driver, J. (Eds.). (2004). Crossmodal space and crossmodal attention. Oxford: Oxford University Press.Google Scholar
Spence, C., & Ho, C. (2008a). Crossmodal information processing in driving. In Castro, C. (Ed.), Human factors of visual performance in driving (pp. 187200). Boca Raton, FL: CRC Press.Google Scholar
Spence, C., & Ho, C. (2008b). Multisensory warning signals for event perception and safe driving. Theoretical Issues in Ergonomics Science, 9, 523–54.Google Scholar
Spence, C., & Ho, C. (2015a). Crossmodal attention: From the laboratory to the real world (and back again). In Fawcett, J. M., Risko, E. F., & Kingstone, A. (Eds.), The handbook of attention (pp. 119–38). Cambridge, MA: MIT Press.Google Scholar
Spence, C., & Ho, C. (2015b). Multisensory perception. In Boehm-Davis, D. A., Durso, F. T., & Lee, J. D. (Eds.), Handbook of human systems integration (pp. 435–48). Washington, DC: American Psychological Association.Google Scholar
Spence, C., Kettenmann, B., Kobal, G., & McGlone, F. P. (2000). Selective attention to the chemosensory modality. Perception & Psychophysics, 62, 1265–71.Google Scholar
Spence, C., Kettenmann, B., Kobal, G., & McGlone, F. P. (2001a). Attention to olfaction: A psychophysical investigation. Experimental Brain Research, 138, 432–7.Google Scholar
Spence, C., Kettenmann, B., Kobal, G., & McGlone, F. P. (2001b). Shared attentional resources for processing vision and chemosensation. Quarterly Journal of Experimental Psychology, 54A, 775–83.Google Scholar
Spence, C., Kingstone, A., Shore, D. I., & Gazzaniga, M. S. (2001). Representation of visuotactile space in the split brain. Psychological Science, 12, 90–3.Google Scholar
Spence, C., Lee, J., & Van der Stoep, N. (2017). Responding to sounds from unseen locations: Crossmodal attentional orienting in response to sounds presented from the rear. European Journal of Neuroscience, 13733, 114. http://doi.org/10.1111/ejn.13733Google Scholar
Spence, C., Lloyd, D., McGlone, F., Nicholls, M. E. R., & Driver, J. (2000). Inhibition of return is supramodal: A demonstration between all possible pairings of vision, touch and audition. Experimental Brain Research, 134, 42–8.Google Scholar
Spence, C., McDonald, J., & Driver, J. (2004). Exogenous spatial cuing studies of human crossmodal attention and multisensory integration. In Spence, C. & Driver, J. (Eds.), Crossmodal space and crossmodal attention (pp. 277320). Oxford, UK: Oxford University Press.Google Scholar
Spence, C., & Ngo, M. K. (2012). Does attention or multisensory integration explain the crossmodal facilitation of masked visual target identification? In Stein, B. E. (Ed.), The new handbook of multisensory processing (pp. 345–58). Cambridge, MA: MIT Press.Google Scholar
Spence, C., Nicholls, M. E. R., & Driver, J. (2001a). The cost of expecting events in the wrong sensory modality. Perception & Psychophysics, 63, 330–6.Google Scholar
Spence, C., Nicholls, M. E. R., Gillespie, N., & Driver, J. (1998). Cross-modal links in exogenous covert spatial orienting between touch, audition, and vision. Perception & Psychophysics, 60, 544–57.Google Scholar
Spence, C., Parise, C., & Chen, Y.-C. (2011). The Colavita visual dominance effect. In Murray, M. M. & Wallace, M. (Eds.), Frontiers in the neural bases of multisensory processes (pp. 523–50). Boca Raton, FL: CRC Press.Google Scholar
Spence, C., Pavani, F., & Driver, J. (2000). Crossmodal links between vision and touch in covert endogenous spatial attention. Journal of Experimental Psychology: Human Perception & Performance, 26, 1298–319.Google Scholar
Spence, C., Pavani, F., & Driver, J. (2004a). Spatial constraints on visual-tactile crossmodal distractor congruency effects. Cognitive, Affective, & Behavioral Neuroscience, 4, 148–69.Google Scholar
Spence, C., Pavani, F., Maravita, A., & Holmes, N. (2004b). Multisensory contributions to the 3-D representation of visuotactile peripersonal space in humans: Evidence from the crossmodal congruency task. Journal of Physiology (Paris), 98, 171–89.Google Scholar
Spence, C., Pavani, F., Maravita, A., & Holmes, N. P. (2008). Multi-sensory interactions. In Lin, M. C. & Otaduy, M. A. (Eds.), Haptic rendering: Foundations, algorithms, and applications (pp. 2152). Wellesley, MA: AK Peters.Google Scholar
Spence, C., Ranson, J., & Driver, J. (2000). Crossmodal selective attention: Ignoring auditory stimuli presented at the focus of visual attention. Perception & Psychophysics, 62, 410–24.Google Scholar
Spence, C., & Read, L. (2003). Speech shadowing while driving: On the difficulty of splitting attention between eye and ear. Psychological Science, 14, 251–6.Google Scholar
Spence, C., & Santangelo, V. (2009). Capturing spatial attention with multisensory cues: A review. Hearing Research, 258, 134–42.Google Scholar
Spence, C., Shore, D. I., & Klein, R. M. (2001). Multimodal prior entry. Journal of Experimental Psychology: General, 130, 799832.Google Scholar
Spence, C., & Squire, S. B. (2003). Multisensory integration: Maintaining the perception of synchrony. Current Biology, 13, R519R521.Google Scholar
Stanney, K., Samman, S., Reeves, L., Hale, K., Buff, W., Bowers, C., Goldiez, B., Nicholson, D., & Lackey, S. (2004). A paradigm shift in interactive computing: Deriving multimodal design principles from behavioral and neurological foundations. International Journal of Human-Computer Interaction, 17, 229–57.Google Scholar
Stein, B. E., & Meredith, M. A. (1993). The merging of the senses. Cambridge, MA: MIT Press.Google Scholar
Stevenson, R. A., Krueger Fister, J., Barnett, Z. P., Nidiffer, A. R., & Wallace, M. T. (2012). Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance. Experimental Brain Research, 219, 121–37.Google Scholar
Stevenson, R. J., & Attuquayefilo, T. (2013). Human olfactory consciousness and cognition: Its unusual features may not result from unusual functions but from limited neocortical processing resources. Frontiers in Psychology, 4, 819.Google Scholar
Störmer, V. S. (2019). Orienting spatial attention to sounds enhances visual perception. Current Opinion in Psychological Science, 29, 193–8.Google Scholar
Störmer, V. S., Feng, W., Martinez, A., McDonald, J. J., & Hillyard, S. A. (2016) Salient, irrelevant sounds reflexively induce alpha rhythm desynchronization in parallel with slow potential shifts in visual cortex. Journal of Cognitive Neuroscience, 28, 433–45.Google Scholar
Störmer, V. S., McDonald, J. J., & Hillyard, S. A. (2009). Cross-modal cueing of attention alters appearance and early cortical processing of visual stimuli. Proceedings of the National Academy of Sciences of the USA, 106, 22456–61.Google Scholar
Strayer, D. L., Cooper, J. M., Goethe, R. M., McCarty, M. M., Getty, D. J., & Biondi, F. (2019). Assessing the visual and cognitive demands of in-vehicle information systems. Cognitive Research: Principles and Implications, 4, 5.Google Scholar
Strayer, D. L., & Drews, F. A. (2007). Multitasking in the automobile. In Kramer, A. F., Wiegmann, D. A., & Kirlik, A. (Eds.), Attention: From theory to practice (pp. 121–33). Oxford, UK: Oxford University Press.Google Scholar
Strayer, D. L., Drews, F. A., & Johnston, W. A. (2003). Cell phone-induced failures of visual attention during simulated driving. Journal of Experimental Psychology: Applied, 9, 2332.Google Scholar
Strayer, D. L., & Johnston, W. A. (2001). Driven to distraction: Dual-task studies of simulated driving and conversing on a cellular telephone. Psychological Science, 12, 462–6.Google Scholar
Streicher, M. C., & Estes, Z. (2016). Multisensory interaction in product choice: Grasping a product affects choice of other seen products. Journal of Consumer Psychology, 26, 556–65.Google Scholar
Suetomi, T., & Kido, K. (1997). Driver behavior under a collision warning system – A driving simulator study. SAE Technical Publication, 970279, 1242, 7581.Google Scholar
Szczepanski, S. M., & Kastner, S. (2013). Shifting attentional priorities: Control of spatial attention through hemispheric competition. Journal of Neuroscience, 33, 5411–21.Google Scholar
Taffou, M., & Viaud-Delmon, I. (2014). Cynophobic fear adaptively extends peri-personal space. Frontiers in Psychiatry, 5, 122.Google Scholar
Talsma, D., Doty, T. J., Strowd, R., & Woldorff, M. G. (2006). Attentional capacity for processing concurrent stimuli is larger across modalities than within a modality. Psychophysiology, 43, 541–9.Google Scholar
Talsma, D., Doty, T. J., & Woldorff, M. G. (2007). Selective attention and audiovisual integration: Is attending to both modalities a prerequisite for early integration? Cerebral Cortex, 17, 691701.Google Scholar
Talsma, D., Senkowski, D., Soto-Faraco, S., & Woldorff, M. G. (2010). The multifaceted interplay between attention and multisensory integration. Trends in Cognitive Sciences, 14, 400–10.Google Scholar
Talsma, D., & Woldorff, M. G. (2005). Attention and multisensory integration: Multiple phases of effects on the evoked brain activity. Journal of Cognitive Neuroscience, 17, 1098–114.Google Scholar
Tang, X. Y., Wu, J. L., & Shen, Y. (2016). The interactions of multisensory integration with endogenous and exogenous attention. Neuroscience & Biobehavioral Reviews, 61, 208–24.Google Scholar
Taylor-Clarke, M., Kennett, S., & Haggard, P. (2002). Vision modulates somatosensory cortical processing. Current Biology, 12, 233–6.Google Scholar
Tellinghuisen, D. J., & Nowak, E. J. (2003). The inability to ignore auditory distractors as a function of visual task perceptual load. Perception & Psychophysics, 65, 817–28.Google Scholar
Teneggi, C., Canzoneri, E., di Pellegrino, G., & Serino, A. (2013). Social modulation of peripersonal space boundaries. Current Biology, 23, 406–11.Google Scholar
Thomas, N., & Flew, A. (2016). The multisensory integration of auditory distractors and visuospatial attention. Journal of Vision, 16, 147.Google Scholar
Tipper, S. P., Lloyd, D., Shorland, B., Dancer, C., Howard, L. A., & McGlone, F. (1998). Vision influences tactile perception without proprioceptive orienting. Neuroreport, 9, 1741–4.Google Scholar
Tipper, S. P., Phillips, N., Dancer, C., Lloyd, D., Howard, L. A., & McGlone, F. (2001). Vision influences tactile perception at body sites that cannot be viewed directly. Experimental Brain Research, 139, 160–7.Google Scholar
Titchener, E. B. (1908). Lectures on the elementary psychology of feeling and attention. New York, NY: Macmillan.Google Scholar
Treisman, A. M., & Davies, A. (1973). Divided attention to ear and eye. In Kornblum, S. (Ed.) Attention and performance (Vol 4, pp. 101–17). New York, NY: Academic Press.Google Scholar
Tsal, Y., & Benoni, H. (2010). Diluting the burden of load: Perceptual load effects are simply dilution effects. Journal of Experimental Psychology: Human Perception and Performance, 36, 1645–57.Google Scholar
Turatto, M., Benso, F., Galfano, G., Gamberini, L., & Umiltà, C. (2002). Non-spatial attentional shifts between audition and vision. Journal of Experimental Psychology: Human Perception & Performance, 28, 628–39.Google Scholar
Turatto, M., Galfano, G., Bridgeman, B., & Umiltà, C. (2004). Space-independent modality-driven attentional capture in auditory, tactile and visual systems. Experimental Brain Research, 155, 301–10.Google Scholar
Turatto, M., Mazza, V., & Umiltà, C. (2005). Crossmodal object-based attention: Auditory objects affect visual processing. Cognition, 96, B55B64.Google Scholar
Van Damme, S., Crombez, G., & Spence, C. (2009). Is the visual dominance effect modulated by the threat value of visual and auditory stimuli? Experimental Brain Research, 193, 197204.Google Scholar
Van Damme, S., Gallace, A., Spence, C., Crombez, G., & Moseley, G. L. (2009). Does the sight of physical threat induce a tactile processing bias? Modality-specific attentional facilitation induced by viewing threatening pictures. Brain Research, 1253, 100–6.Google Scholar
Van der Burg, E., Olivers, C. N. L., Bronkhorst, A. W., Koelewijn, T., & Theeuwes, J. (2007). The absence of an auditory-visual attentional blink is not due to echoic memory. Perception & Psychophysics, 69, 1230–41.Google Scholar
Van der Burg, E., Olivers, C. N. L., Bronkhorst, A. W., & Theeuwes, J. (2008a). Pip and pop: Non-spatial auditory signals improve spatial visual search. Journal of Experimental Psychology: Human Perception and Performance, 34, 1053–65.Google Scholar
Van der Burg, E., Olivers, C. N. L., Bronkhorst, A. W., & Theeuwes, J. (2008b). Audiovisual events capture attention: Evidence from temporal order judgments. Journal of Vision, 8(2), 110.Google Scholar
van der Lubbe, R. H. J., & Postma, A. (2005). Interruption from auditory and visual onsets even when attention is in a focused state. Experimental Brain Research, 164, 464–71.Google Scholar
Van der Stoep, N., Nijboer, T. C. W., Van der Stigchel, S., & Spence, C. (2015a). Multisensory interactions in the depth plane in front and rear space: A review. Neuropsychologia, 70, 335–49.Google Scholar
Van der Stoep, N., Serino, A., Farnè, A., Di Luca, M., & Spence, C. (2016a). Depth: The forgotten dimension in multisensory research. Multisensory Research, 29, 493524.Google Scholar
Van der Stoep, N., Spence, C., Nijboer, T. C. W., & Van der Stigchel, S. (2015b). On the relative contributions of multisensory integration and crossmodal exogenous spatial attention to multisensory response enhancement. Acta Psychologica, 162, 20–8.Google Scholar
Van der Stoep, N., Van der Stigchel, S., & Nijboer, T. C. (2015c). Exogenous spatial attention decreases audiovisual integration. Attention, Perception, & Psychophysics, 77, 464–82.Google Scholar
Van der Stoep, N., Van der Stigchel, S., Nijboer, T. C. W., & Spence, C. (2017). Visually-induced inhibition of return affects the integration of auditory and visual information. Perception, 46, 617.Google Scholar
Van der Stoep, N., Van der Stigchel, S., Nijboer, T. C. W., & Van der Smagt, M. J. (2016). Audiovisual integration in near and far space: Effects of changes in distance and stimulus effectiveness. Experimental Brain Research, 234, 1175–88.Google Scholar
Van der Stoep, N., Visser-Meily, J. M., Kappelle, L. J., de Kort, P. L., Huisman, K. D., Eijsackers, A. L., et al. (2013). Exploring near and far regions of space: Distance-specific visuospatial neglect after stroke. Journal of Clinical and Experimental Neuropsychology, 35, 799811.Google Scholar
van der Wal, R. C., & Van Dillen, L. F. (2013). Leaving a flat taste in your mouth: Task load reduces taste perception. Psychological Science 24, 1277–84.Google Scholar
van Elk, M., Forget, J., & Blanke, O. (2013). The effect of limb crossing and limb congruency on multisensory integration in peripersonal space for the upper and lower extremities. Consciousness and Cognition, 22, 545–55.Google Scholar
Van Wassenhove, V., Grant, K. W., & Poeppel, D. (2005). Visual speech speeds up the neural processing of auditory speech. Proceedings of the National Academy of Sciences of the USA, 102, 1181–6.Google Scholar
Vibell, J., Klinge, C., Zampini, M., Nobre, A. C., & Spence, C. (2017). Differences between endogenous attention to spatial locations and sensory modalities. Experimental Brain Research, 19, 109–20.Google Scholar
Vibell, J., Klinge, C., Zampini, M., Spence, C., & Nobre, A. C. (2007). Temporal order is coded temporally in the brain: Early ERP latency shifts underlying prior entry in a crossmodal temporal order judgment task. Journal of Cognitive Neuroscience, 19, 109–20.Google Scholar
Vroomen, J., Bertelson, P., & de Gelder, B. (2001a). The ventriloquist effect does not depend on the direction of automatic visual attention. Perception & Psychophysics, 63, 651–9.Google Scholar
Vroomen, J., Bertelson, P., & de Gelder, B. (2001b). Directing spatial attention towards the illusory location of a ventriloquized sound. Acta Psychologica, 108, 2133.Google Scholar
Wahn, B., Keshava, A., Sinnett, S., Kingstone, A., & König, P. (2017). Audiovisual integration is affected by performing a task jointly. Proceedings of the 39th Annual Conference of the Cognitive Science Society (Austin, TX), 12961301.Google Scholar
Wahn, B., & König, P. (2015a). Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration. Frontiers in Psychology, 6, 1084. http://doi.org/10.3389/fpsyg.2015.01084Google Scholar
Wahn, B., & König, P. (2015b). Vision and haptics share spatial attentional resources and visuotactile integration is not affected by high attentional load. Multisensory Research, 28, 371–92. http://doi.org/10.1163/22134808-00002482Google Scholar
Wahn, B., & König, P. (2016). Attentional resource allocation in visuotactile processing depends on the task, but optimal visuotactile integration does not depend on attentional resources. Frontiers in Integrative Neuroscience, 10, 13.Google Scholar
Wahn, B., & König, P. (2017). Is attentional resource allocation across sensory modalities task-dependent? Advances in Cognitive Psychology, 13, 8396. http://doi.org/10.5709/acp-0209-2Google Scholar
Wahn, B., Murali, S., Sinnett, S., & König, P. (2017). Auditory stimulus detection partially depends on visuospatial attentional resources. i-Perception, 8, 117. http://doi.org/10.1177/2041669516688026Google Scholar
Wang, L., Yue, Z., & Chen, Q. (2012). Cross-modal nonspatial repetition inhibition. Attention, Perception, & Psychophysics, 74, 867–78.Google Scholar
Ward, L. M. (1994). Supramodal and modality-specific mechanisms for stimulus-driven shifts of auditory and visual attention. Canadian Journal of Experimental Psychology, 48, 242–59.Google Scholar
Ward, L. M., McDonald, J. A., & Golestani, N. (1998). Cross-modal control of attention shifts. In Wright, R. (Ed.), Visual attention (pp. 232–68). New York, NY: Oxford University Press.Google Scholar
Ward, L. M., McDonald, J. J., & Lin, D. (2000). On asymmetries in cross-modal spatial attention orienting. Perception & Psychophysics, 62, 1258–64.Google Scholar
Warm, J. S., Dember, W. N., & Parasuraman, R. (1991). Effects of olfactory stimulation on performance and stress in a visual sustained attention task. Journal of the Society of Cosmetic Chemists, 42, 199210.Google Scholar
Watt, R. J. (1991). Understanding vision. London, UK: Academic Press.Google Scholar
Weidler, B., & Abrams, R. A. (2014). Enhanced cognitive control near the hands. Psychonomic Bulletin & Review, 21, 462–9.Google Scholar
Welford, A. T. (1952). The ‘psychological refractory period’ and the timing of high-speed performance – A review and a theory. British Journal of Psychology, 43, 219.Google Scholar
Wickens, C. D. (1984). Processing resources in attention. In Parasuraman, R. & Davies, D. R. (Eds.), Varieties of attention (pp. 63–102). San Diego, CA: Academic Press.Google Scholar
Wickens, C. D. (1992). Engineering psychology and human performance (2nd ed.). New York, NY: HarperCollins.Google Scholar
Wickens, C. D. (2002). Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3, 159–77.Google Scholar
Witt, J. K., Proffitt, D. R., & Epstein, W. (2005). Tool use affects perceived distance, but only when you intend to use it. Journal of Experimental Psychology: Human Perception and Performance, 31, 880–8.Google Scholar
Wogalter, M. S., Kalsher, M. J., & Racicot, B. M. (1993). Behavioral compliance with warnings: Effects of voice, context, and location. Safety Science, 16, 637–54.Google Scholar
Wolfe, J. M., Horowitz, T. S., & Kenner, N. M. (2005). Rare items often missed in visual searches. Nature, 435, 439–40. http://doi.org/10.1038/435439aGoogle Scholar
Woodrow, H. (1914). The measurement of attention. The Psychological Monographs, 17, i158. http://doi.org/10.1037/h0093087Google Scholar
Wozny, D. R., Beierholm, U. R., & Shams, L. (2008). Human trimodal perception follows optimal statistical inference. Journal of Vision, 8, 24, 111.Google Scholar
Wu, C.-C., Wick, F. A., & Pomplun, M. (2014). Guidance of visual attention by semantic information in real-world scenes. Frontiers in Psychology, 5, 54. http://doi.org/10.3389/fpsyg.2014.00054Google Scholar
Wu, J., Li, Q., Bai, O., & Touge, T. (2009). Multisensory interactions elicited by audiovisual stimuli presented peripherally in a visual attention task: A behavioral and event-related potential study in humans. Journal of Clinical Neurophysiology, 26, 407–13.Google Scholar
Yamamoto, S., & Kitazawa, S. (2001). Reversal of subjective temporal order due to arm crossing. Nature Neuroscience, 4, 759–65.Google Scholar
Yue, Z., Bischof, G.-N., Zhou, X., Spence, C., & Röder, B. (2009). Spatial attention affects the processing of tactile and visual stimuli presented at the tip of a tool: An event-related potential study. Experimental Brain Research, 193, 119–28.Google Scholar
Yue, Z., Jiang, Y., Li, Y., Wang, P., & Chen, Q. (2015). Enhanced visual dominance in far space. Experimental Brain Research, 233, 2833–43.Google Scholar
Zampini, M., Guest, S., Shore, D. I., & Spence, C. (2005). Audiovisual simultaneity judgments. Perception & Psychophysics, 67, 531–44.Google Scholar
Zampini, M., Torresan, D., Spence, C., & Murray, M. M. (2007). Audiotactile multisensory interactions in front and rear space. Neuropsychologia, 45, 186977.Google Scholar
Zangenehpour, S., & Zatorre, R. J. (2010). Cross-modal recruitment of primary visual cortex following brief exposure to bimodal audiovisual stimuli. Neuropsychologia, 48, 591600.Google Scholar
Zimmer, U., & Macaluso, E. (2007). Processing of multisensory spatial congruence can be dissociated from working memory and visuo-spatial attention. European Journal of Neuroscience, 26, 1681–91.Google Scholar
Zmigrod, S., Spapé, M., & Hommel, B. (2009). Intermodal event files: Integrating features across vision, audition, taction, and action. Psychological Research, 73, 674–84.Google Scholar
Zou, H., Müller, H. J., & Shi, Z. (2012). Non-spatial sounds regulate eye movements and enhance visual search. Journal of Vision, 12, 118.Google Scholar