Reference: "Unifying multisensory signals across time and space"

Unifying multisensory signals across time and space Experimental Brain Research, Vol. 158, No. 2. (September 2004), pp. 252-258, doi:10.1007/s00221-004-1899-9 by Mark T. Wallace, Gwendolyn E. Roberson, W. David Hairston, et al.
@article{wallace2004unifying,
    abstract = {The brain integrates information from multiple sensory modalities and, through this process, generates a coherent and apparently seamless percept of the external world. Although multisensory integration typically binds information that is derived from the same event, when multisensory cues are somewhat discordant they can result in illusory percepts such as the "ventriloquism effect." These biases in stimulus localization are generally accompanied by the perceptual unification of the two stimuli. In the current study, we sought to further elucidate the relationship between localization biases, perceptual unification and measures of a participant's uncertainty in target localization (i.e., variability). Participants performed an auditory localization task in which they were also asked to report on whether they perceived the auditory and visual stimuli to be perceptually unified. The auditory and visual stimuli were delivered at a variety of spatial (0 degrees, 5 degrees, 10 degrees, 15 degrees) and temporal (200, 500, 800 ms) disparities. Localization bias and reports of perceptual unity occurred even with substantial spatial (i.e., 15 degrees) and temporal (i.e., 800 ms) disparities. Trial-by-trial comparison of these measures revealed a striking correlation: regardless of their disparity, whenever the auditory and visual stimuli were perceived as unified, they were localized at or very near the light. In contrast, when the stimuli were perceived as not unified, auditory localization was often biased away from the visual stimulus. Furthermore, localization variability was significantly less when the stimuli were perceived as unified. Intriguingly, on non-unity trials such variability increased with decreasing disparity. Together, these results suggest strong and potentially mechanistic links between the multiple facets of multisensory integration that contribute to our perceptual Gestalt.},
    address = {Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, NC 27157, USA.},
    author = {Wallace, Mark T. and Roberson, Gwendolyn E. and Hairston, W. David and Stein, Barry E. and Vaughan, J. William and Schirillo, James A.},
    doi = {10.1007/s00221-004-1899-9},
    issn = {0014-4819},
    journal = {Experimental Brain Research},
    keywords = {biology, multisensory-integration, ventriloquism-effect},
    month = sep,
    number = {2},
    pages = {252--258},
    pmid = {15112119},
    title = {Unifying multisensory signals across time and space},
    volume = {158},
    year = {2004}
}


Two stimuli in different modalities are perceived as one multisensory stimulus if the positions in space and points in time at which they are presented are not too far apart.

Semantic congruence can also influence multisensory integration.

With increasing distance between stimuli in different modalities, the likelihood of perceiving them as originating from one location decreases.

With increasing distance between stimuli in different modalities, the likelihood of perceiving them as one cross-modal stimulus decreases.

In other words, the unity assumption depends on the distance between stimuli.

In an audio-visual localization task, Wallace et al. found that their subjects' localization of the auditory stimulus was usually biased towards the visual stimulus whenever the two stimuli were perceived as unified, and biased away from it when they were not.

Details of instructions and quality of stimuli can influence the strength of the spatial ventriloquism effect.