Neil W. Roach, James Heron, and Paul V. McGraw (2006). Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration. Proceedings of the Royal Society B: Biological Sciences, 273(1598), 2159–2168. doi:10.1098/rspb.2006.3578
@article{roach-et-al-2006,
    abstract = {In order to maintain a coherent, unified percept of the external environment, the brain must continuously combine information encoded by our different sensory systems. Contemporary models suggest that multisensory integration produces a weighted average of sensory estimates, where the contribution of each system to the ultimate multisensory percept is governed by the relative reliability of the information it provides (maximum-likelihood estimation). In the present study, we investigate interactions between auditory and visual rate perception, where observers are required to make judgments in one modality while ignoring conflicting rate information presented in the other. We show a gradual transition between partial cue integration and complete cue segregation with increasing inter-modal discrepancy that is inconsistent with mandatory implementation of maximum-likelihood estimation. To explain these findings, we implement a simple Bayesian model of integration that is also able to predict observer performance with novel stimuli. The model assumes that the brain takes into account prior knowledge about the correspondence between auditory and visual rate signals, when determining the degree of integration to implement. This provides a strategy for balancing the benefits accrued by integrating sensory estimates arising from a common source, against the costs of conflating information relating to independent objects or events.},
    address = {Visual Neuroscience Group, School of Psychology, The University of Nottingham, Nottingham NG7 2RD, UK. nwr@psychology.nottingham.ac.uk},
    author = {Roach, Neil W. and Heron, James and McGraw, Paul V.},
    day = {7},
    doi = {10.1098/rspb.2006.3578},
    issn = {0962-8452},
    journal = {Proceedings of the Royal Society B: Biological Sciences},
    keywords = {causal-inference, multisensory-integration},
    month = sep,
    number = {1598},
    pages = {2159--2168},
    pmid = {16901835},
    title = {Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration.},
    url = {http://dx.doi.org/10.1098/rspb.2006.3578},
    volume = {273},
    year = {2006}
}


Usually, rate perception is influenced more strongly by auditory information than by visual information.

When the reliability of the auditory signal is reduced, visual information receives greater weight in rate perception.
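
This is the standard reliability-weighted (maximum-likelihood) combination rule described in the abstract. A minimal sketch in Python, assuming each unimodal rate estimate is Gaussian with known standard deviation; the function name and the example numbers are illustrative, not taken from the paper:

import numpy as np

def mle_combine(rate_a, sigma_a, rate_v, sigma_v):
    # Reliability = inverse variance of each unimodal estimate.
    r_a, r_v = 1.0 / sigma_a**2, 1.0 / sigma_v**2
    w_a = r_a / (r_a + r_v)                      # auditory weight
    rate = w_a * rate_a + (1.0 - w_a) * rate_v   # reliability-weighted average
    sigma = np.sqrt(1.0 / (r_a + r_v))           # fused estimate is more precise than either cue
    return rate, sigma

# Audition is the more reliable rate cue here, so it dominates:
print(mle_combine(rate_a=5.0, sigma_a=0.5, rate_v=7.0, sigma_v=2.0))
# Degrading the auditory signal (larger sigma_a) shifts weight toward vision:
print(mle_combine(rate_a=5.0, sigma_a=3.0, rate_v=7.0, sigma_v=2.0))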

Roach et al. present a Bayesian model of multisensory integration in which cues from different modalities are integrated only up to a certain degree of incongruence. The model incorporates a Gaussian prior on the discrepancy between the auditory and visual components of a cross-modal stimulus.

With increasing discrepancy between the stimuli in the two modalities, the probability that they are perceived as arising from a single source decreases, and integration gives way to segregation.
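
A grid-based sketch of such a coupling-prior model. Assumptions to flag: the prior here mixes a Gaussian ridge along equal audio-visual rates with a uniform component (the mixture is what produces the breakdown of integration at large conflicts; that it matches the paper's exact prior is my reading, not a quote), and all sigmas, mixture weights, and grid values are illustrative:

import numpy as np

def bayes_rate_estimate(obs_a, sigma_a, obs_v, sigma_v,
                        sigma_prior=1.0, p_common=0.7,
                        grid=np.linspace(0.0, 20.0, 401)):
    # Candidate true rates for each modality, laid out on a 2-D grid.
    ra, rv = np.meshgrid(grid, grid, indexing="ij")
    # Gaussian likelihood of each observation given the candidate rates.
    like = (np.exp(-0.5 * ((obs_a - ra) / sigma_a) ** 2)
            * np.exp(-0.5 * ((obs_v - rv) / sigma_v) ** 2))
    # Prior: Gaussian ridge along ra == rv (common source) mixed with a
    # uniform component (independent sources).
    ridge = np.exp(-0.5 * ((ra - rv) / sigma_prior) ** 2)
    prior = p_common * ridge + (1.0 - p_common)
    post = like * prior
    post /= post.sum()
    # Posterior-mean estimate of the auditory rate (the judged modality).
    return (post * ra).sum()

# Small conflict: the visual rate pulls the auditory estimate (integration).
print(bayes_rate_estimate(obs_a=5.0, sigma_a=1.0, obs_v=6.0, sigma_v=1.0))
# Large conflict: the pull largely vanishes (segregation).
print(bayes_rate_estimate(obs_a=5.0, sigma_a=1.0, obs_v=15.0, sigma_v=1.0))

With a small conflict the visual rate biases the auditory estimate toward it; with a large conflict the bias mostly disappears, matching the gradual transition from partial integration to segregation described above.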