
Hugh Durrant-Whyte and Thomas C. Henderson, "Multisensor Data Fusion," in Springer Handbook of Robotics (Bruno Siciliano and Oussama Khatib, eds.), Springer, 2008, chapter 26, pp. 585-610, doi:10.1007/978-3-540-30301-5_26.
@incollection{durrant-whyte-and-henderson-2008,
    abstract = {Multisensor data fusion is the process of combining observations from a number of different sensors to provide a robust and complete description of an environment or process of interest. Data fusion finds wide application in many areas of robotics such as object recognition, environment mapping, and localization. This chapter has three parts: methods, architectures, and applications. Most current data fusion methods employ probabilistic descriptions of observations and processes and use Bayes' rule to combine this information. This chapter surveys the main probabilistic modeling and fusion techniques including grid-based models, Kalman filtering, and sequential Monte Carlo techniques. This chapter also briefly reviews a number of nonprobabilistic data fusion methods. Data fusion systems are often complex combinations of sensor devices, processing, and fusion algorithms. This chapter provides an overview of key principles in data fusion architectures from both a hardware and algorithmic viewpoint. The applications of data fusion are pervasive in robotics and underlie the core problems of sensing, estimation, and perception. We highlight two example applications that bring out these features. The first describes a navigation or self-tracking application for an autonomous vehicle. The second describes an application in mapping and environment modeling. The essential algorithmic tools of data fusion are reasonably well established. However, the development and use of these tools in realistic robotics applications is still developing.},
    address = {Berlin, Heidelberg},
    author = {Durrant-Whyte, Hugh and Henderson, Thomas C.},
    booktitle = {Springer Handbook of Robotics},
    chapter = {26},
    doi = {10.1007/978-3-540-30301-5_26},
    editor = {Siciliano, Bruno and Khatib, Oussama},
    isbn = {978-3-540-23957-4},
    keywords = {multisensory-integration, robotics},
    pages = {585--610},
    publisher = {Springer Berlin Heidelberg},
    title = {Multisensor Data Fusion},
    url = {http://dx.doi.org/10.1007/978-3-540-30301-5_26},
    year = {2008}
}

See the CiteULike entry for more information, PDF links, BibTeX, etc.

The Kalman filter is a standard method for many (robotic) multisensor integration problems in dynamic domains: it recursively fuses sequential observations with a linear-Gaussian model of the process, producing a state estimate and an explicit measure of its uncertainty.
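As an illustration only (not code from the chapter), the scalar measurement-update step of a Kalman filter can be used to fuse two noisy sensor readings of the same quantity; the sensor values and noise variances below are invented:

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.
    x, P: prior mean and variance; z, R: observation and its noise variance."""
    K = P / (P + R)          # Kalman gain: how much to trust the observation
    x_new = x + K * (z - x)  # corrected mean
    P_new = (1 - K) * P      # corrected (reduced) variance
    return x_new, P_new

# Fuse two sensor readings of the same quantity, starting from a diffuse prior.
x, P = 0.0, 1e6
x, P = kalman_update(x, P, 10.2, 0.5)   # hypothetical sensor A reading
x, P = kalman_update(x, P, 9.8, 0.5)    # hypothetical sensor B reading
# The fused estimate lies between the readings, with variance lower than
# either sensor alone: fusion tightens the uncertainty.
```

Stacking such updates over time, interleaved with a prediction step for the process model, gives the full filter the chapter describes.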

At the most general level, multisensory integration (or multisensor data fusion) in applied settings is best described in terms of Bayesian theory: the main practical techniques, including Kalman filtering, grid-based models, and sequential Monte Carlo methods, are specializations of, or approximations to, Bayes' rule.
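To make the Bayesian view concrete, here is a small sketch (my own example, with made-up priors and likelihoods) of Bayes' rule combining two independent sensors' evidence over a discrete set of hypotheses, in the spirit of the grid-based models the chapter surveys:

```python
import numpy as np

# Prior belief over three discrete hypotheses (e.g. candidate object classes).
prior = np.array([0.5, 0.3, 0.2])

# Likelihoods P(z | hypothesis) for each sensor's observation (invented values).
lik_a = np.array([0.1, 0.7, 0.2])   # sensor A's evidence favors hypothesis 1
lik_b = np.array([0.2, 0.6, 0.2])   # sensor B independently agrees

# Bayes' rule with conditionally independent sensors:
# posterior ∝ prior × P(z_a | h) × P(z_b | h)
post = prior * lik_a * lik_b
post /= post.sum()                  # normalize to a probability distribution
```

The posterior concentrates on the hypothesis both sensors support, even though the prior favored a different one; this product-and-normalize step is the common core that the chapter's probabilistic fusion methods specialize.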