@incollection{kleesiek-et-al-2013,
    abstract = {We present active object categorization experiments with a real humanoid robot. For this purpose, the training algorithm of a recurrent neural network with parametric bias has been extended with adaptive learning rates. This modification leads to an increase in training speed. Using this new training algorithm we conducted three experiments aiming at object categorization. While holding different objects in its hand, the robot executes a motor sequence that induces multi-modal sensory changes. During learning, these high-dimensional perceptions are 'engraved' in the network. Simultaneously, low-dimensional {PB} values emerge unsupervised. The geometrical relation of these {PB} vectors can then be exploited to infer relations between the original high dimensional time series characterizing different objects. Even sensations belonging to unknown objects can be discriminated from known (learned) ones and kept apart from each other reliably. Additionally, we show that the network tolerates noisy sensory signals very well.},
    author = {Kleesiek, Jens and Badde, Stephanie and Wermter, Stefan and Engel, Andreas K.},
    booktitle = {Agents and Artificial Intelligence},
    doi = {10.1007/978-3-642-36907-0_6},
    editor = {Filipe, Joaquim and Fred, Ana},
    keywords = {active-perception, embodiment, neurorobotics, robotics},
    pages = {83--99},
    publisher = {Springer Berlin Heidelberg},
    series = {Communications in Computer and Information Science},
    title = {{Action-Driven} Perception for a Humanoid},
    url = {http://dx.doi.org/10.1007/978-3-642-36907-0_6},
    volume = {358},
    year = {2013}
}