Towards Emerging Multimodal Cognitive Representations from Neural Self-Organization
In: Towards Intelligent Social Robots: Current Advances in Cognitive Robotics, Proceedings of the Full-Day Workshop held in conjunction with Humanoids 2015, South Korea
by
German Parisi, Cornelius Weber (+1 others)
Workshop editors: Amir Aly, Sascha Griffiths, Francesca Stramandinoli
2015
Abstract
The integration of multisensory information plays a crucial role in autonomous robotics. In this work, we investigate how robust multimodal representations can naturally develop in a self-organized manner from co-occurring multisensory inputs. We propose a hierarchical learning architecture with growing self-organizing neural networks for learning human actions from audiovisual inputs. Associative links between unimodal representations are incrementally learned by a semi-supervised algorithm with bidirectional connectivity that takes into account the inherent spatiotemporal dynamics of the input. Experiments on a dataset of 10 full-body actions show that our architecture is able to learn action-word mappings without the need to segment training samples for ground-truth labelling. Instead, multimodal representations of actions are obtained through the co-activation of action features from video sequences and labels from automatic speech recognition. Promising experimental results encourage the extension of our architecture in several directions.
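To illustrate the cross-modal binding described in the abstract, the Python sketch below links two unimodal maps (visual action features and speech labels) through an associative matrix that is strengthened whenever units in both maps are co-activated by co-occurring inputs. This is a minimal sketch under strong assumptions, not the authors' implementation: the map sizes, feature dimensionalities, learning rate, and helper names (co_activate, label_for_action, action_for_label) are illustrative, and the growing self-organizing networks of the paper are replaced here by fixed random prototypes for brevity.

import numpy as np

rng = np.random.default_rng(0)

N_VISUAL, N_AUDIO = 50, 10      # units per unimodal map (illustrative sizes)
DIM_VISUAL, DIM_AUDIO = 30, 8   # feature dimensionality per modality (illustrative)

# Unimodal prototypes; in the described architecture these would be units of
# growing self-organizing networks trained on video features and ASR labels.
visual_units = rng.normal(size=(N_VISUAL, DIM_VISUAL))
audio_units = rng.normal(size=(N_AUDIO, DIM_AUDIO))

# Bidirectional associative weights between the two unimodal maps.
assoc = np.zeros((N_VISUAL, N_AUDIO))

def best_matching_unit(units, x):
    """Return the index of the prototype closest to input x (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(units - x, axis=1)))

def co_activate(visual_input, audio_input, lr=0.1):
    """Hebbian-like update: strengthen the link between co-active unimodal units."""
    v = best_matching_unit(visual_units, visual_input)
    a = best_matching_unit(audio_units, audio_input)
    assoc[v, a] += lr
    return v, a

def label_for_action(visual_input):
    """Retrieve the audio (word) unit most strongly associated with a visual input."""
    v = best_matching_unit(visual_units, visual_input)
    return int(np.argmax(assoc[v]))

def action_for_label(audio_input):
    """Bidirectional readout: visual unit most strongly associated with an audio input."""
    a = best_matching_unit(audio_units, audio_input)
    return int(np.argmax(assoc[:, a]))

# Toy usage: 10 actions, each with a visual and an audio prototype; noisy
# co-occurring samples of both modalities are presented without explicit labels.
action_visual = rng.normal(size=(N_AUDIO, DIM_VISUAL))
action_audio = rng.normal(size=(N_AUDIO, DIM_AUDIO))
for _ in range(500):
    k = rng.integers(N_AUDIO)
    co_activate(action_visual[k] + 0.05 * rng.normal(size=DIM_VISUAL),
                action_audio[k] + 0.05 * rng.normal(size=DIM_AUDIO))

# Query: which word unit is associated with a new sample of action 3?
query = action_visual[3] + 0.05 * rng.normal(size=DIM_VISUAL)
print("associated word unit:", label_for_action(query))

Querying the association matrix in either direction mirrors the bidirectional connectivity mentioned in the abstract: an action representation can retrieve its word label, and a recognized word can in turn retrieve the corresponding action prototype.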