Files in This Item:
j.patrec.2010.05.002.pdf (433.16 kB, Adobe PDF)
Title: Inter-modality mapping in robot with recurrent neural network
Authors: Ogata, Tetsuya
Nishide, Shun
Kozima, Hideki
Komatani, Kazunori
Okuno, Hiroshi G.
Author's alias: 尾形, 哲也
Keywords: Dynamical systems
Inter-modal mapping
Recurrent neural network with parametric bias
Issue Date: 1-Sep-2010
Publisher: Elsevier B.V.
Journal title: Pattern Recognition Letters
Volume: 31
Issue: 12
Start page: 1560
End page: 1569
Abstract: A system for mapping between different sensory modalities was developed for a robot, enabling it to generate motions that express auditory signals and to generate sounds corresponding to observed object movements. A recurrent neural network with parametric bias (RNNPB), which has good generalization ability, is used as the learning model; because the correspondences between auditory and visual signals are too numerous to memorize, the ability to generalize is indispensable. The system was implemented in the “Keepon” robot, which was shown horizontal reciprocating or rotating motions of a manipulated box accompanied by friction sounds, and falling or overturning motions accompanied by collision sounds. Keepon behaved appropriately not only for learned events but also for unknown ones, and generated various sounds in accordance with observed motions.
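The abstract's core mechanism is the RNNPB: a recurrent network whose behavior is modulated by a small parametric-bias (PB) vector, and where a new sequence can be "recognized" by holding the trained weights fixed and optimizing only the PB to fit the observation. The following is a minimal NumPy sketch of that recognition step, not the paper's implementation: the layer sizes, random weights, and the numerical-gradient search over the PB are all illustrative assumptions (the original work trains weights and PB jointly with backpropagation through time).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): input, hidden, PB, output.
IN, HID, PB, OUT = 2, 8, 2, 2

# Random, untrained weights for the sketch.
W_in = rng.normal(0, 0.5, (HID, IN))
W_pb = rng.normal(0, 0.5, (HID, PB))
W_rec = rng.normal(0, 0.5, (HID, HID))
W_out = rng.normal(0, 0.5, (OUT, HID))

def forward(seq, pb):
    """Run the RNN over seq (T x IN); the same PB vector biases every step."""
    h = np.zeros(HID)
    preds = []
    for x in seq:
        h = np.tanh(W_in @ x + W_rec @ h + W_pb @ pb)
        preds.append(W_out @ h)
    return np.array(preds)

def loss(seq, pb):
    """One-step-ahead prediction error over the sequence."""
    preds = forward(seq[:-1], pb)
    return np.mean((preds - seq[1:]) ** 2)

def recognize(seq, steps=100, lr=0.05, eps=1e-4):
    """Infer a PB vector for seq with weights fixed (numerical gradient)."""
    pb = np.zeros(PB)
    best_pb, best_loss = pb.copy(), loss(seq, pb)
    for _ in range(steps):
        base = loss(seq, pb)
        grad = np.array(
            [(loss(seq, pb + eps * np.eye(PB)[i]) - base) / eps for i in range(PB)]
        )
        pb = pb - lr * grad
        cur = loss(seq, pb)
        if cur < best_loss:
            best_pb, best_loss = pb.copy(), cur
    return best_pb
```

Because recognition touches only the low-dimensional PB, the same trained weights can map an unseen sound sequence onto a motion-generating dynamics, which is the generalization property the abstract emphasizes.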
Rights: © 2010 Elsevier B.V.
This is not the published version. Please cite only the published version.
DOI(Published Version): 10.1016/j.patrec.2010.05.002
Appears in Collections: Journal Articles

