Downloads: 496
Files in this item:
File | Description | Size | Format | |
---|---|---|---|---|
j.patrec.2010.05.002.pdf | | 433.16 kB | Adobe PDF | View/Open |
Title: | Inter-modality mapping in robot with recurrent neural network |
Authors: | Ogata, Tetsuya; Nishide, Shun; Kozima, Hideki; Komatani, Kazunori; Okuno, Hiroshi G. |
Alternative author name: | 尾形, 哲也 |
Keywords: | Dynamical systems; Inter-modal mapping; Recurrent neural network with parametric bias; Generalization |
Issue date: | 1-Sep-2010 |
Publisher: | Elsevier B.V. |
Journal title: | Pattern Recognition Letters |
Volume: | 31 |
Issue: | 12 |
Start page: | 1560 |
End page: | 1569 |
Abstract: | A system for mapping between different sensory modalities was developed for a robot, enabling it to generate motions that express auditory signals and to generate sounds corresponding to object movement. A recurrent neural network model with parametric bias, which has good generalization ability, is used as the learning model. Since the correspondences between auditory and visual signals are too numerous to memorize, the ability to generalize is indispensable. The system was implemented on the "Keepon" robot, which was shown horizontal reciprocating or rotating motions accompanied by friction sounds, and falling or overturning motions accompanied by collision sounds, produced by manipulating a box object. Keepon behaved appropriately not only for learned events but also for unknown ones, and generated various sounds in accordance with the observed motions. |
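The abstract names a recurrent neural network with parametric bias (RNNPB) as the learning model: a fixed low-dimensional PB vector modulates a recurrent layer so that a single network can encode several sequence mappings, one per PB setting. The following minimal sketch illustrates only that idea; all sizes, names, and random weights are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes (assumptions, not from the paper):
IN, HID, OUT, PB = 3, 8, 3, 2   # input, hidden, output, parametric-bias dims

# Randomly initialised weights, for demonstration only.
W_in = rng.normal(0.0, 0.5, (HID, IN))
W_rec = rng.normal(0.0, 0.5, (HID, HID))
W_pb = rng.normal(0.0, 0.5, (HID, PB))
W_out = rng.normal(0.0, 0.5, (OUT, HID))

def step(x, h, pb):
    """One forward step: the hidden state depends on the input,
    the previous hidden state, and the fixed PB vector."""
    h_new = np.tanh(W_in @ x + W_rec @ h + W_pb @ pb)
    y = np.tanh(W_out @ h_new)
    return y, h_new

def rollout(xs, pb):
    """Run a whole sequence with one constant PB vector; changing
    the PB changes the generated trajectory without retraining."""
    h = np.zeros(HID)
    ys = []
    for x in xs:
        y, h = step(x, h, pb)
        ys.append(y)
    return np.array(ys)

xs = rng.normal(0.0, 1.0, (5, IN))          # a short dummy input sequence
ys_a = rollout(xs, np.array([1.0, -1.0]))   # one "behaviour" PB setting
ys_b = rollout(xs, np.array([-1.0, 1.0]))   # another PB gives a different output
```

In the paper's setting the PB values are found by training, so that recognizing a sound sequence yields a PB vector from which the corresponding motion can be generated; the sketch above shows only the forward dynamics.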
Rights: | © 2010 Elsevier B.V. This is not the published version. Please cite only the published version. |
URI: | http://hdl.handle.net/2433/128984 |
DOI (publisher's version): | 10.1016/j.patrec.2010.05.002 |
Appears in collections: | Journal Articles |
All items in this repository are protected by copyright.