Downloads: 198
Files in This Item:
File | Description | Size | Format |
---|---|---|---|---
transfun.E101.A.1092.pdf | | 9.48 MB | Adobe PDF | View/Open
Title: | Efficient Mini-batch Training on Memristor Neural Network Integrating Gradient Calculation and Weight Update
Author(s): | YAMAMORI, Satoshi; HIROMOTO, Masayuki; SATO, Takashi (https://orcid.org/0000-0002-1577-8259)
Alternative names: | 廣本, 正之; 佐藤, 高史
Keywords: | memristor; neural network; mini-batch training; stochastic gradient descent
Issue Date: | 1-Jul-2018
Publisher: | The Institute of Electronics, Information and Communication Engineers (IEICE)
Journal title: | IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Volume: | E101-A
Issue: | 7
Start page: | 1092
End page: | 1100
Abstract: | We propose an efficient training method for memristor neural networks. The proposed method is suitable for mini-batch training, a common technique for training various neural networks. By integrating two processes, gradient calculation in the backpropagation algorithm and weight update in the write operation to the memristors, the proposed method accelerates the training process and also eliminates the external computing resources, such as multipliers and memories, required by the existing method. Through numerical experiments, we demonstrated that the proposed method converges twice as fast as the existing method, while retaining the same level of accuracy for the classification results. (An illustrative sketch of this training scheme is given after this record.)
Rights: | © 2018 The Institute of Electronics, Information and Communication Engineers. Deposited in accordance with the publisher's permission conditions.
URI: | http://hdl.handle.net/2433/242229 |
DOI (publisher version): | 10.1587/transfun.E101.A.1092
Appears in Collections: | Journal Articles
All items in this repository are protected by copyright.
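
The abstract describes integrating gradient calculation with the memristor write operation, so that per-sample gradient contributions are applied directly as incremental conductance updates instead of being accumulated in external multipliers and memories. The following is a minimal, hedged sketch of that idea in software, not the authors' implementation: the class `MemristorCrossbar`, the method `incremental_write`, the toy regression task, and all hyperparameters are illustrative assumptions, and the analog write physics of a real device is not modeled.

```python
# Illustrative sketch only (not the paper's method): per-sample gradient
# contributions are written to the "crossbar" immediately, so the mini-batch
# update never needs external accumulation in multipliers or memories.
import numpy as np

rng = np.random.default_rng(0)

class MemristorCrossbar:
    """Toy stand-in for a crossbar storing one weight matrix as conductances."""
    def __init__(self, n_in, n_out):
        self.w = rng.normal(scale=0.1, size=(n_in, n_out))

    def forward(self, x):
        # Analog vector-matrix multiply performed by the crossbar in hardware.
        return x @ self.w

    def incremental_write(self, x, err, lr):
        # Integrated gradient calculation + weight update: the outer product
        # x * err^T (the gradient of 0.5*||xW - y||^2 w.r.t. W) is applied
        # directly as a small conductance change, one write per sample.
        self.w -= lr * np.outer(x, err)

def train_minibatch(layer, X, Y, lr=0.05, epochs=200, batch=8):
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch):
            # Samples are grouped into mini-batches, but each sample's
            # gradient is written incrementally instead of being summed
            # externally and written once per batch.
            for i in idx[start:start + batch]:
                y_hat = layer.forward(X[i])
                err = y_hat - Y[i]
                layer.incremental_write(X[i], err, lr)

# Tiny synthetic linear-regression task just to show the sketch runs.
X = rng.normal(size=(64, 4))
true_w = rng.normal(size=(4, 2))
Y = X @ true_w
layer = MemristorCrossbar(4, 2)
train_minibatch(layer, X, Y)
print("max weight error:", np.abs(layer.w - true_w).max())
```

In this toy form the scheme reduces to per-sample stochastic updates within each mini-batch; the point of the sketch is only that no external gradient buffer is needed between the gradient computation and the write.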