
TAG: A Neural Network Model for Large-Scale Optical Implementation

Abstract

TAG (Training by Adaptive Gain) is a new adaptive learning algorithm developed for optical implementation of large-scale artificial neural networks. For fully interconnected single-layer neural networks with N input and M output neurons, TAG contains two different types of interconnections, i.e., M × N global fixed interconnections and N + M adaptive gain controls. For two-dimensional input patterns, the former may be realized by multifacet holograms and the latter by spatial light modulators (SLMs). For the same number of input and output neurons, TAG requires far fewer adaptive elements than the perceptron, making large-scale optical implementation possible at some sacrifice in performance. The training algorithm is based on gradient descent and error backpropagation, and is easily extensible to multilayer architectures. Computer simulations demonstrate reasonable performance of TAG compared to the perceptron. An electrooptical implementation of TAG is also proposed.
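
The abstract does not give the exact network equations, but the structure it describes (a fixed M × N interconnection matrix with only N + M trainable gains, updated by gradient descent on the output error) can be sketched as follows. This is a minimal illustration, assuming an output of the form y = sigmoid(g_out ⊙ (W (g_in ⊙ x))) and a squared-error loss; the class name `TagSketch` and all parameter choices are illustrative, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TagSketch:
    """Single-layer network with fixed interconnections and adaptive gains."""

    def __init__(self, n_inputs, m_outputs, rng=None):
        rng = rng if rng is not None else np.random.default_rng(0)
        # M x N global interconnections, fixed after initialization
        # (in the optical system: a multifacet hologram).
        self.W = rng.standard_normal((m_outputs, n_inputs))
        # N + M adaptive gains (in the optical system: SLM transmittances).
        self.g_in = np.ones(n_inputs)
        self.g_out = np.ones(m_outputs)

    def forward(self, x):
        self.x = x
        self.s = self.W @ (self.g_in * x)       # fixed mixing of gained inputs
        self.y = sigmoid(self.g_out * self.s)   # output gains + nonlinearity
        return self.y

    def train_step(self, x, target, lr=0.1):
        """One gradient-descent update of the N + M gains; W stays fixed."""
        y = self.forward(x)
        err = y - target
        delta = err * y * (1.0 - y)              # squared-error loss, sigmoid derivative
        grad_g_out = delta * self.s
        grad_g_in = (self.W.T @ (delta * self.g_out)) * x
        self.g_out -= lr * grad_g_out
        self.g_in -= lr * grad_g_in
        return 0.5 * float(err @ err)

# Example usage: fit a small binary target with 16 inputs and 4 outputs.
net = TagSketch(n_inputs=16, m_outputs=4)
rng = np.random.default_rng(1)
x = rng.random(16)
t = np.array([1.0, 0.0, 1.0, 0.0])
for _ in range(200):
    loss = net.train_step(x, t)
```

Only the N + M gain vectors are updated, which is the point of the scheme: the adaptive element count grows linearly with the number of neurons rather than with the M × N interconnection count.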

Bibliographic information

  • Source
    Neural Computation, 1991, No. 1, pp. 135-143 (9 pages)
  • Authors

    Lee H; Lee S; Shin S; Koh B;

  • Affiliation

    Department of Electrical Engineering, Korea Advanced Institute of Science and Technology, P.O. Box 150 Chongryangni, Seoul, Korea;

  • Indexed in: Science Citation Index (SCI); Chemical Abstracts (CA)
  • Format: PDF
  • Language: English
