Conference on Substance Identification Analytics

Focusing attention in hierarchical neural networks



Abstract

This paper presents a new model for focusing attention in hierarchically structured neural networks. Emphasis is placed on determining the location of the focus of attention. The main idea is that attention is closely coupled with predictions about the environment: whenever there is a mismatch between prediction and reality, a shift of attention is performed. The mismatch can also be used to change (learn) the prediction and processing mechanism, so that the prediction improves next time; in this sense, attention and learning are closely coupled. We present a first application of this mechanism to the classification of satellite-image (Landsat TM) data. Use of the attentional mechanism reduces processing time by 50% while maintaining classification accuracy.
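The mismatch-driven coupling of attention and learning described in the abstract can be sketched minimally. The following is an illustrative reading under assumptions, not the paper's implementation: observations arrive sequentially, a running prediction stands in for the prediction mechanism, and "attending" stands in for expensive full processing that is skipped while the prediction holds.

```python
# Illustrative sketch (assumed names, not the paper's code): attention shifts
# only when the current prediction mismatches the incoming observation, and the
# same mismatch event updates (learns) the prediction.

def attend_on_mismatch(observations):
    """Scan observations; return the indices where attention was focused."""
    prediction = None
    attended = []                      # positions given full (expensive) processing
    for i, obs in enumerate(observations):
        if prediction is None or obs != prediction:
            attended.append(i)         # mismatch -> shift of attention
            prediction = obs           # learning: update the prediction mechanism
        # otherwise the prediction held, so the cheap path suffices
    return attended
```

On a run of mostly repeated observations, only the change points draw attention, which is the source of the processing-time savings the abstract reports.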
