International Conference on Artificial Neural Networks

Integration of Unsupervised and Supervised Criteria for Deep Neural Networks Training



Abstract

Training Deep Neural Networks has been a difficult task for a long time. Recently, diverse approaches have been presented to tackle these difficulties, showing that deep models improve on the performance of shallow ones in areas such as signal processing, signal classification and signal segmentation, regardless of the type of signal, e.g. video, audio or images. One of the most important methods is greedy layer-wise unsupervised pre-training followed by a fine-tuning phase. Despite the advantages of this procedure, it does not fit scenarios where real-time learning is needed, such as the adaptation of some time-series models. This paper proposes coupling both phases into one by modifying the loss function to mix the unsupervised and supervised parts together. Benchmark experiments with the MNIST database demonstrate the viability of the idea for simple image tasks, and experiments with time-series forecasting encourage the incorporation of this idea into on-line learning approaches. The interest of this method for time-series forecasting is motivated by the study of predictive models for domotic houses with intelligent control systems.
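The core idea in the abstract, replacing the separate pre-training and fine-tuning phases with a single objective that mixes an unsupervised and a supervised term, can be illustrated with a minimal sketch. The autoencoder-style reconstruction term, the network sizes and the weighting factor `beta` below are illustrative assumptions, not the exact formulation used in the paper.

```python
# Minimal sketch: one loss combining a supervised criterion (cross-entropy on
# class labels) with an unsupervised criterion (input reconstruction).
# Architecture, layer sizes and the beta weight are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointAutoencoderClassifier(nn.Module):
    def __init__(self, in_dim=784, hidden_dim=256, n_classes=10):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hidden_dim)        # shared hidden layer
        self.decoder = nn.Linear(hidden_dim, in_dim)        # unsupervised branch
        self.classifier = nn.Linear(hidden_dim, n_classes)  # supervised branch

    def forward(self, x):
        h = torch.sigmoid(self.encoder(x))
        return self.classifier(h), self.decoder(h)

def joint_loss(model, x, y, beta=0.1):
    """Single objective coupling both criteria: supervised cross-entropy
    plus a beta-weighted reconstruction error, optimised in one pass."""
    logits, x_hat = model(x)
    supervised = F.cross_entropy(logits, y)
    unsupervised = F.mse_loss(x_hat, x)
    return supervised + beta * unsupervised

# Usage: one SGD step on a mini-batch of flattened MNIST-like inputs.
model = JointAutoencoderClassifier()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
x = torch.rand(32, 784)
y = torch.randint(0, 10, (32,))
loss = joint_loss(model, x, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the whole objective is optimised in a single gradient step rather than in two separate phases, the same update can be applied sample by sample, which is what makes this kind of coupling attractive for the on-line time-series setting mentioned in the abstract.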
