International Joint Conference on Neural Networks

Unsupervised Pre-training on Improving the Performance of Neural Network in Regression



Abstract

This paper empirically analyses the effect of unsupervised pre-training on the predictive performance of an Artificial Neural Network (ANN). The pre-training used here is the same as that of a Deep Belief Network, in which the network is formed by successively stacking Restricted Boltzmann Machines one above the other. A set of experiments is performed to identify the scenarios in which a pre-trained ANN outperforms a randomly initialised one. The results show that the pre-trained model performs better in terms of generalisation error and the number of computational units required, and, most importantly, is more robust to changes in hyperparameters such as the learning rate and the model architecture. The only cost is the additional time spent in the pre-training phase. Further, the knowledge learned during pre-training, which is stored as the weights of the ANN, is analysed using Hinton diagrams. This analysis gives a clear picture of how pre-training captures some of the hidden characteristics of the data.
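The pipeline the abstract describes (greedy layer-wise RBM pre-training, then supervised fine-tuning for regression) can be sketched roughly as below. This is a minimal illustrative implementation, not the authors' code: it assumes Bernoulli RBMs trained with one-step contrastive divergence (CD-1), a toy regression target, and arbitrary layer sizes and hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""
    def __init__(self, n_vis, n_hid):
        self.W = rng.normal(0.0, 0.1, (n_vis, n_hid))
        self.b_vis = np.zeros(n_vis)
        self.b_hid = np.zeros(n_hid)

    def fit(self, v0, lr=0.05, epochs=50):
        n = len(v0)
        for _ in range(epochs):
            h0 = sigmoid(v0 @ self.W + self.b_hid)          # positive phase
            h_samp = (rng.random(h0.shape) < h0).astype(float)
            v1 = sigmoid(h_samp @ self.W.T + self.b_vis)    # one Gibbs step
            h1 = sigmoid(v1 @ self.W + self.b_hid)
            self.W += lr * (v0.T @ h0 - v1.T @ h1) / n      # <vh>_data - <vh>_model
            self.b_vis += lr * (v0 - v1).mean(axis=0)
            self.b_hid += lr * (h0 - h1).mean(axis=0)

    def transform(self, v):
        return sigmoid(v @ self.W + self.b_hid)

def forward(X, Ws, bs):
    """Forward pass: sigmoid hidden layers, linear output for regression."""
    acts = [X]
    for W, b in zip(Ws[:-1], bs[:-1]):
        acts.append(sigmoid(acts[-1] @ W + b))
    return acts, acts[-1] @ Ws[-1] + bs[-1]

# Toy regression task (illustrative): predict the sum of 8 inputs in [0, 1].
X = rng.random((200, 8))
y = X.sum(axis=1, keepdims=True)

# Greedy layer-wise pre-training: each RBM is trained on the hidden
# activations of the one below it, as in a Deep Belief Network stack.
sizes = [8, 16, 8]
rbms, layer_in = [], X
for n_vis, n_hid in zip(sizes[:-1], sizes[1:]):
    rbm = RBM(n_vis, n_hid)
    rbm.fit(layer_in)
    layer_in = rbm.transform(layer_in)
    rbms.append(rbm)

# Fine-tuning: initialise an MLP with the pre-trained weights, append a
# randomly initialised linear output layer, and run plain backprop (MSE loss).
Ws = [r.W.copy() for r in rbms] + [rng.normal(0.0, 0.1, (sizes[-1], 1))]
bs = [r.b_hid.copy() for r in rbms] + [np.zeros(1)]

_, pred = forward(X, Ws, bs)
mse0 = float(np.mean((pred - y) ** 2))    # error before fine-tuning

lr = 0.1
for _ in range(2000):
    acts, pred = forward(X, Ws, bs)
    delta = (pred - y) / len(X)           # gradient of MSE w.r.t. the output
    for i in range(len(Ws) - 1, -1, -1):
        gW, gb = acts[i].T @ delta, delta.sum(axis=0)
        if i > 0:                          # backprop through the sigmoid layer
            delta = (delta @ Ws[i].T) * acts[i] * (1.0 - acts[i])
        Ws[i] -= lr * gW
        bs[i] -= lr * gb

_, pred = forward(X, Ws, bs)
mse = float(np.mean((pred - y) ** 2))
print(f"MSE before fine-tuning: {mse0:.3f}, after: {mse:.3f}")
```

A randomly initialised baseline corresponds to skipping the RBM loop and drawing all of `Ws` from the same Gaussian; the paper's comparison is between these two initialisations under varying learning rates and architectures.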
