Pattern Recognition Letters
Combination of supervised and unsupervised learning for training the activation functions of neural networks

Abstract

Standard feedforward neural networks benefit from the nice theoretical properties of mixtures of sigmoid activation functions, but they may fail in several practical learning tasks. These tasks would be better addressed by relying on a more appropriate, problem-specific basis of activation functions. The paper presents a connectionist model which exploits adaptive activation functions. Each hidden unit in the network is associated with a specific pair (f(·), p(·)), where f(·) is the activation function and p(·) is the likelihood of the unit being relevant to the computation of the network output over the current input. The function f(·) is optimized in a supervised manner, while p(·) is realized via a statistical parametric model learned through unsupervised (or, partially supervised) estimation. Since f(·) and p(·) influence each other's learning process, the overall machine is implicitly a co-trained coupled model and, in turn, a flexible, non-standard neural architecture. Feasibility of the approach is corroborated by empirical evidence yielded by computer simulations involving regression and classification tasks.
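The coupling described in the abstract can be illustrated with a minimal sketch: each hidden unit carries a pair (f, p), where f is an adaptive tanh activation (with a trainable slope) updated by supervised gradient descent, and p is a per-unit Gaussian relevance model refit by unsupervised EM-style updates on the inputs. This is our own illustration under assumed parameterizations, not the paper's exact algorithm; all names (`relevance`, `train_step`, the slope parameters `a`) are hypothetical.

```python
import numpy as np

# Hypothetical sketch: network output mixes units by relevance,
#   y(x) = sum_i c_i * p_i(x) * tanh(a_i * (w_i . x + b_i))
# where p_i is a Gaussian relevance model over the inputs.
rng = np.random.default_rng(0)
H, D = 6, 1                                  # hidden units, input dimension
W = rng.normal(scale=0.5, size=(H, D))
b = np.zeros(H)
a = np.ones(H)                               # adaptive activation slopes
c = rng.normal(scale=0.5, size=H)
mu = rng.normal(size=(H, D))                 # Gaussian relevance means
var = np.ones(H)                             # isotropic variances

def relevance(X):
    """Normalized relevance p_i(x) from the per-unit Gaussians."""
    d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)     # (N, H)
    p = np.exp(-0.5 * d2 / var) / np.sqrt(2 * np.pi * var) ** D
    return p / p.sum(axis=1, keepdims=True)

def forward(X):
    z = X @ W.T + b                          # (N, H) pre-activations
    f = np.tanh(a * z)                       # adaptive activations
    p = relevance(X)
    return p, f, z, (p * f) @ c              # network output, shape (N,)

def train_step(X, y, lr=0.05):
    """One coupled update: supervised on (W, b, a, c), unsupervised on (mu, var)."""
    global W, b, a, c, mu, var
    p, f, z, yhat = forward(X)
    err = (yhat - y)[:, None]                # (N, 1)
    g = err * c * p * (1 - f ** 2)           # shared gradient factor (N, H)
    # Supervised step: gradient descent on mean squared error.
    c -= lr * (err[:, 0:1].T @ (p * f)).ravel() / len(X)
    a -= lr * (g * z).mean(0)
    b -= lr * (g * a).mean(0)
    W -= lr * (g * a).T @ X / len(X)
    # Unsupervised step: EM-style refit of the relevance Gaussians.
    r = p / p.sum(0, keepdims=True)          # per-unit responsibilities
    mu = r.T @ X
    var = (r * ((X[:, None, :] - mu[None]) ** 2).sum(-1)).sum(0) / D + 1e-3
    return 0.5 * ((yhat - y) ** 2).mean()

# Toy regression task, in the spirit of the paper's simulations.
X = rng.uniform(-2, 2, size=(200, D))
y = np.sin(X[:, 0])
losses = [train_step(X, y) for _ in range(300)]
print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Because p(·) depends only on (mu, var) and f(·)'s parameters are trained with p held fixed, the two estimation procedures influence each other indirectly through the shared forward pass, which is the co-training effect the abstract describes.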
