
Orthogonal functional basis neural network for functional approximation

Abstract

Subset selection is a well-known technique for generating an efficient and effective neural network structure, and it has been combined with regularization to improve a network's generalization performance. In this paper, we show an incongruity between subset selection and regularization, and we present an approach that resolves it by deriving the subset selection from a combination of functional bases. Faster training convergence is demonstrated with the new basis, which is obtained through an 'orthogonal-functional-basis' transformation. Using this transformation, we propose a new orthogonal functional basis neural network structure that is not only more computationally tractable but also yields better generalization performance. Simulation studies demonstrate the performance, behavior, and advantages of the proposed network.
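The abstract's central idea, orthogonalizing a set of basis functions before fitting them to data, can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes a simple polynomial basis, evaluates it on the training inputs, orthogonalizes the resulting design matrix with a QR decomposition, and fits the target by least squares in the orthogonal basis. All function names, the choice of basis, and the example data are illustrative assumptions.

# Minimal sketch of function approximation in an orthogonalized basis
# (illustrative only; not the method proposed in the paper).
import numpy as np

def orthogonal_basis_fit(x, y, degree=8):
    """Fit y ~ f(x) using an orthogonalized polynomial basis."""
    # Design matrix: columns are the original (non-orthogonal) basis functions 1, x, x^2, ...
    Phi = np.vander(x, N=degree + 1, increasing=True)
    # QR decomposition yields an orthonormal basis Q spanning the same column space.
    Q, R = np.linalg.qr(Phi)
    # In the orthonormal basis the least-squares coefficients are simply Q^T y.
    coeffs = Q.T @ y
    return Q, R, coeffs

def predict(x_new, R, coeffs, degree=8):
    """Evaluate the fitted approximation at new inputs."""
    Phi_new = np.vander(x_new, N=degree + 1, increasing=True)
    # Map the orthogonal-basis coefficients back to the original basis: solve R w = coeffs.
    w = np.linalg.solve(R, coeffs)
    return Phi_new @ w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 200)
    y = np.sin(3 * x) + 0.05 * rng.standard_normal(x.shape)
    Q, R, coeffs = orthogonal_basis_fit(x, y)
    y_hat = predict(x, R, coeffs)
    print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))

Because the columns of Q are orthonormal over the training samples, each coefficient can be computed independently, which is the kind of computational tractability an orthogonal functional basis is meant to provide.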
