WSEAS Transactions on Mathematics

A Functional Approximation Comparison between Neural Networks and Polynomial Regression


Abstract

Multi-layer perceptron (MLP) neural networks are well known as universal approximators and are often used as estimation tools in place of classical statistical methods. The focus of this study is to compare the approximation ability of the MLP with that of a traditional statistical regression model, namely polynomial regression. Single-hidden-layer MLPs, double-hidden-layer MLPs, and polynomial regression are compared on the basis of a similar number of weights or parameters. The performance of these three model classes is measured by the fraction of variance unexplained (FVU); the closer the FVU is to zero, the more accurate the estimate. From the empirical results obtained in this study, we conclude that, for a similar number of parameters, polynomial regression performs slightly better than the MLP overall, except on the complicated interaction function. Meanwhile, the double-hidden-layer MLP outperforms the single-hidden-layer MLP, and the MLP is more appropriate than polynomial regression for approximating the complicated interaction function.
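
For reference, the FVU is the conventional ratio of the residual to the total sum of squares (equivalently, one minus the coefficient of determination); assuming the study uses this standard definition:

\[
\mathrm{FVU} \;=\; \frac{\sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}{\sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2} \;=\; 1 - R^2
\]

where \(\hat{y}_i\) is the model's estimate of the i-th response and \(\bar{y}\) is the sample mean of the responses.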
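A minimal sketch of the comparison protocol described in the abstract, written with scikit-learn. The target function, hidden-layer sizes, polynomial degree, and sample size are illustrative assumptions chosen so that the three models carry a roughly similar number of parameters; they are not the paper's actual benchmark settings.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(0)

    # Illustrative two-variable interaction target (not the paper's benchmark).
    X = rng.uniform(0.0, 1.0, size=(500, 2))
    y = X[:, 0] * X[:, 1] + 0.05 * rng.standard_normal(500)

    def fvu(y_true, y_pred):
        # Fraction of variance unexplained: SSE / SST, i.e. 1 - R^2.
        sse = np.sum((y_true - y_pred) ** 2)
        sst = np.sum((y_true - y_true.mean()) ** 2)
        return sse / sst

    # Parameter counts (weights + biases, or coefficients), kept roughly similar:
    #   MLP (5,):  2*5 + 5 + 5 + 1              = 21
    #   MLP (3,3): 2*3 + 3 + 3*3 + 3 + 3 + 1    = 25
    #   degree-5 polynomial in 2 variables: 20 monomials + intercept = 21
    models = {
        "single hidden layer MLP": MLPRegressor(hidden_layer_sizes=(5,),
                                                solver="lbfgs", max_iter=5000,
                                                random_state=0),
        "double hidden layer MLP": MLPRegressor(hidden_layer_sizes=(3, 3),
                                                solver="lbfgs", max_iter=5000,
                                                random_state=0),
        "polynomial regression": make_pipeline(
            PolynomialFeatures(degree=5, include_bias=False),
            LinearRegression()),
    }

    for name, model in models.items():
        model.fit(X, y)
        print(f"{name}: FVU = {fvu(y, model.predict(X)):.4f}")

Matching parameter counts exactly is not possible for arbitrary architectures, so the sketch, like the study's design as described, settles for a similar order of magnitude across the three model classes.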
