Journal of Machine Learning Research

MLPs (Mono-Layer Polynomials and Multi-Layer Perceptrons) for Nonlinear Modeling



Abstract

This paper presents a model selection procedure which stresses the importance of classic polynomial models as tools for evaluating the complexity of a given modeling problem and for removing non-significant input variables. If the complexity of the problem makes a neural network necessary, the selection among neural candidates can be performed in two phases. In the additive phase, the more important of the two, candidate neural networks with an increasing number of hidden neurons are trained. The addition of hidden neurons stops when the effect of round-off errors becomes significant, so that, for instance, confidence intervals can no longer be accurately estimated. This phase yields a set of approved candidate networks. In the subsequent subtractive phase, a selection among the approved networks is performed using statistical Fisher tests. The series of tests starts from a possibly too-large unbiased network (the full network) and ends with the smallest unbiased network whose input variables and hidden neurons all contribute significantly to the regression estimate. The method was successfully tested on the real-world regression problems proposed at the NIPS2000 Unlabeled Data Supervised Learning Competition; two of them are included here as illustrative examples.
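The subtractive phase described in the abstract relies on statistical Fisher tests between nested regression models. The following is a minimal sketch of such a nested-model F-test on a toy polynomial regression; the toy data, variable names, and the degree-1 vs. degree-2 comparison are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.stats import f as f_dist

# Toy data: a degree-2 polynomial signal plus Gaussian noise
# (illustrative only; not from the paper).
rng = np.random.default_rng(0)
N = 200
x = rng.uniform(-1.0, 1.0, N)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(0.0, 0.1, N)

def fit_rss(X, y):
    """Least-squares fit; return the residual sum of squares."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return float(resid @ resid)

X_small = np.column_stack([np.ones(N), x])        # restricted model: degree 1
X_full = np.column_stack([np.ones(N), x, x**2])   # full model: degree 2

rss_small, p_small = fit_rss(X_small, y), X_small.shape[1]
rss_full, p_full = fit_rss(X_full, y), X_full.shape[1]

# Fisher statistic for the nested comparison: does the full model's
# extra parameter reduce the residuals more than chance would?
F = ((rss_small - rss_full) / (p_full - p_small)) / (rss_full / (N - p_full))
p_value = f_dist.sf(F, p_full - p_small, N - p_full)

print(F, p_value)  # a large F (tiny p) keeps the extra quadratic term
```

In the paper's procedure, tests of this form are applied repeatedly, starting from the full (unbiased) network and stopping at the smallest model in which every input variable and hidden neuron remains significant.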
