ITERATIVE DESIGN OF NEURAL NETWORK CLASSIFIERS THROUGH REGRESSION

Abstract

A method which modifies the objective function used for designing neural network classifiers is presented. The classical mean-square error criterion is relaxed by introducing two types of local error bias which are treated as free parameters. Open- and closed-form solutions are given for finding these bias parameters. The new objective function is seamlessly integrated into existing training algorithms such as back propagation (BP), output weight optimization (OWO), and hidden weight optimization (HWO). The resulting algorithms are successfully applied to training neural network classifiers having a linear final layer. Classifiers are trained and tested on several data sets from pattern recognition applications. Improvement over classical iterative regression methods is clearly demonstrated.
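
The abstract does not spell out the exact form of the relaxed criterion, but the idea of local error biases treated as free parameters admits a simple closed-form illustration: with one-hot regression targets, any output error in the direction that cannot cause a misclassification can be absorbed by an optimally chosen bias, leaving only harmful errors in the mean-square criterion. The NumPy sketch below is purely illustrative of that reading; the function name relaxed_mse, the 0/1 target coding, and the sign convention for the biases are assumptions, not details taken from the paper.

```python
import numpy as np

def relaxed_mse(outputs, targets):
    """Relaxed mean-square error over (n_patterns, n_classes) arrays.

    Targets are assumed one-hot (1 for the correct class, 0 otherwise).
    Errors in the direction that cannot hurt classification -- outputs above
    the target on the correct class, or below the target on the other
    classes -- are absorbed by the closed-form optimal bias, i.e. clipped
    to zero, so only harmful errors contribute to the criterion.
    """
    err = targets - outputs
    harmless = np.where(targets > 0.5, err < 0.0, err > 0.0)
    err = np.where(harmless, 0.0, err)
    return float(np.mean(np.sum(err ** 2, axis=1)))

# Example: the overshoot on the correct class is forgiven; only the 0.2
# excess on the second (wrong) class is penalized.
y = np.array([[1.3, 0.2, -0.1]])
t = np.array([[1.0, 0.0, 0.0]])
print(relaxed_mse(y, t))  # ≈ 0.04
```

In a full training loop, the same clipping would be applied before each gradient or least-squares step, which is how a relaxed criterion of this kind can slot into BP-, OWO-, or HWO-style updates for a network with a linear final layer.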
