Pattern Recognition: The Journal of the Pattern Recognition Society

Training neural network classifiers through Bayes risk minimization applying unidimensional Parzen windows


Abstract

A new training algorithm for neural networks in binary classification problems is presented. It is based on minimizing an estimate of the Bayes risk, using Parzen windows applied to the final one-dimensional nonlinear transformation of the samples to estimate the probability of classification error. This leads to a very general approach to error minimization and training, in which the risk to be minimized is defined in terms of integrated one-dimensional Parzen windows, and the gradient descent algorithm used to minimize this risk depends on the window that is chosen. By relaxing the constraints typically applied to Parzen windows when they are used for probability density function estimation, for example by allowing them to be non-symmetric or infinite in duration, an entirely new set of training algorithms emerges. In particular, different Parzen windows lead to different cost functions, and some interesting relationships with classical training methods are discovered. Experiments with synthetic and real benchmark datasets show that with the appropriate choice of window, fitted to the specific problem, it is possible to improve the performance of neural network classifiers over those trained using classical methods. (C) 2017 Elsevier Ltd. All rights reserved.
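The idea in the abstract can be sketched concretely. Below is a minimal, hypothetical illustration (not the authors' implementation): a linear "network" with a one-dimensional output is trained by gradient descent on a Parzen-window estimate of the classification error. Assuming a Gaussian window of width `h`, the integrated window is the Gaussian CDF, so the empirical risk is the mean of Phi(-t*y/h) over samples with labels t in {-1, +1} and outputs y. The toy data, window width, and learning rate are illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Hypothetical toy data: two Gaussian blobs, labels in {-1, +1}
n = 200
X = np.vstack([rng.normal(-1.0, 1.0, (n, 2)), rng.normal(1.0, 1.0, (n, 2))])
t = np.concatenate([-np.ones(n), np.ones(n)])

# Gaussian window: its integral is the Gaussian CDF Phi, its value the pdf.
Phi = np.vectorize(lambda u: 0.5 * (1.0 + erf(u / sqrt(2.0))))
def gauss_pdf(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

# Linear model y = X @ w + b; the window is applied to this 1-D output.
w = np.zeros(2)
b = 0.0
h, lr = 1.0, 0.5          # window width and learning rate (assumed)
for _ in range(300):
    y = X @ w + b
    m = t * y             # signed margin
    # Risk estimate: mean of Phi(-m/h); its gradient w.r.t. y is
    # d/dy Phi(-t*y/h) = -(t/h) * pdf(t*y/h).
    g = -(t / h) * gauss_pdf(m / h)
    w -= lr * (X.T @ g) / len(t)
    b -= lr * g.mean()

risk = Phi(-(t * (X @ w + b)) / h).mean()   # Parzen estimate of error probability
acc = np.mean(np.sign(X @ w + b) == t)
```

Swapping the Gaussian window for another shape changes the cost function the gradient descends on, which is the mechanism by which different windows recover or generalize classical training criteria, as the abstract notes.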
