International Conference on Artificial Neural Networks

Improving the performance of multi-layer perceptrons where limited training data are available for some classes



Abstract

The standard multi-layer perceptron (MLP) training algorithm implicitly assumes that equal numbers of examples are available to train each of the network classes. However, in many condition monitoring and fault diagnosis (CMFD) systems, data representing fault conditions can only be obtained with great difficulty: as a result, training classes may vary greatly in size, and the overall performance of an MLP classifier may be comparatively poor. In this paper, we describe two novel techniques which can help ameliorate the impact of unequal training set sizes. We demonstrate the effectiveness of these techniques using simulated fault data representative of that found in a broad class of CMFD problems.
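The abstract does not specify the paper's two techniques, so the following is only a minimal sketch of one common mitigation for unequal class sizes: oversampling the minority class so every class contributes equally to each training epoch. The toy data and the `oversample_minority` helper are illustrative assumptions, not the authors' method or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Imbalanced toy data: 100 "normal" examples, 10 "fault" examples
# (a hypothetical stand-in for CMFD data, not the paper's dataset).
X_normal = rng.normal(0.0, 1.0, size=(100, 4))
X_fault = rng.normal(3.0, 1.0, size=(10, 4))
X = np.vstack([X_normal, X_fault])
y = np.array([0] * 100 + [1] * 10)

def oversample_minority(X, y, rng):
    """Duplicate minority-class examples (sampled with replacement)
    until every class is as large as the largest one."""
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    Xs, ys = [], []
    for c, n in zip(classes, counts):
        idx = np.flatnonzero(y == c)
        extra = rng.choice(idx, size=target - n, replace=True)
        keep = np.concatenate([idx, extra])
        Xs.append(X[keep])
        ys.append(y[keep])
    return np.vstack(Xs), np.concatenate(ys)

X_bal, y_bal = oversample_minority(X, y, rng)
print(np.bincount(y_bal))  # [100 100]
```

After resampling, both classes present 100 examples per epoch, so the MLP's weight updates are no longer dominated by the majority class; class-weighted loss functions are an alternative that avoids duplicating data.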

