
Growing Subspace Pattern Recognition Methods and Their Neural-Network Models


Abstract

In statistical pattern recognition, the decision of which features to use is usually left to human judgment; automatic methods are desirable where possible. Like multilayer perceptrons, learning subspace methods (LSMs) have the potential to integrate feature extraction and classification. In this paper, we propose two new algorithms, along with their neural-network implementations, to overcome certain limitations of the earlier LSMs. By introducing one cluster at a time and adapting it when necessary, we remove the need to decide by trial and error how many clusters each class should have. By combining this strategy with principal component analysis neural networks, we propose neural-network models that better address another limitation, scalability. Our results indicate that the proposed classifiers are comparable to classifiers such as multilayer perceptrons and the nearest-neighbor classifier in terms of classification accuracy. In terms of classification speed and design scalability, they appear better suited to high-dimensional problems.
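As context for the abstract, the following is a minimal, illustrative sketch of the classical subspace classification rule that learning subspace methods build on: each class is represented by a low-dimensional PCA subspace, and a sample is assigned to the class whose subspace captures the largest share of its energy. It is not the paper's growing algorithm or its neural-network implementation; the helper names (fit_class_subspaces, classify), the use of NumPy's SVD, and the subspace dimension are assumptions made for this example.

import numpy as np

def fit_class_subspaces(X, y, subspace_dim=3):
    """Fit one PCA subspace per class from the training samples.

    X: (n_samples, n_features) array, y: (n_samples,) integer labels.
    Returns a dict mapping class label -> (n_features, subspace_dim) orthonormal basis.
    """
    bases = {}
    for label in np.unique(y):
        Xc = X[y == label]
        # Correlation-style PCA (no mean removal), as in classical subspace
        # methods: the leading right-singular vectors of the class data
        # matrix span the class subspace.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        bases[label] = Vt[:subspace_dim].T
    return bases

def classify(x, bases):
    """Assign x to the class whose subspace captures the most of its energy,
    i.e. the largest squared norm of the projection of x onto that subspace."""
    scores = {label: np.sum((B.T @ x) ** 2) for label, B in bases.items()}
    return max(scores, key=scores.get)

# Toy usage with random data (hypothetical dimensions).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))
y = rng.integers(0, 3, size=200)
bases = fit_class_subspaces(X, y, subspace_dim=3)
print(classify(X[0], bases))

In the growing strategy described in the abstract, a class would not fix its number of cluster subspaces in advance; instead, a new cluster is introduced one at a time and adapted when the existing ones represent the training data poorly, which is what removes the trial-and-error choice of cluster count.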