IEEE Transactions on Neural Networks

Comparing Support Vector Machines and Feedforward Neural Networks With Similar Hidden-Layer Weights


Abstract

Support vector machines (SVMs) usually need a large number of support vectors to form their output. Recently, several models have been proposed to build SVMs with a small number of basis functions while maintaining the property that their hidden-layer weights are a subset of the data (the support vectors). This property is also present in some algorithms for feedforward neural networks (FNNs) that construct the network sequentially, leading to sparse models in which the number of hidden units can be explicitly controlled. An experimental study on several benchmark data sets, comparing SVMs and the aforementioned sequential FNNs, was carried out. The experiments were performed under the same conditions for all the models, and they can be seen as a comparison of SVMs and FNNs when both models are restricted to use similar hidden-layer weights. Accuracies were found to be very similar. Regarding the number of support vectors, sequential FNNs constructed models with fewer hidden units than standard SVMs, and in the same range as "sparse" SVMs. Computational times were lower for SVMs.
