ISNN 2013

A Tensor Factorization Based Least Squares Support Tensor Machine for Classification



Abstract

In the fields of machine learning, image processing, and pattern recognition, the existing least squares support tensor machine for tensor classification involves a non-convex optimization problem and must be solved by an iterative technique. This is very time-consuming and may suffer from local minima. To overcome these two shortcomings, in this paper we present a tensor factorization based least squares support tensor machine (TFLS-STM) for tensor classification. TFLS-STM combines the merits of the least squares support vector machine (LS-SVM) and tensor rank-one decomposition. Theoretically, TFLS-STM is an extension of the linear LS-SVM to tensor patterns; when the input patterns are vectors, TFLS-STM degenerates into the standard linear LS-SVM. A set of experiments is conducted on six second-order face recognition datasets to illustrate the performance of TFLS-STM. The experimental results show that the training speed of TFLS-STM is faster than those of the alternating projection LS-STM (APLS-STM) and LS-SVM. In terms of testing accuracy, TFLS-STM is comparable with LS-SVM and superior to APLS-STM.
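The abstract names two ingredients without giving formulation details: a rank-one decomposition of each second-order tensor and a linear LS-SVM. The minimal Python sketch below illustrates how these pieces can fit together, using a truncated SVD as a stand-in for tensor rank-one decomposition and solving the standard linear KKT system of an LS-SVM. The function names and toy data are illustrative assumptions; this is not the paper's TFLS-STM algorithm.

```python
import numpy as np

def rank_one_feature(X):
    # Best rank-one approximation of a second-order tensor (matrix) via SVD,
    # flattened into a feature vector. Stand-in for tensor rank-one decomposition.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (s[0] * np.outer(U[:, 0], Vt[0, :])).ravel()

def lssvm_train(X, y, gamma=1.0):
    # Linear LS-SVM: solve the (n+1)x(n+1) KKT linear system
    #   [0   1^T      ] [b    ]   [0]
    #   [1   K + I/g  ] [alpha] = [y],   with K the linear kernel.
    n = X.shape[0]
    K = X @ X.T
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    w = X.T @ alpha          # recover primal weights, so prediction is sign(Xw + b)
    return w, b

# Toy usage: 20 synthetic 8x8 "images" whose class is a +/-1 mean shift.
rng = np.random.default_rng(0)
labels = np.array([1.0 if i % 2 else -1.0 for i in range(20)])
tensors = [rng.standard_normal((8, 8)) + y for y in labels]
feats = np.array([rank_one_feature(T) for T in tensors])

w, b = lssvm_train(feats, labels, gamma=10.0)
pred = np.sign(feats @ w + b)
print("training accuracy:", (pred == labels).mean())
```

In this sketch the rank-one step acts only as a feature map before a standard vector classifier; the paper's contribution, by contrast, is a tensor classifier trained without the alternating-projection iterations of the existing LS-STM.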
