Least-squares LDA via rank-one updates with concept drift

2011 IEEE Statistical Signal Processing Workshop


Abstract

Standard linear discriminant analysis (LDA) is known to be computationally expensive due to the need to perform eigen-analysis. Based on the recent success of least-squares LDA (LSLDA), we propose a novel rank-one update method for LSLDA, which not only alleviates the computational and memory requirements but also solves the adaptive learning task under concept drift. In other words, our proposed LSLDA can efficiently capture information from recently received data whose distribution changes gradually or abruptly. Moreover, our LSLDA can be extended to recognize data with newly added class labels during the learning process, and thus exhibits excellent scalability. Experimental results on both synthetic and real datasets confirm the effectiveness of our proposed method.
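The abstract does not reproduce the paper's update equations. As a rough illustration of the general idea only, the sketch below applies a recursive least-squares rank-one update with a forgetting factor to an indicator-coded regression view of LDA. The class name RankOneLSLDA, the parameters lam and delta, and the methods partial_fit, add_class, and predict are hypothetical stand-ins for the kind of update the abstract describes, not the authors' exact algorithm.

```python
import numpy as np

class RankOneLSLDA:
    """Sketch: least-squares view of LDA maintained by rank-one updates.

    Keeps P, an estimate of the inverse data scatter, and W, the regression
    weights onto class-indicator targets. Each new sample triggers a
    Sherman-Morrison rank-one refresh; a forgetting factor lam < 1
    down-weights old samples so the model can track concept drift.
    """

    def __init__(self, dim, n_classes, lam=0.98, delta=1e3):
        self.lam = lam                        # forgetting factor (1.0 = no forgetting)
        self.P = delta * np.eye(dim + 1)      # inverse scatter estimate (bias included)
        self.W = np.zeros((dim + 1, n_classes))
        self.n_classes = n_classes

    def partial_fit(self, x, label):
        """Incorporate one sample (x, label) via a rank-one update."""
        x = np.append(np.asarray(x, dtype=float), 1.0)   # append bias term
        y = np.zeros(self.n_classes)
        y[label] = 1.0                                    # indicator target encoding
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)                      # gain vector
        self.W += np.outer(k, y - x @ self.W)             # rank-one weight correction
        self.P = (self.P - np.outer(k, Px)) / self.lam    # Sherman-Morrison update of P

    def add_class(self):
        """Append a zero column so a newly added class label can be learned."""
        self.W = np.hstack([self.W, np.zeros((self.W.shape[0], 1))])
        self.n_classes += 1

    def predict(self, x):
        x = np.append(np.asarray(x, dtype=float), 1.0)
        return int(np.argmax(x @ self.W))
```

With lam close to 1 the recursion approaches an ordinary least-squares fit over all past samples; lowering lam discounts older data more aggressively, which is the usual way such recursions track gradual or abrupt changes in the data distribution.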
