International Conference on Algorithmic Learning Theory

Dimension-Adaptive Bounds on Compressive FLD Classification



Abstract

Efficient dimensionality reduction by random projections (RP) has gained popularity, so the learning guarantees achievable in RP spaces are of great interest. In the finite-dimensional setting, it has been shown for the compressive Fisher Linear Discriminant (FLD) classifier that, for good generalisation, the required target dimension grows only as the log of the number of classes and is not adversely affected by the number of projected data points. However, these bounds depend on the dimensionality d of the original data space. In this paper we give further guarantees that remove d from the bounds under certain regularity conditions on the data density structure. In particular, if the data density does not fill the ambient space, then the error of compressive FLD is independent of the ambient dimension and depends only on a notion of 'intrinsic dimension'.
