
Direct Estimation of the Derivative of Quadratic Mutual Information with Application in Supervised Dimension Reduction



Abstract

A typical goal of linear supervised dimension reduction is to find a low-dimensional subspace of the input space such that the projected input variables preserve maximal information about the output variables. The dependence-maximization approach solves the supervised dimension-reduction problem by maximizing a statistical dependence between projected input variables and output variables. A well-known statistical dependence measure is mutual information (MI), which is based on the Kullback-Leibler (KL) divergence. However, the KL divergence is known to be sensitive to outliers. Quadratic MI (QMI) is a variant of MI based on the L2 distance, which is more robust against outliers than the KL divergence, and a computationally efficient method to estimate QMI from data, least-squares QMI (LSQMI), has been proposed recently. For these reasons, developing a supervised dimension-reduction method based on LSQMI seems promising. However, it is the derivative of QMI, not QMI itself, that is needed for subspace search in linear supervised dimension reduction, and the derivative of an accurate QMI estimator is not necessarily a good estimator of the derivative of QMI. In this letter, we propose to directly estimate the derivative of QMI without estimating QMI itself. We show that this direct estimation of the derivative of QMI is more accurate than the derivative of the estimated QMI. Finally, we develop a linear supervised dimension-reduction algorithm that efficiently uses the proposed derivative estimator and demonstrate through experiments that the proposed method is more robust against outliers than existing methods.
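To make the QMI measure discussed above concrete, the following is a minimal sketch of a naive Parzen-window plug-in estimate of QMI = ∫∫ (p(x, y) − p(x)p(y))² dx dy with Gaussian kernels. This is not the paper's LSQMI estimator; the function names, bandwidth choice, and the plug-in construction are illustrative assumptions.

```python
import numpy as np

def gauss_kernel_matrix(z, sigma):
    # Pairwise Gaussian kernel values G_sigma(z_i - z_j) for samples in rows of z.
    sq_dists = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=2)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def qmi_plugin(x, y, sigma=1.0):
    """Naive Parzen plug-in estimate of quadratic mutual information,
    the squared L2 distance between p(x, y) and p(x)p(y), up to kernel
    normalization. Illustrative only -- not the paper's LSQMI estimator."""
    Kx = gauss_kernel_matrix(x, sigma)
    Ky = gauss_kernel_matrix(y, sigma)
    v_joint = np.mean(Kx * Ky)              # "joint" information potential
    v_marginal = np.mean(Kx) * np.mean(Ky)  # product-of-marginals potential
    v_cross = np.mean(Kx.mean(axis=1) * Ky.mean(axis=1))  # cross potential
    return v_joint + v_marginal - 2.0 * v_cross
```

For positive-definite Gaussian kernels this quadratic form is nonnegative, and it grows with the statistical dependence between x and y; e.g., samples with y ≈ x yield a clearly larger value than independent samples. A supervised dimension-reduction method would maximize this quantity over projections of x, which is where an accurate estimate of the derivative of QMI, the paper's contribution, becomes essential.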

Bibliographic details

  • Source
    Neural Computation | 2017, Issue 8 | pp. 2076-2122 | 47 pages
  • Author affiliations

    University of Tokyo, Bunkyo-ku, Tokyo 113-0033, Japan;

    Nara Institute of Science and Technology, Ikoma, Nara 630-0192, Japan, and RIKEN Center for Advanced Intelligence Project, Chuo-ku, Tokyo 103-0027, Japan;

    RIKEN Center for Advanced Intelligence Project, Chuo-ku, Tokyo 103-0027, Japan, and University of Tokyo, Bunkyo-ku, Tokyo 113-0033, Japan;

  • Indexed in: Science Citation Index (SCI); Chemical Abstracts (CA)
  • Format: PDF
  • Language: English
