Source: 電子情報通信学会技術研究報告 (IEICE Technical Report)

Feature Selection via ℓ_1-Penalized Squared-Loss Mutual Information


Abstract

Feature selection is a technique to screen out less important features. Many existing supervised feature selection algorithms use redundancy and relevancy as the main criteria to select features. However, feature interaction, potentially a key characteristic in real-world problems, has not received much attention. As an attempt to take feature interaction into account, we propose ℓ_1-LSMI, an ℓ_1-regularization based algorithm that maximizes a squared-loss variant of mutual information between selected features and outputs. Numerical results show that ℓ_1-LSMI performs well in handling redundancy, detecting non-linear dependency, and considering feature interaction.
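The dependence measure at the core of the method is squared-loss mutual information (SMI), which admits an analytic least-squares estimator (LSMI): fit the density ratio p(x,y)/(p(x)p(y)) with a kernel basis and plug the fit into the SMI formula. The sketch below is an illustrative LSMI estimator only, not the authors' full ℓ_1-LSMI procedure (which would further optimize ℓ_1-constrained feature weights over this score); the kernel width, regularization value, and function names are assumptions for the example.

```python
import numpy as np

def lsmi(x, y, sigma=1.0, lam=0.1):
    """Least-squares estimate of squared-loss mutual information (SMI).

    Fits the density ratio r(x, y) = p(x, y) / (p(x) p(y)) with Gaussian
    kernel basis functions centred on the paired samples, then uses
    SMI ~= (1/2) E_{p(x,y)}[r] - 1/2 with the fitted ratio plugged in.
    Hyperparameters sigma and lam are illustrative, not tuned.
    """
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    # Gaussian kernel matrices over x and y separately (bases at samples).
    Kx = np.exp(-((x[:, None, :] - x[None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    Ky = np.exp(-((y[:, None, :] - y[None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    # h_l = (1/n) sum_i phi_l(x_i, y_i): basis evaluated on joint pairs.
    h = (Kx * Ky).mean(axis=0)
    # H_{ll'} = (1/n^2) sum_{i,j} phi_l(x_i, y_j) phi_{l'}(x_i, y_j):
    # cross-moments under the product of marginals.
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2
    # Regularized least-squares fit of the density-ratio coefficients.
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    return 0.5 * h @ alpha - 0.5  # SMI estimate (zero iff independent, up to noise)

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y_dep = x + 0.1 * rng.normal(size=200)  # strongly dependent pair
y_ind = rng.normal(size=200)            # independent pair
# The dependent pair should receive a clearly larger SMI score.
print(lsmi(x, y_dep), lsmi(x, y_ind))
```

In the full method, each feature k would be scaled by a weight w_k inside the kernel, and the weights maximized under an ℓ_1 constraint so that unhelpful features are driven exactly to zero.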
