International Joint Conference on Neural Networks

Optimizing F-measure with non-convex loss and sparse linear classifiers



Abstract

F-measure is a popular performance metric for classification on unbalanced datasets. Optimizing this measure directly is challenging since no closed-form solution exists. Current algorithms approximate the F-measure and design classifiers within maximum-margin or logistic-regression frameworks. These algorithms are not scalable, and the resulting classifiers are not robust to outliers. In this work, we propose a general framework for approximate F-measure maximization. We also propose a non-convex loss function that is robust to outliers. Using an elastic net regularizer in the problem formulation enables simultaneous classifier design and feature selection. We present an efficient algorithm for the proposed formulation that is simple and easy to implement. Numerical experiments on real-world benchmark datasets demonstrate that the proposed algorithm is fast and achieves better generalization performance than some existing approaches. It is thus a powerful alternative for optimizing F-measure while designing a sparse classifier.
