International Conference on Neural Information Processing

Feature Selection Using Smooth Gradient L_(1/2) Regularization



Abstract

This paper develops a novel feature selection method for neural network models based on L_(1/2) regularization. Due to the non-convex, non-smooth, and non-Lipschitz characteristics of the L_(1/2) regularizer, the gradient descent method cannot be directly employed to train multilayer perceptron neural networks. A smoothing technique is therefore used to approximate the original L_(1/2) regularizer. The proposed method is a two-stage updating approach. First, a multilayer network model with the smoothed L_(1/2) regularizer is trained to eliminate the unimportant features. Second, the compact model is trained without regularization until its performance no longer improves. The experiments demonstrate that the presented algorithm significantly reduces redundant features while maintaining considerable model accuracy.
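The two-stage procedure described in the abstract can be sketched as follows. This is a minimal illustration only: a linear model stands in for the multilayer perceptron, the smooth surrogate (w² + ε)^(1/4) for |w|^(1/2) and the pruning threshold 0.05 are hypothetical choices, and the paper's own smoothing function and network architecture differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth_l12(w, eps=1e-4):
    """Smooth surrogate for the L_(1/2) penalty sum_i |w_i|^(1/2).

    (w^2 + eps)^(1/4) is differentiable everywhere, including at 0,
    and approaches |w|^(1/2) as eps -> 0. This particular surrogate is
    an illustrative choice, not the paper's smoothing function.
    """
    return np.sum((w**2 + eps) ** 0.25)

def smooth_l12_grad(w, eps=1e-4):
    """Gradient of the surrogate: 0.5 * w * (w^2 + eps)^(-3/4)."""
    return 0.5 * w * (w**2 + eps) ** (-0.75)

# Synthetic data: only the first 2 of 10 features are informative.
X = rng.normal(size=(200, 10))
y = X[:, 0] * 2.0 - X[:, 1] * 3.0 + 0.01 * rng.normal(size=200)

def train(X, y, lam, steps=2000, lr=0.01):
    """Gradient descent on squared error plus lam * smoothed L_(1/2)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y) + lam * smooth_l12_grad(w)
        w -= lr * grad
    return w

# Stage 1: train with the smoothed L_(1/2) penalty; the penalty drives
# weights of unimportant features toward zero, so prune by magnitude.
w1 = train(X, y, lam=0.1)
keep = np.abs(w1) > 0.05  # hypothetical pruning threshold

# Stage 2: retrain the compact model without regularization.
w2 = train(X[:, keep], y, lam=0.0)
```

In stage 1 the penalty gradient is large near zero, shrinking the weights of redundant features quickly, while the weights of informative features remain essentially untouched; stage 2 then recovers unbiased estimates on the surviving features.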

