IFAC International Workshop on Adaptation and Learning in Control and Signal Processing

An Information Theoretic Approach to Constructing Machine Learning Criteria

Abstract

Selecting a learning criterion is a constituent part of stating a machine learning problem: it requires accounting both for the criterion's adequacy to the available data and for the practical feasibility of its implementation. The paper presents an approach to machine learning based on information-theoretic criteria derived from the Renyi entropy of arbitrary order. A parameterized description of the learning problem is combined with a corresponding technique for estimating mutual information constructed from Renyi entropies. This finally leads to a finite-dimensional optimization problem to be solved by a suitable technique. The proposed treatment is preceded by a thorough review of existing information-theoretic and entropy-based approaches to machine learning. The paper has been supported by a grant of the Russian Foundation for Basic Research (RFBR): project 12-08-01205-a.
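The abstract does not spell out the construction, but the general recipe it points to, a Renyi-entropy (or Renyi-based mutual-information) estimate built from a kernel density estimator and used as a learning criterion that is minimized over model parameters, can be illustrated with a minimal sketch. The Python code below is an assumption-laden illustration of this family of criteria drawn from the information-theoretic-learning literature (a Parzen plug-in Renyi entropy estimator, a heuristic Renyi-based dependence measure, and a minimum-error-entropy fit of a one-parameter model); it is not the authors' specific method, and the bandwidth `sigma`, the order `alpha`, and the toy model are all hypothetical choices.

```python
import numpy as np

def renyi_entropy(x, alpha=2.0, sigma=0.5):
    """Plug-in (Parzen/KDE) estimate of the Renyi entropy of order alpha != 1,
    H_alpha = (1 / (1 - alpha)) * log E[p(X)^(alpha - 1)],
    using a Gaussian kernel of bandwidth sigma evaluated at the sample points."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]                                    # N samples, 1 feature
    n, d = x.shape
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    k = np.exp(-d2 / (2.0 * sigma ** 2)) / ((np.sqrt(2.0 * np.pi) * sigma) ** d)
    p_hat = k.mean(axis=1)                                 # KDE evaluated at each sample
    return np.log(np.mean(p_hat ** (alpha - 1.0))) / (1.0 - alpha)

def renyi_mi_criterion(u, v, alpha=2.0, sigma=0.5):
    """Heuristic dependence measure built from Renyi entropies:
    I(U;V) ~ H(U) + H(V) - H(U,V).
    The additive decomposition is exact only for Shannon entropy; here it is
    used purely as an illustrative learning criterion."""
    joint = np.column_stack([u, v])
    return (renyi_entropy(u, alpha, sigma)
            + renyi_entropy(v, alpha, sigma)
            - renyi_entropy(joint, alpha, sigma))

if __name__ == "__main__":
    # Toy finite-dimensional optimization: pick the scalar weight w of the
    # parameterized model y_hat = w * x by minimizing the Renyi quadratic
    # entropy of the prediction error (minimum-error-entropy criterion).
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = 1.7 * x + 0.3 * rng.normal(size=200)              # "true" weight is 1.7
    grid = np.linspace(0.0, 3.0, 61)                      # crude grid search
    scores = [renyi_entropy(y - w * x) for w in grid]
    print("selected w =", grid[int(np.argmin(scores))])   # close to 1.7
```

The grid search stands in for the "suitable technique" of finite-dimensional optimization mentioned in the abstract; in practice any gradient-based or derivative-free optimizer over the model parameters would play the same role.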
