
A Measure of Relative Entropy between Individual Sequences with Application to Universal Classification



Abstract

A new notion of empirical informational divergence between two individual sequences is introduced. If the two sequences are independent realizations of two stationary Markov processes, the empirical relative entropy converges to the true divergence almost surely. This new empirical divergence is based on a version of the Lempel-Ziv data compression algorithm. A simple universal algorithm, based on the empirical divergence, is introduced for classifying individual sequences into a finite number of classes. For almost every given training set and almost any test sequence from these classes, it discriminates between the classes whenever they are distinguishable by some finite-memory classifier. It is universal in the sense of being independent of the unknown sources.
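The abstract describes an empirical divergence built from Lempel-Ziv parsing: the test sequence is cross-parsed with respect to a training sequence, and the resulting phrase count is compared against an ordinary LZ self-parsing. Below is a minimal, illustrative Python sketch of this cross-parsing idea, assuming a natural phrase rule (longest prefix of the remainder that occurs as a substring of the reference, extended by one symbol); the function names are ours, and the exact phrase rule and normalization used in the paper may differ.

```python
import math

def lz78_phrase_count(z):
    """Number of phrases in an incremental (LZ78-style) self-parsing of z."""
    phrases, cur = set(), ""
    for ch in z:
        cur += ch
        if cur not in phrases:
            phrases.add(cur)
            cur = ""
    return len(phrases) + (1 if cur else 0)

def cross_phrase_count(z, x):
    """Number of phrases when z is sequentially parsed with respect to x:
    each phrase is the longest prefix of the remainder of z that occurs as a
    substring of x, extended by one symbol (a novel symbol stands alone)."""
    i, c, n = 0, 0, len(z)
    while i < n:
        l = 1
        while i + l <= n and z[i:i + l] in x:
            l += 1
        i += l  # consume the longest match plus one extra symbol
        c += 1
    return c

def zm_divergence(z, x):
    """Empirical divergence estimate of z with respect to x (bits/symbol):
    cross-parsing complexity of z given x minus the LZ self-complexity of z."""
    n = len(z)
    c_cross = cross_phrase_count(z, x)
    c_self = lz78_phrase_count(z)
    return (c_cross * math.log2(n) - c_self * math.log2(c_self)) / n
```

For classification, a test sequence would be assigned to the class whose training sequence yields the smallest estimated divergence; e.g. `zm_divergence("ab" * 50, "ab" * 50)` is far smaller than `zm_divergence("ab" * 50, "cd" * 50)`, since the cross-parsing of a sequence against a statistically similar reference needs very few phrases.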
