A Decomposition Method for Global Evaluation of Shannon Entropy and Local Estimations of Algorithmic Complexity

Abstract

We investigate the properties of a Block Decomposition Method (BDM), which extends the power of a Coding Theorem Method (CTM). CTM approximates local estimations of algorithmic complexity based on Solomonoff–Levin's theory of algorithmic probability, providing a closer connection to algorithmic complexity than previous attempts based on statistical regularities, such as popular lossless compression schemes. The strategy behind BDM is to find small computer programs that produce the components of a larger, decomposed object. The set of short computer programs can then be artfully arranged in sequence so as to produce the original object. We show that the method provides efficient estimations of algorithmic complexity, but that it performs like Shannon entropy when it loses accuracy. We estimate errors and study the behaviour of BDM for different boundary conditions, all of which are compared and assessed in detail. The measure can be adapted for use with multi-dimensional objects beyond strings, such as arrays and tensors. To test the measure, we demonstrate the power of CTM on low-algorithmic-randomness objects that are assigned maximal entropy (e.g., π) but whose numerical approximations are closer to the theoretical low-algorithmic-randomness expectation. We also test the measure on larger objects, including dual, isomorphic and cospectral graphs, for which algorithmic randomness is known to be low. We also release implementations of the methods in most major programming languages (Wolfram Language/Mathematica, Matlab, R, Perl, Python, Pascal, C++ and Haskell), as well as an online algorithmic complexity calculator.
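The aggregation rule described in the abstract can be sketched in a few lines of Python, one of the languages in which the methods are released. This is a minimal illustration rather than the released implementation: it assumes a precomputed CTM lookup table mapping short blocks to complexity estimates, and the tiny table below uses made-up placeholder values, not genuine CTM output. The object is cut into non-overlapping blocks, and BDM sums, over the unique blocks, each block's CTM value plus the base-2 logarithm of its multiplicity.

import math
from collections import Counter

# Hypothetical CTM table for 4-bit blocks; the released packages ship
# genuine tables computed from algorithmic probability.
ctm = {
    "0000": 10.2, "1111": 10.2, "0101": 12.5, "1010": 12.5, "0110": 13.1,
}

def bdm(string, block_size=4):
    # Partition the string into non-overlapping blocks.
    blocks = [string[i:i + block_size]
              for i in range(0, len(string), block_size)]
    # Sum CTM(block) + log2(multiplicity) over the unique blocks.
    return sum(ctm[b] + math.log2(n)
               for b, n in Counter(blocks).items())

print(bdm("0000" * 8))           # one repeated block: 10.2 + log2(8) = 13.2
print(bdm("0000111101011010"))   # four distinct blocks: 45.4

The log2(multiplicity) term also hints at the behaviour noted above: once an object is much larger than the blocks the CTM table covers, multiplicities dominate the sum and BDM effectively counts block frequencies, which is consistent with the abstract's remark that BDM performs like Shannon entropy when it loses accuracy.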
