
Learning, Entropy, Free Energy, an Underlying Commonality?



Abstract

Statistical mechanics, due primarily to Maxwell, Gibbs, and Boltzmann in the nineteenth century, has proven to be a useful model for drawing inferences about the collective behavior of individual objects that interact according to a known force law (in more general usage, interacting units). Collective behavior is not determined by computing F = ma for each interacting unit, because that problem is mathematically intractable. Instead, one computes the partition function for the collection of interacting units and predicts statistical behavior from it. Jaynes united statistical mechanics with Bayesian inference, demonstrating that the partition-function assignment of probabilities via the interaction Hamiltonian is the solution to the Bayesian assignment of probabilities based on Shannon's maximum entropy method with known means and standard deviations. Once this technique has been applied to a problem and a solution obtained, one can, of course, solve the inverse problem to determine what interaction model gives rise to a given probability assignment. Statistical mechanics therefore allows one to draw general inferences about any complex system, including networks, by defining the "energy", "heat capacity", "temperature", and other thermodynamic characteristics of the system against the common standard of the Helmholtz free energy. Principe has noted that the entropy used in reasoning under uncertainty may not be the most appropriate entropy for learning mechanisms; instead he has explored Renyi entropy and derived a form of information-theoretic learning dynamics with some promising features. To help realize the potential of such a generalized entropy for the three aspects of survival, we suggest some connections to free energy and learning. We also connect some aspects of sensing to probability distributions, which suggests why certain search strategies perform better than others. In making these connections, we suggest that a fundamental connection waits to be discovered between inference, learning, and the manner in which sensing mechanisms perform.
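
For reference, the maximum-entropy route from constraints to the partition function and the Helmholtz free energy, together with the Renyi generalization mentioned in the abstract, can be sketched in its standard textbook form (these are not equations taken from the paper itself):

\[
S[p] = -\sum_i p_i \ln p_i \quad \text{maximized subject to} \quad \sum_i p_i = 1, \quad \sum_i p_i E_i = \langle E \rangle,
\]
\[
\Rightarrow \quad p_i = \frac{e^{-\beta E_i}}{Z(\beta)}, \qquad Z(\beta) = \sum_i e^{-\beta E_i}, \qquad F = -\frac{1}{\beta}\ln Z(\beta) = \langle E \rangle - \frac{1}{\beta} S,
\]
\[
\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}, \qquad C = \frac{\partial \langle E \rangle}{\partial T}, \qquad H_\alpha[p] = \frac{1}{1-\alpha}\ln \sum_i p_i^{\alpha} \;\xrightarrow{\;\alpha \to 1\;}\; S[p].
\]

Here the "temperature", "energy", and "heat capacity" of a complex system are read off from derivatives of \(\ln Z\), and the Renyi entropy \(H_\alpha\) recovers the Shannon entropy in the limit \(\alpha \to 1\).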
