World Congress on Engineering

A Computational Test of the Information-Theory Based Entropy Theory of Perception: Does It Actually Generate the Stevens and Weber-Fechner Laws of Sensation?

Abstract

K.H. Norwich et al. used Shannon information theory to derive their Entropy Theory of Perception (1975-present). The Entropy Theory produces the Entropy Equation, which relates the strength of sensation (represented by magnitude estimates) to the intensity of the sensory stimulus. At "high" intensities the relation is approximately logarithmic, which Norwich et al. dubbed "the Weber-Fechner Law"; at "low" intensities it is approximately a power function, dubbed "Stevens' Law". Unfortunately, the Entropy Equation has three unknowns, so what constitutes "high" and "low" can only be established through curve-fitting. Remarkably, that curve-fitting was never done. Establishing the parameter values is especially important because one of the unknowns is a power exponent (the "Entropy Exponent", here denoted y) said to be identical in value to "Stevens' exponent" (here denoted x). The identity y=x was crucial to the numerous published applications of the Entropy Theory to psychophysical and neurophysiological phenomena. Fitting the Entropy Equation to magnitude estimates would therefore establish the ranges of the "Weber-Fechner" and "Stevens" laws and reveal whether y=x. The present author did the curve-fitting, following the custom in the literature: logarithmic forms of the Entropy Equation and of Stevens' Law were fitted by least-squares regression to log(magnitude estimate) vs. log(stimulus strength) taken from 64 published magnitude-estimation curves. The resulting relation of y to x was broadly scattered; in 62 of 64 cases, y exceeded x. In theory, the fitted Entropy Equation allows calculation of the information transmitted in perception, so the regressions were re-run with the transmitted information constrained to 2.5 bits/stimulus, the mean value reported in the literature. Under the constrained regression, y≈1.7x. Altogether, the purported equality of the Entropy Exponent and Stevens' exponent was not confirmed. Further, neither the "Weber-Fechner Law" nor "Stevens' Law" derived from any fitted Entropy Equation described the entire range of the respective magnitude-estimation curve, contrary to the formal use of those laws. Norwich's later quantification of sensation growth by "physical entropy" repeats the same mistakes. All of this emphasizes that the Entropy Theory does not derive rules of sensory perception from information theory, and further attempts to do so should be discouraged.
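For context, the Entropy Equation referred to above is commonly written in Norwich's work as F = (k/2)·ln(1 + β·I^n), where F is the magnitude of sensation, I is stimulus intensity, and k, β and n (the Entropy Exponent, y above) are the three unknowns. When β·I^n is small the equation reduces to the power function F ≈ (kβ/2)·I^n ("Stevens' Law"); when β·I^n is large it reduces to the logarithmic form F ≈ (kn/2)·ln I + (k/2)·ln β ("the Weber-Fechner Law"). The sketch below illustrates the kind of least-squares fitting the abstract describes, assuming that form of the equation; the synthetic data, starting values, and function names are illustrative only and are not taken from the paper.

    import numpy as np
    from scipy.optimize import curve_fit

    def log_entropy_eq(log_I, k, beta, n):
        # log10 of F = (k/2)*ln(1 + beta*I**n), evaluated at log10(I).
        # Arguments of the logarithms are clipped so they stay defined if the
        # optimizer tries non-physical parameter values during iteration.
        I = 10.0 ** log_I
        F = 0.5 * k * np.log(np.maximum(1.0 + beta * I ** n, 1.0 + 1e-12))
        return np.log10(np.maximum(F, 1e-12))

    def fit_one_curve(log_I, log_F):
        # Stevens' Law is a straight line in log-log coordinates,
        # log F = log a + x*log I, so ordinary least squares gives x directly.
        x_stevens, _ = np.polyfit(log_I, log_F, 1)
        # The logarithmic form of the Entropy Equation needs nonlinear least squares.
        (k, beta, n), _ = curve_fit(log_entropy_eq, log_I, log_F,
                                    p0=[1.0, 1.0, max(x_stevens, 0.1)])
        return x_stevens, n  # Stevens' exponent x and Entropy Exponent y

    # Illustrative (synthetic) magnitude-estimation curve, not data from the paper:
    # generated from the Entropy Equation with k=2.0, beta=0.3, n=0.6.
    log_I = np.linspace(0.0, 4.0, 20)
    log_F = np.log10(0.5 * 2.0 * np.log(1.0 + 0.3 * (10.0 ** log_I) ** 0.6))
    x, y = fit_one_curve(log_I, log_F)
    print(f"Stevens' exponent x = {x:.2f}, Entropy Exponent y = {y:.2f}")

On such a curve the fitted Entropy Exponent y recovers the generating value while the Stevens exponent x, being the slope of a straight line forced through a curved log-log plot, comes out smaller, which is the same qualitative direction (y > x) reported in the abstract.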
