Computational Linguistics

On the Derivational Entropy of Left-to-Right Probabilistic Finite-State Automata and Hidden Markov Models


Abstract

Probabilistic finite-state automata are a formalism that is widely used in many problems of automatic speech recognition and natural language processing. Probabilistic finite-state automata are closely related to other finite-state models such as weighted finite-state automata, word lattices, and hidden Markov models, and therefore share many similar properties and problems. Entropy measures of finite-state models have been investigated in the past in order to study the information capacity of these models. The derivational entropy quantifies the uncertainty that the model has about the probability distribution it represents. The derivational entropy of a finite-state automaton is computed from the probability that is accumulated over all of its individual state sequences. Computing the entropy from a weighted finite-state automaton requires a normalized model. This article studies an efficient computation of the derivational entropy of left-to-right probabilistic finite-state automata, and it introduces an efficient algorithm for normalizing weighted finite-state automata. The efficient computation of the derivational entropy is also extended to continuous hidden Markov models.
