
Higher-order Derivatives of Weighted Finite-state Machines

Abstract

Weighted finite-state machines are a fundamental building block of NLP systems. They have withstood the test of time, from their early use in noisy channel models in the 1990s up to modern-day neurally parameterized conditional random fields. This work examines the computation of higher-order derivatives with respect to the normalization constant for weighted finite-state machines. We provide a general algorithm for evaluating derivatives of all orders, which has not been previously described in the literature. In the case of second-order derivatives, our scheme runs in the optimal O(A²N⁴) time where A is the alphabet size and N is the number of states. Our algorithm is significantly faster than prior algorithms. Additionally, our approach leads to a significantly faster algorithm for computing second-order expectations, such as covariance matrices and gradients of first-order expectations.
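To make the abstract's central objects concrete, the sketch below computes the normalization constant Z of a small cyclic weighted machine via the Kleene closure K = (I − Σₛ Wₛ)⁻¹ and checks an analytic Hessian of log Z against finite differences. Everything here (the toy machine, the names `theta`, `T`, `alpha`, `rho`, and the naive Hessian formula from differentiating the matrix inverse) is a hypothetical illustration, not the paper's algorithm, which reaches the stated O(A²N⁴) bound by a different, faster scheme.

```python
import numpy as np

# Hypothetical toy cyclic WFSA: N = 3 states, alphabet size A = 2.
# Arc weights are exp(theta[s]) on the arcs of 0/1 topology matrix T[s].
N = 3
T = [np.array([[0., 1., 0.],    # symbol 0: arcs 0->1, 1->2
               [0., 0., 1.],
               [0., 0., 0.]]),
     np.array([[0., 0., 1.],    # symbol 1: arcs 0->2, 1->0 (makes a cycle)
               [1., 0., 0.],
               [0., 0., 0.]])]
alpha = np.array([1., 0., 0.])  # start-weight vector
rho = np.array([0., 0., 1.])    # final-weight vector

def closure(theta):
    # Kleene closure K = (I - sum_s W_s)^{-1} sums weights over all paths.
    W = [np.exp(t) * Ts for t, Ts in zip(theta, T)]
    K = np.linalg.inv(np.eye(N) - sum(W))
    return W, K

def log_Z(theta):
    W, K = closure(theta)
    return np.log(alpha @ K @ rho)

def hess_log_Z(theta):
    # Analytic Hessian of log Z, obtained by differentiating the inverse:
    #   dZ/dth_s        = alpha K W_s K rho
    #   d2Z/dth_s dth_t = alpha K W_s K W_t K rho + alpha K W_t K W_s K rho
    #                     + [s == t] alpha K W_s K rho
    W, K = closure(theta)
    z = alpha @ K @ rho
    g = np.array([alpha @ K @ Ws @ K @ rho for Ws in W]) / z
    H = np.empty((2, 2))
    for s in range(2):
        for t in range(2):
            d2 = (alpha @ K @ W[s] @ K @ W[t] @ K @ rho
                  + alpha @ K @ W[t] @ K @ W[s] @ K @ rho)
            if s == t:
                d2 += alpha @ K @ W[s] @ K @ rho
            H[s, t] = d2 / z - g[s] * g[t]  # Hessian of log Z, not of Z
    return H

theta = np.array([-0.2, -0.7])

# Check against a central finite-difference Hessian of log Z.
eps = 1e-4
Hfd = np.empty((2, 2))
for i in range(2):
    for j in range(2):
        ei, ej = eps * np.eye(2)[i], eps * np.eye(2)[j]
        Hfd[i, j] = (log_Z(theta + ei + ej) - log_Z(theta + ei - ej)
                     - log_Z(theta - ei + ej) + log_Z(theta - ei - ej)) / (4 * eps**2)

print(np.allclose(hess_log_Z(theta), Hfd, atol=1e-5))  # → True
```

This also illustrates why the Hessian of log Z matters for second-order expectations: it equals the covariance matrix of arc-usage counts under the path distribution the machine defines.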

