Recurrent Neural Networks as Weighted Language Recognizers

Abstract

We investigate the computational complexity of various problems for simple recurrent neural networks (RNNs) as formal models for recognizing weighted languages. We focus on single-layer RNNs with ReLU activation, rational weights, and softmax output, which are commonly used in natural language processing applications. We show that most problems for such RNNs are undecidable, including consistency, equivalence, minimization, and the determination of the highest-weighted string. However, for consistent RNNs the last problem becomes decidable, although the length of the solution string can surpass all computable bounds. If, additionally, the string is limited to polynomial length, the problem becomes NP-complete. In summary, this shows that approximations and heuristic algorithms are necessary in practical applications of these RNNs.
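
The model class described in the abstract is concrete enough to sketch. Below is a minimal Python sketch of how a single-layer, ReLU-activation RNN with softmax output can assign a weight to a string; it assumes the common convention that the weight of a string is the product of the per-step softmax probabilities of its symbols followed by an end-of-string symbol. All names, shapes, and the weight convention here are illustrative, not taken from the paper.

    import numpy as np

    def string_weight(symbols, params, eos=0):
        # Weight of a string under a single-layer ReLU RNN with softmax
        # output: the product of the softmax probability of each symbol
        # in turn, followed by the probability of the end-of-string symbol.
        W_h, W_x, b, W_o, h = params  # recurrence, input, bias, output, initial state
        weight = 1.0
        for sym in list(symbols) + [eos]:
            logits = W_o @ h
            probs = np.exp(logits - logits.max())  # numerically stable softmax
            probs /= probs.sum()
            weight *= probs[sym]
            x = np.zeros(W_x.shape[1])
            x[sym] = 1.0                                # one-hot encoding of the symbol
            h = np.maximum(0.0, W_h @ h + W_x @ x + b)  # ReLU state update
        return weight

    # Illustrative usage with random parameters (hypothetical values).
    rng = np.random.default_rng(0)
    d, V = 4, 3  # hidden size, vocabulary size (symbol 0 reserved for EOS)
    params = (rng.normal(size=(d, d)), rng.normal(size=(d, V)),
              rng.normal(size=d), rng.normal(size=(V, d)), np.zeros(d))
    print(string_weight([1, 2, 1], params))  # a weight in (0, 1)

Under this convention, the consistency problem mentioned in the abstract asks, roughly, whether such weights sum to 1 over all finite strings, i.e. whether the RNN defines a probability distribution over strings.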
