JMLR: Workshop and Conference Proceedings

Programming with a Differentiable Forth Interpreter


Abstract

Given that in practice training data is scarce for all but a small set of problems, a core question is how to incorporate prior knowledge into a model. In this paper, we consider the case of prior procedural knowledge for neural networks, such as knowing how a program should traverse a sequence, but not what local actions should be performed at each step. To this end, we present an end-to-end differentiable interpreter for the programming language Forth which enables programmers to write program sketches with slots that can be filled with behaviour trained from program input-output data. We can optimise this behaviour directly through gradient descent techniques on user-specified objectives, and also integrate the program into any larger neural computation graph. We show empirically that our interpreter is able to effectively leverage different levels of prior program structure and learn complex behaviours such as sequence sorting and addition. When connected to outputs of an LSTM and trained jointly, our interpreter achieves state-of-the-art accuracy for end-to-end reasoning about quantities expressed in natural language stories.
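As a toy illustration of the sketch-with-slots idea (a minimal sketch under assumed simplifications, not the paper's actual interpreter), a "slot" can be modeled as a softmax-weighted mixture over a few candidate operations; the mixture weights are then trained by gradient descent on program input-output pairs until the slot commits to the behaviour the data demands:

```python
import numpy as np

# Candidate operations a sketch slot may soft-select among (an assumed toy op set).
ops = [lambda a, b: a + b, lambda a, b: a - b, lambda a, b: a * b]

def slot(logits, a, b):
    """Differentiable slot: softmax-weighted mixture of candidate ops."""
    w = np.exp(logits - logits.max())
    w /= w.sum()
    vals = np.array([op(a, b) for op in ops])
    return float(w @ vals), w, vals

# Input-output data generated by the target behaviour (here: addition),
# standing in for the program traces the slot is trained from.
rng = np.random.default_rng(0)
data = [(a, b, a + b) for a, b in rng.uniform(-1.0, 1.0, size=(64, 2))]

logits = np.zeros(len(ops))
lr = 0.5
for _ in range(200):
    grad = np.zeros_like(logits)
    for a, b, y in data:
        pred, w, vals = slot(logits, a, b)
        # d(pred)/d(logits_j) = w_j * (vals_j - pred), the softmax-mixture Jacobian,
        # so dL/d(logits_j) = 2 * (pred - y) * w_j * (vals_j - pred) for squared loss.
        grad += 2.0 * (pred - y) * w * (vals - pred)
    logits -= lr * grad / len(data)

print(int(np.argmax(logits)))  # → 0, i.e. the slot has learned addition
```

The interpreter in the paper works on the same principle at larger scale: Forth's stack and control flow are made continuous so that gradients from a loss on program outputs reach the slot parameters, and the slot can equally be driven by an upstream network such as an LSTM.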