Taking laws out of trained neural networks

Abstract

In this paper, the problem of discovering numeric laws governing a trained neural network is considered. We propose new multilayer perceptrons implementing fractional rational functions, i.e. functions expressed as a ratio of two polynomials of arbitrary order with a given number of terms in the numerator and denominator. Our networks can be used not only to implement such functions but also to extract the knowledge embedded in the trained network; this extraction is performed during the training process. The extracted laws underlying the network's operation are expressed in symbolic, fractional-rational-function form, and the networks provide the values of the function parameters. The extraction ability results from applying suitable activation functions in the different perceptron layers, namely functions of the exp(·), ln(·), (·)⁻¹ and/or (·)² types. Both theoretical considerations and simulation results are presented to illustrate the properties of the proposed networks.

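The abstract does not spell out the exact layer arrangement, but the ln(·), exp(·) and (·)⁻¹ activations suggest one natural reading: take logarithms of the (positive) inputs, form monomials with an exponential layer, sum them into numerator and denominator polynomials, and combine the two through an inverse activation. The NumPy sketch below illustrates this reading only; the function name, argument names and example parameters are hypothetical and not taken from the paper.

```python
import numpy as np

def rational_net_forward(x, num_exponents, num_coeffs, den_exponents, den_coeffs):
    """Sketch of a forward pass for a fractional-rational-function network.

    Each monomial a_k * prod_i x_i^{p_ki} is realized with ln/linear/exp
    layers as exp( sum_i p_ki * ln(x_i) ).  Summing monomials gives the
    numerator P(x) and denominator Q(x); a (.)^{-1} activation on Q and a
    final product yield f(x) = P(x) / Q(x).

    x            : (n_inputs,) positive inputs (ln(.) requires x > 0)
    *_exponents  : (n_terms, n_inputs) exponent matrices (hidden-layer weights)
    *_coeffs     : (n_terms,) term coefficients (output-layer weights)
    """
    z = np.log(x)                              # ln(.) layer
    monomials_num = np.exp(num_exponents @ z)  # exp(.) layer -> monomials of P
    monomials_den = np.exp(den_exponents @ z)  # exp(.) layer -> monomials of Q
    P = num_coeffs @ monomials_num             # linear sum -> P(x)
    Q = den_coeffs @ monomials_den             # linear sum -> Q(x)
    return P * Q**(-1)                         # (.)^{-1} activation + product

# Hypothetical example: f(x1, x2) = (2*x1^2 + x2) / (1 + x1*x2)
x = np.array([1.5, 0.5])
f = rational_net_forward(
    x,
    num_exponents=np.array([[2.0, 0.0], [0.0, 1.0]]), num_coeffs=np.array([2.0, 1.0]),
    den_exponents=np.array([[0.0, 0.0], [1.0, 1.0]]), den_coeffs=np.array([1.0, 1.0]),
)
print(f)  # (2*1.5**2 + 0.5) / (1 + 1.5*0.5) = 5.0 / 1.75 ~= 2.857
```

In such a construction the exponent and coefficient matrices are ordinary trainable weights, which is what would allow the fitted symbolic law to be read directly off the trained network.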