
Non-Uniform Regularization in Artificial Neural Networks for Adaptable Scaling


Abstract

A system for flexible regularization and adaptable scaling of an artificial neural network is provided. The system includes: a memory to store an artificial neural network and training data; a processor and an interface to submit signals and training data into the neural network, which has a sequence of layers, each layer comprising a set of neuron nodes, wherein pairs of nodes in neighboring layers are mutually connected by a plurality of trainable parameters that pass signals from the previous layer to the next layer; a random number generator to modify the output signal of each neuron node for regularization in a stochastic manner, following a multi-dimensional distribution across the layer-depth and node-width directions of the neural network, wherein at least one layer has a non-identical profile across its neuron nodes; a training operator to update the neural network parameters using the training data such that the output of the neural network attains better values on a plurality of objective functions; and an adaptive truncator to prune the outputs of neuron nodes at each layer into a compressed network size, reducing computational complexity on the fly in the downstream testing phase for any new incoming data.
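The sketch below is not the patented implementation; it is a minimal illustration, assuming a PyTorch setup, of the two ideas named in the abstract: a dropout variant whose drop probability differs per neuron node and per layer (non-uniform stochastic regularization across the width and depth directions), and a test-time truncation that keeps only the first k hidden units of each layer (adaptable scaling on the fly). All names here (NonUniformDropout, ScalableMLP, the linear drop-rate profile, the parameter k) are illustrative assumptions, not taken from the patent.

```python
from typing import Optional

import torch
import torch.nn as nn


class NonUniformDropout(nn.Module):
    """Dropout whose drop probability varies across the node-width dimension."""

    def __init__(self, width: int, p_min: float = 0.0, p_max: float = 0.5):
        super().__init__()
        # Example profile: the drop rate rises linearly from the first unit to
        # the last, so later units are regularized more aggressively. Giving
        # each layer its own (p_min, p_max) makes the profile vary with depth.
        self.register_buffer("p", torch.linspace(p_min, p_max, width))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # no stochastic masking at test time
        keep = 1.0 - self.p
        mask = torch.bernoulli(keep.expand_as(x))
        return x * mask / keep  # inverted-dropout scaling preserves the expectation


class ScalableMLP(nn.Module):
    def __init__(self, d_in: int = 16, width: int = 64, d_out: int = 10):
        super().__init__()
        self.fc1 = nn.Linear(d_in, width)
        self.drop1 = NonUniformDropout(width, 0.0, 0.5)  # profile for layer 1
        self.fc2 = nn.Linear(width, width)
        self.drop2 = NonUniformDropout(width, 0.1, 0.7)  # different profile for layer 2
        self.out = nn.Linear(width, d_out)

    def forward(self, x: torch.Tensor, k: Optional[int] = None) -> torch.Tensor:
        # k: optional truncation width for the testing phase; hidden units
        # beyond index k are zeroed, shrinking the effective network on the fly.
        h = torch.relu(self.fc1(x))
        h = self.drop1(h)
        if k is not None:
            h = torch.cat([h[:, :k], torch.zeros_like(h[:, k:])], dim=1)
        h = torch.relu(self.fc2(h))
        h = self.drop2(h)
        if k is not None:
            h = torch.cat([h[:, :k], torch.zeros_like(h[:, k:])], dim=1)
        return self.out(h)


if __name__ == "__main__":
    model = ScalableMLP().eval()
    x = torch.randn(4, 16)
    full = model(x)          # full-width inference
    small = model(x, k=32)   # truncated inference at half the hidden width
    print(full.shape, small.shape)
```

Because units that are dropped more often during training contribute less to the trained function, truncating the most heavily regularized units first is one plausible way such a network could be compressed at test time without retraining; the actual ordering and pruning criterion in the patent may differ.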
