SELF-PRUNING NEURAL NETWORKS FOR WEIGHT PARAMETER REDUCTION

Abstract

A technique to prune weights of a neural network using an analytic threshold function h(w) provides a neural network having weights that have been optimally pruned. The neural network includes a plurality of layers in which each layer includes a set of weights w associated with the layer that enhance a speed performance of the neural network, an accuracy of the neural network, or a combination thereof. Each set of weights is based on a cost function C that has been minimized by back-propagating an output of the neural network in response to input training data. The cost function C is also minimized based on a derivative of the cost function C with respect to a first parameter of the analytic threshold function h(w) and on a derivative of the cost function C with respect to a second parameter of the analytic threshold function h(w).
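The abstract describes gating each weight through a smooth (analytic) threshold function h(w) so that the cost C stays differentiable in the gate's two parameters, letting them be learned by backpropagation alongside the weights. The patent abstract does not give the exact form of h(w); the sketch below assumes a sigmoid gate with an illustrative threshold parameter t and sharpness parameter s (both names hypothetical, standing in for the patent's "first" and "second" parameters):

```python
import numpy as np

def gate(w, t, s):
    # Analytic threshold h(w): near 0 when |w| is far below t,
    # near 1 when |w| is far above t. The sigmoid form is an
    # assumption; t and s are the two learnable gate parameters.
    return 1.0 / (1.0 + np.exp(-s * (np.abs(w) - t)))

def effective_weights(w, t, s):
    # "Self-pruned" weights: small weights are pushed smoothly
    # toward zero instead of being hard-thresholded.
    return w * gate(w, t, s)

def cost(w, t, s, target):
    # Toy quadratic cost C over the gated weights.
    d = effective_weights(w, t, s) - target
    return 0.5 * float(np.sum(d * d))

def dcost_dt(w, t, s, target):
    # Analytic dC/dt: because h is smooth, C is differentiable in
    # the gate parameter, so the threshold itself can be trained
    # by gradient descent (here dh/dt = -s * h * (1 - h)).
    g = gate(w, t, s)
    dwe_dt = w * (-s) * g * (1.0 - g)
    return float(np.sum((effective_weights(w, t, s) - target) * dwe_dt))

w = np.array([0.01, -0.02, 0.5, -0.8])   # one layer's weights
t, s = 0.1, 100.0
w_eff = effective_weights(w, t, s)       # small weights gated out, large kept
```

The same construction extends to dC/ds for the second gate parameter; a finite-difference check confirms the analytic derivative, which is the property that lets pruning thresholds be minimized jointly with the cost function rather than tuned by hand.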

Bibliographic Data

  • Publication number: US2022129756A1
  • Patent type:
  • Publication date: 2022-04-28
  • Original format: PDF
  • Applicant/Assignee: SAMSUNG ELECTRONICS CO. LTD.
  • Application number: US202217572625
  • Inventors: WEIRAN DENG; GEORGIOS GEORGIADIS
  • Filing date: 2022-01-10
  • Classification: G06N3/08; G06N3/04
  • Country: US
  • Added to database: 2022-08-25 00:45:59
