Distilling from Ensembles to Improve Reproducibility of Neural Networks


Abstract

Systems and methods can improve the reproducibility of neural networks by distilling from ensembles. In particular, aspects of the present disclosure are directed to a training scheme that utilizes a combination of an ensemble of neural networks and a single, "wide" neural network that is more powerful (e.g., exhibits greater accuracy) than the ensemble. Specifically, the output of the ensemble can be distilled into the single neural network during training of the single neural network. After training, the single neural network can be deployed to generate inferences. In such fashion, the single neural network can provide superior prediction accuracy while, during training, the ensemble serves to influence the single neural network to be more reproducible. In addition, a further single wide tower can be added to generate another output that can be distilled into the single neural network to further improve its accuracy.
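
The sketch below illustrates the general idea described in the abstract: the averaged soft output of an ensemble of teacher networks is distilled into a single wider student network during training. It is a minimal, PyTorch-style sketch only; the ensemble size, layer widths, temperature, and loss weighting (alpha) are illustrative assumptions, not values or details taken from the patent.

```python
# Minimal sketch (not the patented implementation): distill an ensemble's
# averaged soft predictions into a single "wide" student network during training.
# Ensemble size, widths, temperature, and alpha are assumed for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_mlp(width: int, in_dim: int = 32, num_classes: int = 10) -> nn.Sequential:
    """Small MLP classifier; `width` controls the hidden-layer size."""
    return nn.Sequential(
        nn.Linear(in_dim, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
        nn.Linear(width, num_classes),
    )


# Ensemble of narrower teacher networks and a single wider student.
ensemble = [make_mlp(width=64) for _ in range(4)]
student = make_mlp(width=512)

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
alpha, temperature = 0.5, 2.0  # assumed distillation hyperparameters


def distillation_step(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """One training step: hard-label loss plus KL toward the ensemble average."""
    with torch.no_grad():
        # Average the ensemble members' softened probabilities as the teacher target.
        teacher_probs = torch.stack(
            [F.softmax(m(x) / temperature, dim=-1) for m in ensemble]
        ).mean(dim=0)

    student_logits = student(x)
    hard_loss = F.cross_entropy(student_logits, y)
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        teacher_probs,
        reduction="batchmean",
    ) * temperature ** 2

    loss = (1 - alpha) * hard_loss + alpha * soft_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss
```

After training, only the student network would be deployed for inference, with the ensemble serving purely as a training-time teacher that pushes the student toward more reproducible predictions.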

Bibliographic data

  • Publication/announcement number: US2021158156A1

  • Patent type:

  • Publication/announcement date: 2021-05-27

  • Original document format: PDF

  • Applicant/assignee: GOOGLE LLC

  • Application/patent number: US202017025418

  • Inventors: GIL SHAMIR; LORENZO COVIELLO

  • Filing date: 2020-09-18

  • Classification: G06N3/08; G06N3/04

  • Country: US

  • Database entry time: 2022-08-24 18:55:08
