LARGE MODEL EMULATION BY KNOWLEDGE DISTILLATION BASED NAS


Abstract

Described herein is a machine learning mechanism implemented by one or more computers (506), the mechanism having access to a base neural network (104, 301, 302) and being configured to determine a simplified neural network (105, 303) by iteratively performing the following set of steps: forming (201) sample data by sampling the architecture of a current candidate neural network; selecting (202), in dependence on the sample data, an architecture for a second candidate neural network; forming (203) a trained candidate neural network by training the second candidate neural network, wherein the training of the second candidate neural network comprises applying feedback to the second candidate neural network in dependence on a comparison of the behaviours of the second candidate neural network and the base neural network (104, 301, 302); and adopting (204) the trained candidate neural network as the current candidate neural network for a subsequent iteration of the set of steps. This may allow a candidate neural network to be trained that can emulate a larger base network.
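The iterative set of steps in the abstract (sample the current candidate's architecture, select a second candidate from the sample, train it against the base network's behaviour, adopt the result for the next iteration) can be illustrated with a toy sketch. Everything below is an illustrative assumption, not the patent's implementation: the "base network" is a fixed quadratic, architectures are lists of layer widths, and the distillation signal is a plain squared-error comparison of outputs.

```python
import random

random.seed(0)

def base_network(x):
    # Stand-in for the large base network (104, 301, 302): a fixed quadratic.
    return 3.0 * x * x + 2.0 * x + 1.0

def sample_architecture(current):
    # Step 201: form sample data by sampling the current candidate's
    # architecture. Here the "sample" is simply a copy of its layer widths.
    return list(current)

def select_architecture(sample):
    # Step 202: select a second candidate in dependence on the sample data,
    # here by randomly widening or narrowing one layer.
    arch = list(sample)
    i = random.randrange(len(arch))
    arch[i] = max(1, arch[i] + random.choice([-1, 1]))
    return arch

def train_with_distillation(arch, steps=200, lr=0.01):
    # Step 203: train the candidate so its outputs match the base network's
    # behaviour (a crude knowledge-distillation signal). The candidate is a
    # toy polynomial whose capacity grows with the total layer width.
    coeffs = [0.0] * min(3, sum(arch))
    xs = [i / 10.0 for i in range(-10, 11)]
    for _ in range(steps):
        for x in xs:
            pred = sum(c * x ** k for k, c in enumerate(coeffs))
            err = pred - base_network(x)        # compare behaviours
            for k in range(len(coeffs)):
                coeffs[k] -= lr * err * x ** k  # feedback to the candidate
    loss = sum((sum(c * x ** k for k, c in enumerate(coeffs))
                - base_network(x)) ** 2 for x in xs) / len(xs)
    return coeffs, loss

current = [1, 1]                 # initial candidate architecture
best_arch, best_loss = current, float("inf")
for _ in range(5):               # iterate the set of steps
    sample = sample_architecture(current)
    candidate = select_architecture(sample)
    _, loss = train_with_distillation(candidate)
    current = candidate          # step 204: adopt for the next iteration
    if loss < best_loss:
        best_arch, best_loss = candidate, loss

print(best_arch, round(best_loss, 4))
```

The sketch keeps the control flow of the claimed mechanism while replacing neural networks with polynomials so it runs without any ML framework; in practice steps 202 and 203 would be a NAS controller and a distillation training loop respectively.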

