
Knowledge Distillation and Gradient Pruning-Based Compression of Artificial Intelligence-Based Base Caller

Abstract

The technology disclosed compresses a larger teacher base caller into a smaller student base caller. The student base caller has fewer processing modules and parameters than the teacher base caller. The teacher base caller is trained using hard labels (e.g., one-hot encodings). The trained teacher base caller is then used during the inference phase to generate soft labels as output probabilities, and these soft labels are used to train the student base caller.
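The training scheme described is standard knowledge distillation. Below is a minimal PyTorch sketch of that scheme, assuming a four-class base-calling output (A, C, G, T); the network architectures, temperature value, and optimizer settings are illustrative assumptions and are not taken from the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins for the teacher and student base callers.
# The patent does not specify these architectures; the only property
# illustrated is that the student has fewer modules and parameters.
teacher = nn.Sequential(nn.Linear(128, 256), nn.ReLU(),
                        nn.Linear(256, 256), nn.ReLU(),
                        nn.Linear(256, 4))   # 4 classes: A, C, G, T
student = nn.Sequential(nn.Linear(128, 32), nn.ReLU(),
                        nn.Linear(32, 4))

T = 4.0  # softening temperature (assumed value)

def distill_step(x, optimizer):
    """One student-training step on teacher-generated soft labels."""
    teacher.eval()
    with torch.no_grad():  # teacher runs in inference mode only
        soft_labels = F.softmax(teacher(x) / T, dim=-1)
    student_log_probs = F.log_softmax(student(x) / T, dim=-1)
    # KL divergence between the teacher's soft labels and the student's
    # predictions, scaled by T^2 as in the usual distillation formulation.
    loss = F.kl_div(student_log_probs, soft_labels,
                    reduction="batchmean") * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(16, 128)  # dummy batch of input features
print(distill_step(x, optimizer))
```

A temperature above 1 spreads the teacher's output probabilities so the soft labels carry inter-class similarity information that one-hot hard labels do not; this is what lets the smaller student approximate the teacher's behavior.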
