Knowledge Distillation and Gradient Pruning-Based Compression of Artificial Intelligence-Based Base Caller
Abstract
The technology disclosed compresses a larger, teacher base caller into a smaller, student base caller. The student base caller has fewer processing modules and parameters than the teacher base caller. The teacher base caller is trained using hard labels (e.g., one-hot encodings). During inference, the trained teacher base caller's output probabilities are used to generate soft labels, and these soft labels are then used to train the student base caller.
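The abstract describes the standard knowledge-distillation setup: the teacher's output probabilities serve as soft labels for training the student. The patent does not specify the exact loss, so the sketch below is a hedged illustration using the common formulation of a temperature-softened cross-entropy against the teacher's soft labels, optionally blended with the hard-label cross-entropy; all function names, the `temperature`, and the `alpha` blending weight are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative confidence across the base classes (e.g., A, C, G, T).
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      temperature=2.0, alpha=0.5):
    """Blend of soft-label (teacher) and hard-label (one-hot) cross-entropy.

    Illustrative only: the patent states that soft labels train the student
    but does not disclose this particular loss.
    """
    # Soft targets: the teacher's output probabilities at raised temperature.
    soft_targets = softmax(teacher_logits, temperature)
    soft_preds = softmax(student_logits, temperature)
    # Cross-entropy of the student against the teacher's soft labels.
    soft_loss = -np.mean(
        np.sum(soft_targets * np.log(soft_preds + 1e-12), axis=-1))
    # Standard cross-entropy against the one-hot (hard) labels.
    hard_preds = softmax(student_logits)
    hard_loss = -np.mean(np.log(
        hard_preds[np.arange(len(hard_labels)), hard_labels] + 1e-12))
    # alpha weights the distillation term; T^2 rescales its gradient magnitude.
    return alpha * (temperature ** 2) * soft_loss + (1 - alpha) * hard_loss
```

By Gibbs' inequality, the soft-label term is minimized when the student's distribution matches the teacher's, which is exactly the behavior the distillation step relies on.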