DECENTRALISED ARTIFICIAL INTELLIGENCE (AI)/MACHINE LEARNING TRAINING SYSTEM

Abstract

A decentralized training platform is described for training an Artificial Intelligence (AI) model where the training data (e.g., medical images) is distributed across multiple sites (nodes) and, for confidentiality, legal, or other reasons, the data at each site cannot be shared or leave the site, and so cannot be copied to a central location for training. The method comprises training a teacher model locally at each node, moving each of the trained teacher models to a central node, and using them to train a student model on a transfer dataset. This may be facilitated by setting up the cloud service with inter-region peering connections between the nodes so that the nodes appear as a single cluster. In one variation the student model may be trained at each node using the multiple trained teacher models. In another variation multiple student models are trained, each student model being trained at a node by the teacher model that was trained at that node; once the plurality of student models are trained, an ensemble model is generated from them. Loss function weighting and node undersampling for load balancing may be used to improve accuracy and time/cost efficiency.
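To make the flow described in the abstract concrete, the sketch below (PyTorch, with toy models, synthetic data, and hyperparameters that are illustrative assumptions, not taken from the patent) shows the two core steps: each node trains a teacher on data that never leaves the node, and the central node then distils the collected teachers into a single student using only a shareable transfer dataset.

```python
# Minimal sketch of the teacher/student distillation flow; architectures,
# dataset shapes, and hyperparameters are placeholders, not from the patent.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

def make_model():
    # Placeholder classifier; the patent targets e.g. medical-image models.
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 3))

def train_teacher(local_data, local_labels, epochs=5):
    """Step 1: train a teacher locally at a node; raw data never leaves the node."""
    teacher = make_model()
    opt = torch.optim.Adam(teacher.parameters(), lr=1e-3)
    loader = DataLoader(TensorDataset(local_data, local_labels),
                        batch_size=16, shuffle=True)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(teacher(x), y).backward()
            opt.step()
    return teacher

def train_student(teachers, transfer_data, epochs=5, temperature=2.0):
    """Step 2: at the central node, distil the collected teachers into one
    student using only a shareable transfer dataset (no node-local data)."""
    student = make_model()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    loader = DataLoader(TensorDataset(transfer_data), batch_size=16, shuffle=True)
    for _ in range(epochs):
        for (x,) in loader:
            with torch.no_grad():
                # Average the teachers' softened predictions as the target.
                target = torch.stack(
                    [F.softmax(t(x) / temperature, dim=1) for t in teachers]
                ).mean(dim=0)
            loss = F.kl_div(F.log_softmax(student(x) / temperature, dim=1),
                            target, reduction="batchmean")
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

# Synthetic data standing in for three sites plus a transfer set.
nodes = [(torch.randn(128, 32), torch.randint(0, 3, (128,))) for _ in range(3)]
teachers = [train_teacher(x, y) for x, y in nodes]
student = train_student(teachers, torch.randn(256, 32))
```

The uniform average over teacher outputs is one standard distillation choice used here for simplicity; the loss-function weighting and node-undersampling refinements mentioned in the abstract would replace the plain mean and the per-node training schedule, respectively.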
