2nd Workshop on Semantic Deep Learning 2017

Class disjointness constraints as specific objective functions in neural network classifiers


Abstract

The increasing performance of deep learning techniques on computer vision tasks such as object detection has led to systems able to detect a large number of object classes. Most deep learning models use simple, unstructured labels and assume that any domain knowledge will be learned from the data. However, when the domain is complex and the data limited, it may be useful to rely on domain knowledge encoded in an ontology to guide the learning process. In this paper, we conduct experiments that introduce constraints into the training process of a neural network. We show that explicitly modeling a disjointness axiom over a set of classes as a specific objective function reduces violations of this constraint while also reducing the overall classification error. This opens a way to import domain knowledge modeled in an ontology into a deep learning process.
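A minimal sketch of how such a disjointness axiom could be expressed as an additional objective term, assuming a PyTorch classifier: the class-index pairs, the penalty weight LAMBDA, and the product-of-probabilities form of the penalty are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: cross-entropy augmented with a penalty that discourages
# the network from assigning high probability to two classes declared disjoint
# in an ontology. Class indices and the weight are assumptions for illustration.
import torch
import torch.nn.functional as F

DISJOINT_PAIRS = [(3, 9), (2, 7)]   # assumed pairs of disjoint class indices
LAMBDA = 0.1                        # assumed weight of the constraint term


def disjointness_penalty(logits, pairs):
    """Penalize simultaneous confidence in two disjoint classes.

    The penalty is the product of the softmax probabilities of the two
    classes in each pair, summed over pairs and averaged over the batch.
    """
    probs = F.softmax(logits, dim=1)
    penalty = torch.zeros((), device=logits.device)
    for a, b in pairs:
        penalty = penalty + (probs[:, a] * probs[:, b]).mean()
    return penalty


def constrained_loss(logits, targets):
    """Standard classification loss plus the disjointness term."""
    ce = F.cross_entropy(logits, targets)
    return ce + LAMBDA * disjointness_penalty(logits, DISJOINT_PAIRS)
```

Minimizing the extra term drives the joint probability of each disjoint pair toward zero, which is one simple way to reduce violations of the constraint during training.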

