IEEE/CVF Conference on Computer Vision and Pattern Recognition

Structured Set Matching Networks for One-Shot Part Labeling


Abstract

Diagrams often depict complex phenomena and serve as a good test bed for visual and textual reasoning. However, understanding diagrams using natural image understanding approaches requires large training datasets of diagrams, which are very hard to obtain. Instead, this can be addressed as a matching problem either between labeled diagrams, images or both. This problem is very challenging since the absence of significant color and texture renders local cues ambiguous and requires global reasoning. We consider the problem of one-shot part labeling: labeling multiple parts of an object in a target image given only a single source image of that category. For this set-to-set matching problem, we introduce the Structured Set Matching Network (SSMN), a structured prediction model that incorporates convolutional neural networks. The SSMN is trained using global normalization to maximize local match scores between corresponding elements and a global consistency score among all matched elements, while also enforcing a matching constraint between the two sets. The SSMN significantly outperforms several strong baselines on three label transfer scenarios: diagram-to-diagram, evaluated on a new diagram dataset of over 200 categories; image-to-image, evaluated on a dataset built on top of the Pascal Part Dataset; and image-to-diagram, evaluated on transferring labels across these datasets.
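The abstract describes the SSMN objective only at a high level: local match scores between corresponding parts, a global consistency score over all matched parts, and a one-to-one matching constraint between the two sets. As a minimal, non-authoritative sketch of that set-to-set structure, the Python below scores parts with a hypothetical local term (cosine similarity of part embeddings) and a toy pairwise-layout consistency term, and enforces the matching constraint with a Hungarian solver. None of the function names or score forms are taken from the paper, which trains these scores jointly with global normalization rather than using fixed similarities.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def local_scores(src_feats, tgt_feats):
    """Cosine similarity between every source part and every target part.

    src_feats: (n, d) embeddings of the labeled source parts
               (stand-in for the paper's CNN features).
    tgt_feats: (n, d) embeddings of the candidate target parts.
    Returns an (n, n) matrix of local match scores.
    """
    src = src_feats / np.linalg.norm(src_feats, axis=1, keepdims=True)
    tgt = tgt_feats / np.linalg.norm(tgt_feats, axis=1, keepdims=True)
    return src @ tgt.T


def consistency_score(src_xy, tgt_xy, assignment):
    """Toy global term: matched parts should preserve relative layout.

    Compares pairwise offset directions among source part locations with
    the offsets among their assigned target parts. This is only a
    stand-in for the SSMN's learned global consistency score.
    """
    perm_tgt = tgt_xy[assignment]
    src_off = src_xy[:, None, :] - src_xy[None, :, :]
    tgt_off = perm_tgt[:, None, :] - perm_tgt[None, :, :]
    num = (src_off * tgt_off).sum(-1)
    den = (np.linalg.norm(src_off, axis=-1)
           * np.linalg.norm(tgt_off, axis=-1) + 1e-8)
    return float((num / den).mean())


def match_parts(src_feats, tgt_feats, src_xy, tgt_xy):
    """One-to-one part labeling: pick the assignment maximizing local
    scores, then report the global consistency of that assignment."""
    S = local_scores(src_feats, tgt_feats)
    rows, cols = linear_sum_assignment(-S)  # Hungarian solver, maximize S
    return cols, float(S[rows, cols].sum()), consistency_score(src_xy, tgt_xy, cols)
```

Given n parts, each with a d-dimensional embedding and a 2-D location, `match_parts` returns an index array transferring each source part label to a target part, plus the two score components. The sketch only illustrates inference under a hard matching constraint; in the SSMN both score functions are learned end-to-end and normalized globally over assignments.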
