IEEE Transactions on Circuits and Systems for Video Technology

An Object-Oriented Visual Saliency Detection Framework Based on Sparse Coding Representations

Abstract

Saliency detection aims at quantitatively predicting attended locations in an image. It may mimic the selection mechanism of the human vision system, which processes only a small subset of a massive amount of visual input while ignoring the redundant information. Motivated by the biological evidence that the receptive fields of simple cells in V1 of the vision system are similar to sparse codes learned from natural images, this paper proposes a novel framework for saliency detection that uses image sparse coding representations as features. Unlike many previous approaches dedicated to examining the local or global contrast of each individual location, this paper develops a probabilistic computational algorithm by integrating objectness likelihood with appearance rarity. In the proposed framework, image sparse coding representations are obtained by learning on a large number of eye-fixation patches from an eye-tracking dataset. The objectness likelihood is measured by three generic cues: compactness, continuity, and center bias. The appearance rarity is inferred by using a Gaussian mixture model. The proposed framework can serve as a basis for many techniques such as image/video segmentation, retrieval, retargeting, and compression. Extensive evaluations on benchmark databases and comparisons with a number of up-to-date algorithms demonstrate its effectiveness.
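
The abstract describes a pipeline of sparse-coding features, an appearance-rarity term modeled by a Gaussian mixture, and objectness cues including a center-bias prior. The sketch below is a minimal illustration of that pipeline, not the authors' implementation: the use of scikit-learn's DictionaryLearning and GaussianMixture, all parameter values, the patch-grid layout, and the simple multiplicative combination of rarity with a center-bias prior (standing in for the full probabilistic integration of compactness, continuity, and center bias) are assumptions made here for illustration only.

```python
# Illustrative sketch of the described pipeline; NOT the paper's implementation.
# Library choices and parameters are assumptions.
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.mixture import GaussianMixture


def learn_sparse_dictionary(fixation_patches, n_atoms=64):
    """Learn a sparse-code dictionary from eye-fixation patches
    (fixation_patches: array of shape [n_patches, patch_dim])."""
    dico = DictionaryLearning(n_components=n_atoms,
                              transform_algorithm="lasso_lars")
    dico.fit(fixation_patches)
    return dico


def appearance_rarity(sparse_codes, n_components=5):
    """Fit a GMM to the sparse codes of one image; rarity is the
    negative log-likelihood of each code under the model."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    gmm.fit(sparse_codes)
    return -gmm.score_samples(sparse_codes)  # higher value = rarer appearance


def center_bias(height, width, sigma=0.3):
    """A simple Gaussian center-bias prior (one of the three objectness cues)."""
    ys, xs = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    d2 = ((ys - cy) / height) ** 2 + ((xs - cx) / width) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))


def saliency_map(image_patches, patch_grid_shape, dico):
    """Combine appearance rarity with the center-bias prior multiplicatively,
    as a stand-in for the full probabilistic integration of objectness and
    rarity described in the abstract."""
    codes = dico.transform(image_patches)          # sparse-coding features
    rarity = appearance_rarity(codes).reshape(patch_grid_shape)
    prior = center_bias(*patch_grid_shape)
    s = rarity * prior
    return (s - s.min()) / (s.max() - s.min() + 1e-12)
```

In this reading of the abstract, the dictionary would be learned offline from eye-fixation patches of an eye-tracking dataset, while the rarity model is fit per image; the compactness and continuity cues are omitted from the sketch for brevity.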
