
Learning contour statistics from natural images.

Abstract

Vision is one of our most important senses, and accordingly, a large fraction of our cerebral cortex is devoted to visual processing. One of the key computations in the early stages of the visual system is the extraction of object contours, since the occluding boundaries and orientation discontinuities that make up the "line drawing" of a scene are the most important and direct cues to object shape. Contour detection in natural scenes has proven to be a difficult technical problem, however, mainly because the existence (or not) of a contour at a given location, orientation, and scale depends, probabilistically, on a large number of cues covering a large fraction of the visual field. Worse, the cues interact with each other in a multitude of ways and combinations, leading to an enormously complex cue-integration problem.

In this work, we attempt to break the contour-extraction problem down into natural, tractable, modular subcomputations. In chapter 2, we describe a novel approach to combining local edge cues from the area generally orthogonal to the orientation of the contour. The key aspects of the approach are (1) tabulating and modeling contour statistics at fixed values of local edge contrast, to reduce higher-order dependencies within the population of local edge cues, and (2) picking the most informative contour cues from the de-correlated local edge population. The resulting contour operator has no free parameters and, compared to the raw local filter values, has significantly improved localization and sharpened orientation tuning. In chapter 3, we describe a novel approach to combining cues from the area generally tangential to the contour. In this case, we have developed a method to efficiently gather the contour statistics needed to optimally exploit "contextual" cues from aligned high-resolution flankers and superimposed coarse-scale edges.
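The contrast-conditioned tabulation described for chapter 2 can be sketched roughly as follows. This is an illustrative reconstruction, not the thesis's actual procedure: the function name, binning scheme, and smoothing are all assumptions. The idea it demonstrates is estimating P(contour | cue value) separately within each band of local edge contrast, so that contrast-driven dependencies are factored out before the most informative cues are selected.

```python
import numpy as np

def contour_likelihood_table(cue_responses, contour_labels, contrasts,
                             n_cue_bins=16, n_contrast_bins=4):
    """Tabulate P(contour | cue value) at fixed local edge contrast
    (hypothetical sketch of the contrast-conditioning idea).

    cue_responses  : 1-D array of local filter responses
    contour_labels : 1-D array of 0/1 ground-truth contour labels
    contrasts      : 1-D array of local edge contrast at each sample
    """
    # Quantile bin edges so every bin holds roughly equal mass.
    cue_edges = np.quantile(cue_responses, np.linspace(0, 1, n_cue_bins + 1))
    con_edges = np.quantile(contrasts, np.linspace(0, 1, n_contrast_bins + 1))
    cue_bins = np.clip(np.digitize(cue_responses, cue_edges[1:-1]),
                       0, n_cue_bins - 1)
    con_bins = np.clip(np.digitize(contrasts, con_edges[1:-1]),
                       0, n_contrast_bins - 1)

    # Joint counts of (contrast band, cue bin, label).
    counts = np.zeros((n_contrast_bins, n_cue_bins, 2))
    for cb, kb, y in zip(con_bins, cue_bins, contour_labels):
        counts[cb, kb, int(y)] += 1

    # Laplace-smoothed conditional probability of a contour in each cell.
    p_contour = (counts[..., 1] + 1) / (counts.sum(axis=-1) + 2)
    return p_contour, cue_edges, con_edges
```

Cues whose conditional tables deviate most from the contrast-only baseline would then be the "most informative" ones to retain.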
In the process, we have found evidence that the integration of these contextual cues across scales can be achieved by a cascade of simple two-input functions, greatly simplifying our statistics-driven approach. We found that the interaction between two collinear flankers resembles a minimum-like operation, that the interaction between the center response and the flankers is conjunctive/contextual, and that in the particular case of cross-scale interaction we examined there was minimal interaction. We generalized the approach developed for collinear cues to cues for curved contours. The resulting contextually boosted contour operator strongly emphasizes the spatially extended contours found in natural scenes, again with zero free parameters. We also describe a novel image-enhancement method, based on the Cornsweet illusion, that uses the contours obtained from the two methods above.
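The cascade of two-input functions can be illustrated with a minimal sketch. The exact functional forms in the thesis were fit from contour statistics; the closed-form rules below (`np.minimum` for the flanker pair, a multiplicative gain for the center) are assumptions chosen only to exhibit the qualitative behavior reported: flankers pair minimum-like, and flanker evidence boosts the center conjunctively, i.e. only where the center itself already responds.

```python
import numpy as np

def contextual_boost(center, flanker_a, flanker_b, eps=1e-6):
    """Cascade of simple 2-input functions (illustrative sketch):
    collinear flankers combine via a minimum-like rule, and the pooled
    flanker evidence then modulates the center response conjunctively,
    amplifying it without creating responses where the center is zero."""
    flank = np.minimum(flanker_a, flanker_b)      # minimum-like pairing
    gain = 1.0 + flank / (center + flank + eps)   # contextual gain, <= 2
    return center * gain                          # boosted contour response
```

Because the flanker term enters only as a gain on the center, strong flankers alone produce no output, matching the conjunctive/contextual character of the center-flanker interaction.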

Bibliographic information

  • Author affiliation: University of Southern California
  • Degree grantor: University of Southern California
  • Subjects: Engineering, Biomedical; Biology, Neuroscience
  • Degree: Ph.D.
  • Year: 2012
  • Pages: 77
  • Format: PDF
  • Language: English
