Sensors (Basel, Switzerland)

Reliable Fusion of Stereo Matching and Depth Sensor for High Quality Dense Depth Maps



Abstract

Depth estimation is a classical problem in computer vision that typically relies on either a depth sensor or stereo matching alone. The depth sensor provides real-time estimates in repetitive and textureless regions where stereo matching is ineffective, while stereo matching obtains more accurate results in richly textured regions and at object boundaries, where the depth sensor often fails. We fuse stereo matching and the depth sensor, exploiting their complementary characteristics to improve depth estimation. Texture information is incorporated as a constraint that restricts each pixel's range of potential disparities and reduces noise in repetitive and textureless regions. Furthermore, a novel pseudo-two-layer model represents the relationships between disparities at different pixels and segments; by treating information from the depth sensor as prior knowledge, it is more robust to luminance variation. Segmentation is treated as a soft constraint to reduce ambiguities caused by under- or over-segmentation. On the Middlebury datasets our method achieves an average error rate of 2.61%, compared with the 3.27% average of previous state-of-the-art methods, an improvement of almost 20% in accuracy over other fusion-based algorithms.
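The core fusion idea in the abstract, using the depth sensor's estimate as a prior that restricts each pixel's disparity search range, can be illustrated with a minimal sketch. This is not the paper's actual algorithm (which adds the texture constraint and the pseudo-two-layer segment model); it is a toy sum-of-absolute-differences block matcher in which the function name `fused_disparity` and all parameters (`radius`, `tol`, `max_disp`) are illustrative assumptions:

```python
import numpy as np

def fused_disparity(left, right, sensor_disp, radius=2, tol=3, max_disp=16):
    """Toy fusion sketch: for each pixel, search only disparities within
    `tol` of the depth sensor's estimate (the prior), and pick the one
    with the lowest SAD block-matching cost."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int64)
    pad = radius
    # Edge-pad so patches at the image border stay in bounds.
    L = np.pad(left, pad, mode="edge").astype(np.float64)
    R = np.pad(right, pad, mode="edge").astype(np.float64)
    for y in range(h):
        for x in range(w):
            prior = int(sensor_disp[y, x])
            lo = max(0, prior - tol)          # sensor prior narrows the
            hi = min(max_disp, prior + tol)   # per-pixel search window
            best_d, best_cost = lo, np.inf
            patch_l = L[y:y + 2 * pad + 1, x:x + 2 * pad + 1]
            for d in range(lo, hi + 1):
                if x - d < 0:                 # candidate falls off the image
                    break
                patch_r = R[y:y + 2 * pad + 1, x - d:x - d + 2 * pad + 1]
                cost = np.abs(patch_l - patch_r).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Restricting the search window around the prior is what suppresses the spurious matches stereo alone produces in repetitive and textureless regions: ambiguous candidates far from the sensor's estimate are never evaluated.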
