
Neural compositing for real-time augmented reality rendering in low-frequency lighting environments


Abstract

We present neural compositing, a deep-learning based method for augmented reality rendering, which uses convolutional neural networks to composite rendered layers of a virtual object with a real photograph to emulate shadow and reflection effects. The method starts by estimating the lighting and roughness information from the photograph using neural networks, renders the virtual object with a virtual floor into color, shadow, and reflection layers under the estimated lighting, and finally refines the reflection and shadow layers using neural networks and blends them with the color layer and the input image to yield the output image. We assume low-frequency lighting environments and adopt PRT (precomputed radiance transfer) for layer rendering, which makes the whole pipeline differentiable and enables fast end-to-end network training on synthetic scenes. Working on a single photograph, our method can produce realistic reflections in a real scene with spatially varying material and cast shadows on background objects with unknown geometry and material, all at real-time frame rates.
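Under the low-frequency lighting assumption, the environment illumination can be represented by a small number of spherical-harmonic (SH) coefficients, so PRT shading reduces to a dot product between precomputed per-vertex transfer vectors and the estimated SH lighting, and the final output is a blend of the rendered layers with the photograph. The sketch below illustrates these two steps in NumPy; the function names, the 9-coefficient (3rd-order) SH setup, and the specific blending formula (background attenuated by the shadow layer, augmented by the reflection layer, with the object's color layer composited by a coverage mask) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# --- PRT shading under low-frequency (spherical-harmonic) lighting ---
# Assumed setup: 3rd-order SH, i.e. 9 coefficients per color channel.
# transfer: (num_vertices, 9) precomputed radiance-transfer vectors
# sh_light: (9, 3) estimated SH lighting coefficients (RGB)
def prt_shade(transfer, sh_light):
    # Outgoing radiance is a dot product of transfer and lighting coefficients.
    return transfer @ sh_light              # (num_vertices, 3)

# --- Compositing rendered layers with the input photograph (assumed blend) ---
# photo, color, reflection: (H, W, 3) images in linear RGB
# shadow: (H, W, 1) attenuation of the background (1 = unshadowed)
# alpha:  (H, W, 1) coverage mask of the virtual object
def composite(photo, color, shadow, reflection, alpha):
    # Background is darkened by the refined shadow layer and augmented by the
    # refined reflection layer; the object's color layer is blended on top.
    background = photo * shadow + reflection
    return alpha * color + (1.0 - alpha) * background

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    radiance = prt_shade(rng.random((1000, 9)), rng.random((9, 3)))
    H, W = 4, 4
    out = composite(photo=rng.random((H, W, 3)),
                    color=rng.random((H, W, 3)),
                    shadow=rng.random((H, W, 1)),
                    reflection=0.1 * rng.random((H, W, 3)),
                    alpha=rng.random((H, W, 1)))
    print(radiance.shape, out.shape)
```

Because both steps are simple linear or blending operations, gradients flow through them, which is what makes end-to-end training with synthetic scenes practical.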
