International Conference on 3D Immersion

Stereoscopic Dataset from A Video Game: Detecting Converged Axes and Perspective Distortions in S3D Videos



Abstract

This paper presents a method for generating stereoscopic or multi-angle video frames using a computer game (Grand Theft Auto V). We developed a mod that captures synthetic frames and allows us to create geometric distortions like those that occur in real video. These distortions are the main cause of viewer discomfort when watching 3D movies. Datasets generated in this way can aid in solving problems related to machine-learning-based assessment of stereoscopic and multi-angle video quality. We trained a convolutional neural network to evaluate perspective distortions and converged camera axes in stereoscopic video, then tested it on real 3D movies. The neural network discovered multiple examples of these distortions.
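The abstract describes training a convolutional neural network on game-rendered stereo pairs to flag converged camera axes and perspective distortions. Below is a minimal sketch of that kind of frame-pair classifier in PyTorch; the architecture, input resolution, and two-flag label layout are illustrative assumptions, not the authors' actual network.

# Sketch of a CNN that takes a stereoscopic frame pair and predicts two
# distortion flags: [converged camera axes, perspective distortion].
# All architectural details here are assumptions for illustration only.
import torch
import torch.nn as nn

class StereoDistortionNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Left and right RGB views stacked along the channel axis -> 6 channels.
        self.features = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two independent binary outputs (raw logits).
        self.head = nn.Linear(128, 2)

    def forward(self, left, right):
        x = torch.cat([left, right], dim=1)   # (N, 6, H, W)
        x = self.features(x).flatten(1)       # (N, 128)
        return self.head(x)                   # (N, 2)

if __name__ == "__main__":
    model = StereoDistortionNet()
    # Stand-ins for synthetic frames such as those rendered by the game mod.
    left = torch.rand(2, 3, 270, 480)
    right = torch.rand(2, 3, 270, 480)
    logits = model(left, right)
    labels = torch.tensor([[1.0, 0.0], [0.0, 1.0]])   # per-pair distortion flags
    loss = nn.BCEWithLogitsLoss()(logits, labels)
    loss.backward()
    print(logits.shape, float(loss))

In a setup like this, each training example would pair a left/right frame capture with labels indicating which distortions were injected when the frames were rendered; testing on real 3D movies then amounts to running the trained classifier on frame pairs extracted from those films.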

