International Conference on Optoelectronic Imaging and Multimedia Technology

Interactive Gigapixel Video Streaming via Multiscale Acceleration


Abstract

Immersive video applications, which let users freely navigate a virtualized 3D environment for entertainment, productivity, training, and more, are growing rapidly. Fundamentally, such a system can be built on an interactive Gigapixel Video Streaming (iGVS) platform spanning the pipeline from array-camera capture to end-user interaction. This interactive system, however, demands a large amount of network bandwidth to sustain reliable service provisioning, hindering its mass-market adoption. We therefore propose to segment the gigapixel scene into non-overlapping spatial tiles, each covering only a sub-region of the entire scene. One or more tiles represent the instantaneous viewport requested by a specific user. Tiles are then encoded at a variety of quality scales using various combinations of spatial, temporal, and amplitude resolutions (STAR), and are typically encapsulated into temporally aligned tile video chunks (or simply chunks). Chunks at different quality levels can be processed in parallel for real-time operation. With this setup, heterogeneous users can simultaneously access diverse chunk combinations according to their requests, and viewport-adaptive content navigation in an immersive space can be realized by properly adapting the multiscale chunks under bandwidth constraints. A series of computational vision models, measuring the perceptual quality of the viewport video in terms of its quality scales, adaptation factors, and peripheral-vision thresholds, is devised to prepare and guide the chunk adaptation toward the best perceptual quality index. Furthermore, in response to the time-varying network, a deep reinforcement learning (DRL) based adaptive real-time streaming (ARS) scheme is developed that learns future decisions from historical network states to maximize the overall quality of experience (QoE) in a practical Internet-based streaming scenario. Our experiments show that the average QoE can be improved by about 60% and its standard deviation reduced by 30%, compared to the popular Google congestion control algorithm widely adopted in existing adaptive streaming systems, demonstrating the efficiency of our multiscale-accelerated iGVS for immersive video applications.
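To make the tiling and multiscale adaptation described in the abstract concrete, the sketch below shows one plausible way to select STAR-scale chunks for a viewport under a bandwidth budget. It is not the paper's implementation: the names (Chunk, select_viewport_chunks), the greedy quality-per-bit upgrade rule, and the bitrate/quality numbers in the demo are all hypothetical.

```python
# A minimal sketch (not the paper's implementation) of viewport-adaptive
# multiscale chunk selection under a bandwidth budget.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Chunk:
    tile_id: int         # spatial tile this chunk covers
    level: int           # STAR quality scale, 0 = lowest
    bitrate_kbps: float  # bandwidth cost of streaming this chunk
    quality: float       # perceptual quality index at this scale


def select_viewport_chunks(viewport_tiles: List[int],
                           ladder: Dict[int, List[Chunk]],
                           budget_kbps: float) -> List[Chunk]:
    """Greedily upgrade viewport tiles to higher STAR scales while the
    total bitrate of the selected chunks stays within the budget."""
    # Start every viewport tile at its lowest scale so the view is always covered.
    chosen = {t: ladder[t][0] for t in viewport_tiles}
    spent = sum(c.bitrate_kbps for c in chosen.values())

    while True:
        best_tile, best_gain = None, 0.0
        for t in viewport_tiles:
            cur = chosen[t]
            if cur.level + 1 >= len(ladder[t]):
                continue  # already at the highest scale
            nxt = ladder[t][cur.level + 1]
            extra = nxt.bitrate_kbps - cur.bitrate_kbps
            if extra <= 0 or spent + extra > budget_kbps:
                continue
            gain = (nxt.quality - cur.quality) / extra  # quality gained per kbps
            if gain > best_gain:
                best_tile, best_gain = t, gain
        if best_tile is None:
            break  # no affordable upgrade remains
        nxt = ladder[best_tile][chosen[best_tile].level + 1]
        spent += nxt.bitrate_kbps - chosen[best_tile].bitrate_kbps
        chosen[best_tile] = nxt
    return list(chosen.values())


if __name__ == "__main__":
    # Three hypothetical STAR scales per tile; the viewport covers tiles 1 and 2.
    ladder = {t: [Chunk(t, l, 500.0 * (l + 1), 1.0 - 0.5 ** (l + 1))
                  for l in range(3)]
              for t in range(4)}
    picked = select_viewport_chunks([1, 2], ladder, budget_kbps=2500.0)
    for c in picked:
        print(f"tile {c.tile_id}: level {c.level}, {c.bitrate_kbps} kbps")
```

In the paper's full system the adaptation would additionally be driven by the computational vision models and the DRL-based ARS controller reacting to measured network states; the greedy quality-per-bit rule above is only a stand-in to show where the bandwidth constraint and the multiscale chunk ladder enter the decision.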
