File(s) under permanent embargo
Reason: This item is currently closed access.
Content-adaptive enhancement of multi-view depth maps for free viewpoint video
Journal contribution posted on 2016-10-10, 09:12, authored by Erhan Ekmekcioglu, Vladan Velisavljevic, Stewart T. Worrall
Depth map estimation is an important part of multi-view video coding and virtual view synthesis in free viewpoint video applications. However, computing an accurate depth map is a computationally complex process, which makes real-time implementation challenging. Alternatively, a simple estimation, though quick and promising for real-time processing, might result in inconsistent multi-view depth map sequences. To exploit this simplicity and to improve the quality of depth map estimation, we propose a novel content-adaptive enhancement technique applied to previously estimated multi-view depth map sequences. The enhancement method is locally adapted to edges, motion and the depth range of the scene to avoid blurring the synthesized views and to reduce the computational complexity. At the same time, and very importantly, the method enforces consistency across the spatial, temporal and inter-view dimensions of the depth maps, so that both the coding efficiency and the quality of the synthesized views are improved. We demonstrate these improvements in experiments in which the enhancement method is applied to several multi-view test sequences, and the resulting synthesized views are compared to views synthesized using other methods in terms of both numerical and perceived visual quality.
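The abstract describes locally adapting the enhancement to edges and motion so that smoothing is applied only where it cannot blur the synthesized views. The sketch below is not the authors' algorithm (the paper is closed access here); it is a minimal illustration of that general idea, assuming the enhancement can be modelled as a median smoothing of the depth map that is suppressed on texture edges and in moving regions. All function and parameter names (`enhance_depth`, `edge_thresh`, `motion_thresh`) are hypothetical.

```python
import numpy as np

def enhance_depth(depth, texture, prev_depth, edge_thresh=30.0, motion_thresh=2.0):
    """Illustrative sketch: smooth a depth map only in flat, static regions.

    depth, prev_depth : 2-D float arrays (current / previous depth frames)
    texture           : 2-D float array (luma of the colour view), used to
                        locate object edges that must not be blurred
    """
    # Edge strength from a simple gradient of the texture image.
    gy, gx = np.gradient(texture)
    edges = np.hypot(gx, gy) > edge_thresh

    # Motion mask from the temporal depth difference.
    motion = np.abs(depth - prev_depth) > motion_thresh

    # 3x3 median filter built from shifted copies (no SciPy required).
    h, w = depth.shape
    padded = np.pad(depth, 1, mode="edge")
    stack = np.stack([padded[i:i + h, j:j + w]
                      for i in range(3) for j in range(3)])
    smoothed = np.median(stack, axis=0)

    # Keep original values on edges and in moving areas; smooth elsewhere.
    return np.where(edges | motion, depth, smoothed)
```

In this toy version, suppressing the filter on edge and motion pixels stands in for the paper's content adaptation; the actual method additionally enforces inter-view consistency, which a single-view sketch cannot show.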
- Loughborough University London
Published in: Selected Topics in Signal Processing, IEEE Journal of
Pages: 352-361
Citation: EKMEKCIOGLU, E., VELISAVLJEVIC, V. and WORRALL, S., 2011. Content-adaptive enhancement of multi-view depth maps for free viewpoint video. Selected Topics in Signal Processing, IEEE Journal of, 5 (2), pp.352-361.
- VoR (Version of Record)
Publisher statement: This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/