File(s) under permanent embargo
Reason: This item is currently closed access.
Efficient edge, motion and depth-range adaptive processing for enhancement of multi-view depth map sequences
conference contribution
posted on 2016-10-11, 14:57 authored by Erhan Ekmekcioglu, Vladan Velisavljevic, Stewart T. Worrall
We present a novel and efficient multi-view depth map enhancement
method proposed as a post-processing of initially estimated depth
maps. The proposed method is based on edge, motion and scene
depth-range adaptive median filtering and allows for an improved
quality of virtual view synthesis. To enforce the spatial, temporal
and inter-view coherence in the multi-view depth maps, the median
filtering is applied to 4-dimensional windows that consist of the spatially
neighboring depth map values taken at different viewpoints
and time instants. A fast iterative block segmentation approach is
adopted to adaptively shrink these windows in the presence of edges
and motion for preservation of sharpness and realistic rendering and
for improvement of the compression efficiency. We show that our
enhancement method leads to a reduction of the coding bit-rate required
for representation of the depth maps and also leads to a gain
in the quality of synthesized views at arbitrary virtual viewpoints.
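The core idea described above, median filtering over 4-dimensional windows spanning neighboring views, time instants, and spatial positions, can be sketched as follows. This is a simplified illustration of that kind of 4D windowed median, not the authors' implementation; the function name, the `(view, time, row, col)` array layout, and the fixed window radius are assumptions, and the paper's edge/motion/depth-range adaptive window shrinking is omitted.

```python
import numpy as np

def median_4d(depth, v, t, y, x, radius=1):
    """Median-filter one depth sample over a 4D window covering
    neighboring views, time instants, and spatial positions.

    depth : array of shape (views, frames, height, width)
    v, t, y, x : coordinates of the sample to filter
    radius : half-size of the window along each axis (an adaptive
             scheme would shrink this near edges and motion)
    """
    V, T, H, W = depth.shape
    # Clip the window to the array bounds along each of the 4 axes.
    vs = slice(max(v - radius, 0), min(v + radius + 1, V))
    ts = slice(max(t - radius, 0), min(t + radius + 1, T))
    ys = slice(max(y - radius, 0), min(y + radius + 1, H))
    xs = slice(max(x - radius, 0), min(x + radius + 1, W))
    window = depth[vs, ts, ys, xs]
    return float(np.median(window))
```

Because the median is an order statistic, an isolated outlier in the estimated depth (e.g. a single mismatched disparity) is rejected as long as it is a minority within the 4D window, which is what enforces the spatial, temporal, and inter-view coherence the abstract refers to.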
Funding
This work was in part developed within VISNET II, a European Network of Excellence (http://www.visnetnoe.org), funded under the European Commission IST FP6 programme.
History
School
- Loughborough University London
Published in
Image Processing (ICIP), 2009 16th IEEE International Conference on
Pages
3537 - 3540
Citation
EKMEKCIOGLU, E., VELISAVLJEVIC, V. and WORRALL, S., 2009. Efficient edge, motion and depth-range adaptive processing for enhancement of multi-view depth map sequences. IN: Proceedings of 2009 16th IEEE International Conference on Image Processing (ICIP 2009), Cairo, Egypt, 7-12 November 2009, pp. 3537-3540.
Publisher
© IEEE
Version
- VoR (Version of Record)
Publisher statement
This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/
Publication date
2009
Notes
Closed access.
ISBN
1424456533; 9781424456543
ISSN
1522-4880
Publisher version
Language
- en