Depth error compensation for camera fusion system

Authors
Cheon Lee; Sung-Yeol Kim; Byeongho Choi; Yong-Moo Kwon; Yo-Sung Ho
Keywords
3-D video; depth map; camera fusion; depth camera; stereo camera; camera fusion system; time-of-flight camera; depth map generation; depth error reduction
Issue Date
2013-07
Publisher
Optical Engineering
Citation
Vol. 52, No. 7, pp. 073103-1 to 073103-13
Abstract
When a three-dimensional (3-D) video system includes a multiview video generation technique that uses depth data to provide a more realistic 3-D viewing experience, accurate depth map acquisition is an important task. To generate precise depth maps in real time, we can build a camera fusion system with multiple color cameras and one time-of-flight (TOF) camera; however, this approach suffers from depth errors such as depth flickering, empty holes in the warped depth map, and mixed pixels around object boundaries. In this paper, we propose three methods to minimize these depth errors. To reduce depth flickering in the temporal domain, we apply a temporal enhancement method based on a modified joint bilateral filter at the TOF camera side. Next, we fill the empty holes in the warped depth map by selecting a virtual depth and applying a weighted depth filtering method. After hole filling, we remove mixed pixels and replace them with new depth values using an adaptive joint multilateral filter. Experimental results show that the proposed methods reduce depth errors significantly in near real time.
URI
http://pubs.kist.re.kr/handle/201004/45202
ISSN
00913286
Appears in Collections:
KIST Publication > Article