Point-based View Synthesis from RGB-D Images

Authors
Kim, Jinwook; Park, Anjin
Issue Date
2014-10
Publisher
IEEE
Citation
5th International Conference on Information and Communication Technology Convergence (ICTC), pp. 974-975
Abstract
This paper proposes a method for generating novel views from a set of RGB-D images captured by multiple static depth cameras. Novel views are typically synthesized by back-projecting the real 3D points onto a virtual image plane placed at the desired viewing position; in that case, empty pixels and holes become visible when the user zooms in on the 3D scene. To address this problem, the proposed method constructs 3D structures from the point clouds converted from the depth images and renders novel views from these structures using the RGB-D images. In the experiments, the proposed method reduced the excessive blurring produced by previous point-based rendering methods.
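
The snippet below is a minimal sketch (not the authors' implementation) of the baseline point-based synthesis step the abstract describes: back-projecting a depth image into a 3D point cloud with assumed intrinsics K, then splatting the coloured points onto a virtual camera with an assumed pose (R, t). All function names and parameters are illustrative assumptions; pixels that receive no point remain as the holes the abstract refers to.

```python
# Hypothetical illustration of point-based view synthesis from an RGB-D image.
import numpy as np

def backproject(depth, K):
    """Convert a depth image (H, W) in metres to an (N, 3) point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    x = (u.ravel() - K[0, 2]) * z / K[0, 0]
    y = (v.ravel() - K[1, 2]) * z / K[1, 1]
    return np.stack([x, y, z], axis=1)

def project_to_virtual_view(points, colors, K, R, t, out_shape):
    """Splat coloured 3D points onto the virtual image plane defined by (K, R, t)."""
    cam = points @ R.T + t                      # world -> virtual camera frame
    valid = cam[:, 2] > 1e-6                    # keep points in front of the camera
    cam, colors = cam[valid], colors[valid]
    u = np.round(K[0, 0] * cam[:, 0] / cam[:, 2] + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * cam[:, 1] / cam[:, 2] + K[1, 2]).astype(int)
    h, w = out_shape
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    image = np.zeros((h, w, 3), dtype=colors.dtype)
    # Draw far points first so nearer ones overwrite them (simple depth ordering).
    order = np.argsort(-cam[inside, 2])
    image[v[inside][order], u[inside][order]] = colors[inside][order]
    return image                                # unfilled pixels remain as holes
```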
ISSN
2162-1233
URI
https://pubs.kist.re.kr/handle/201004/115324
Appears in Collections:
KIST Conference Paper > 2014
