Planar Abstraction and Inverse Rendering of 3D Indoor Environments

Authors
Kim, Young Min; Ryu, Sangwoo; Kim, Ig-Jae
Issue Date
2021-06-01
Publisher
IEEE COMPUTER SOC
Citation
IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, v.27, no.6, pp. 2992-3006
Abstract
Scanning and acquiring a 3D indoor environment suffers from complex occlusions and misalignment errors. The reconstruction obtained from an RGB-D scanner contains holes in geometry and ghosting in texture. These are easily noticeable and cannot be considered as visually compelling VR content without further processing. On the other hand, the well-known Manhattan World priors successfully recreate relatively simple structures. In this article, we would like to push the limit of planar representation in indoor environments. Given an initial 3D reconstruction captured by an RGB-D sensor, we use planes not only to represent the environment geometrically but also to solve an inverse rendering problem considering texture and light. The complex process of shape inference and intrinsic imaging is greatly simplified with the help of detected planes and yet produces a realistic 3D indoor environment. The generated content can adequately represent the spatial arrangements for various AR/VR applications and can be readily composited with virtual objects possessing plausible lighting and texture.
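The pipeline summarized above begins by abstracting the reconstructed geometry into a small set of planes before texture and lighting are recovered. As a rough illustration only, and not the authors' actual method, the minimal Python sketch below shows how dominant planes could be peeled off an indoor point cloud with Open3D's RANSAC plane segmentation; the input path "scan.ply" and all thresholds are placeholder assumptions.

    # Illustrative sketch only: shows a greedy planar decomposition of a
    # reconstructed indoor point cloud using Open3D's RANSAC plane fitting.
    # This is not the paper's algorithm; "scan.ply" and the thresholds are
    # placeholder assumptions.
    import open3d as o3d

    def extract_planes(pcd, max_planes=8, distance_threshold=0.02,
                       min_inliers=5000):
        """Greedily peel off dominant planes from a point cloud."""
        planes = []          # list of (plane_model, inlier cloud) pairs
        remaining = pcd
        for _ in range(max_planes):
            if len(remaining.points) < min_inliers:
                break
            # Fit a single plane ax + by + cz + d = 0 with RANSAC.
            model, inliers = remaining.segment_plane(
                distance_threshold=distance_threshold,
                ransac_n=3,
                num_iterations=1000)
            if len(inliers) < min_inliers:
                break
            planes.append((model, remaining.select_by_index(inliers)))
            # Remove the explained points and continue on the remainder.
            remaining = remaining.select_by_index(inliers, invert=True)
        return planes, remaining

    if __name__ == "__main__":
        cloud = o3d.io.read_point_cloud("scan.ply")  # placeholder input
        planes, leftovers = extract_planes(cloud)
        for i, (model, patch) in enumerate(planes):
            a, b, c, d = model
            print(f"plane {i}: {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0, "
                  f"{len(patch.points)} points")

In the paper, the detected planes serve both as the geometric representation and as the support for solving the inverse rendering (texture and lighting) problem; the sketch only covers the plane-extraction idea.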
Keywords
Geometry; Rendering (computer graphics); Indoor environment; Three-dimensional displays; Lighting; Image reconstruction; Semantics; 3D content creation; indoor modeling; texture generation; inverse rendering
ISSN
1077-2626
URI
https://pubs.kist.re.kr/handle/201004/116877
DOI
10.1109/TVCG.2019.2960776
Appears in Collections:
KIST Article > 2021
Files in This Item:
There are no files associated with this item.