Full metadata record

DC Field                                   Value
dc.contributor.author                      Kim, Hanseob
dc.contributor.author                      Ali, Ghazanfar
dc.contributor.author                      Pastor, Andreas
dc.contributor.author                      Lee, Myungho
dc.contributor.author                      Kim, Gerard J.
dc.contributor.author                      Hwang, Jae-In
dc.date.accessioned                        2024-01-19T15:04:56Z
dc.date.available                          2024-01-19T15:04:56Z
dc.date.created                            2022-01-10
dc.date.issued                             2021-03
dc.identifier.issn                         2076-3417
dc.identifier.uri                          https://pubs.kist.re.kr/handle/201004/117301
dc.description.abstract                    Realistic interactions with real objects (e.g., animals, toys, robots) in an augmented reality (AR) environment enhance the user experience. Common AR apps on the market achieve realistic interactions by superimposing pre-modeled virtual proxies on the real objects in the AR environment, so that the user perceives interaction with the virtual proxies as interaction with the real objects. However, catering to environment changes, shape deformation, and view updates is not a trivial task. Our proposed method uses the dynamic silhouette of a real object to enable realistic interactions. The approach is practical, lightweight, and requires no additional hardware beyond the device camera. As a case study, we designed a mobile AR application for interacting with real animal dolls, in which a virtual human performs four types of realistic interactions. Results demonstrate that our method remains stable under shape deformation and view updates without requiring pre-modeled virtual proxies. We also conducted a pilot study using our approach and observed significant improvements in users' perceived spatial awareness and presence during realistic interactions with a virtual human.
dc.language                                English
dc.publisher                               MDPI
dc.title                                   Silhouettes from Real Objects Enable Realistic Interactions with a Virtual Human in Mobile Augmented Reality
dc.type                                    Article
dc.identifier.doi                          10.3390/app11062763
dc.description.journalClass                1
dc.identifier.bibliographicCitation        APPLIED SCIENCES-BASEL, v.11, no.6
dc.citation.title                          APPLIED SCIENCES-BASEL
dc.citation.volume                         11
dc.citation.number                         6
dc.description.journalRegisteredClass      scie
dc.description.journalRegisteredClass      scopus
dc.identifier.wosid                        000645694900001
dc.identifier.scopusid                     2-s2.0-85103493004
dc.relation.journalWebOfScienceCategory    Chemistry, Multidisciplinary
dc.relation.journalWebOfScienceCategory    Engineering, Multidisciplinary
dc.relation.journalWebOfScienceCategory    Materials Science, Multidisciplinary
dc.relation.journalWebOfScienceCategory    Physics, Applied
dc.relation.journalResearchArea            Chemistry
dc.relation.journalResearchArea            Engineering
dc.relation.journalResearchArea            Materials Science
dc.relation.journalResearchArea            Physics
dc.type.docType                            Article
dc.subject.keywordAuthor                   augmented reality
dc.subject.keywordAuthor                   mobile AR
dc.subject.keywordAuthor                   deep learning
dc.subject.keywordAuthor                   segmentation
dc.subject.keywordAuthor                   realistic interaction
dc.subject.keywordAuthor                   virtual human
dc.subject.keywordAuthor                   perceptual issue
dc.subject.keywordAuthor                   occlusion
dc.subject.keywordAuthor                   multi-modal system
dc.subject.keywordAuthor                   user experience
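The abstract's core idea — using a real object's silhouette mask for occlusion-aware compositing with virtual content — can be illustrated with a minimal toy sketch. This is not the authors' implementation: the paper's keywords indicate deep-learning segmentation, for which a simple intensity threshold stands in here, and all names (`silhouette_mask`, `composite`) are hypothetical.

```python
import numpy as np

def silhouette_mask(gray_frame, threshold=128):
    """Binary silhouette of the real object. Here a plain intensity
    threshold stands in for a learned segmentation model; we assume the
    object is darker than the background."""
    return gray_frame < threshold

def composite(frame, virtual_layer, mask):
    """Occlusion-aware composite: wherever the silhouette mask is set,
    the real object's pixels stay in front of the rendered virtual layer."""
    out = virtual_layer.copy()
    out[mask] = frame[mask]
    return out

# Toy 4x4 grayscale frame: a dark 2x2 "object" on a bright background.
frame = np.full((4, 4), 200, dtype=np.uint8)
frame[1:3, 1:3] = 30
virtual = np.full((4, 4), 90, dtype=np.uint8)  # uniform virtual layer

mask = silhouette_mask(frame)
result = composite(frame, virtual, mask)
print(int(mask.sum()))              # 4 silhouette pixels
print(result[1, 1], result[0, 0])   # 30 90 (object occludes; background shows virtual)
```

Because the mask is recomputed per frame, the composite follows shape deformation and view updates automatically — the property the abstract claims over pre-modeled virtual proxies.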
Appears in Collections:
KIST Article > 2021
Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
