Human gaze-aware attentive object detection for ambient intelligence

Authors
Cho, Dae-Yong; Kang, Min-Koo
Issue Date
2021-11
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
Citation
ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, v.106
Abstract
Understanding human behavior and the surrounding environment is essential for realizing ambient intelligence (AmI), for which eye gaze and object information are reliable cues. In this study, the authors propose a novel human gaze-aware attentive object detection framework as an elemental technology for AmI. The proposed framework detects users' attentive objects and shows more precise and robust performance against object-scale variations. A novel Adaptive-3D-Region-of-Interest (Ada-3D-RoI) scheme is designed as a front-end module, and scalable detection network structures are proposed to maximize cost-efficiency. The experiments show that the detection rate is improved by up to 97.6% for small objects (14.1% on average), and it is selectively tunable with a tradeoff between accuracy and computational complexity. In addition, the qualitative results demonstrate that the proposed framework detects only the user's single object of interest, even when the target object is occluded or extremely small. Suggestions for follow-up studies are presented to extend the proposed framework to further practical AmI applications. This study will help develop advanced AmI applications that demand a higher-level understanding of scene context and human behavior, such as human-robot symbiosis, remote/autonomous control, and augmented/mixed reality.
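As a rough illustration of the idea summarized above, the sketch below shows how a gaze-derived 3D fixation point could be used to crop an adaptive 3D region of interest from a scene point cloud before running an object detector. This is a minimal, hypothetical sketch only; the function and parameter names (e.g., crop_adaptive_roi, margin_scale) are assumptions for illustration and do not reflect the paper's actual Ada-3D-RoI implementation.

```python
import numpy as np

def crop_adaptive_roi(points, gaze_point, base_size=0.3, margin_scale=1.5):
    """Hypothetical sketch: keep the 3D points that fall inside an
    axis-aligned box centered on the user's estimated gaze fixation point.

    points       : (N, 3) array of scene points (e.g., from an RGB-D frame)
    gaze_point   : (3,) estimated 3D fixation point of the user's gaze
    base_size    : nominal half-extent of the RoI box in meters (assumed value)
    margin_scale : enlarges the box to tolerate gaze-estimation error
    """
    half_extent = base_size * margin_scale
    lower = gaze_point - half_extent
    upper = gaze_point + half_extent
    mask = np.all((points >= lower) & (points <= upper), axis=1)
    return points[mask], mask

# Usage with synthetic data: 10,000 random scene points and a fixation
# point at (1.0, 0.5, 2.0) meters in the camera frame.
scene = np.random.uniform(-2.0, 4.0, size=(10000, 3))
fixation = np.array([1.0, 0.5, 2.0])
roi_points, roi_mask = crop_adaptive_roi(scene, fixation)
print(f"{roi_mask.sum()} of {len(scene)} points fall inside the gaze RoI")
```

In this sketch, only the points inside the gaze-centered box would be passed to a downstream detector, which reflects the general motivation of restricting detection to the user's attended region; the actual scheme in the paper adapts the region and network structure in ways not captured here.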
Keywords
Affective ambient intelligence; Human-computer interaction; Object recognition; Augmented and mixed reality
ISSN
0952-1976
URI
https://pubs.kist.re.kr/handle/201004/116228
DOI
10.1016/j.engappai.2021.104471
Appears in Collections:
KIST Article > 2021