Speaker selection and tracking in a cluttered environment with audio and visual information

Title
Speaker selection and tracking in a cluttered environment with audio and visual information
Authors
Yoonseob Lim; Jongsuk Choi
Keywords
speaker localization; audio/video fusion; particle filter; Human-Robot Interaction; Data Association
Issue Date
2009-08
Publisher
IEEE Transactions on Consumer Electronics
Citation
Vol. 55, No. 3, pp. 1581-1589
Abstract
This paper presents a data association method that uses audio and visual data to localize targets in a cluttered environment and to detect who is speaking to a robot. A particle filter is applied to efficiently select the optimal association between targets and measurements. The state variables comprise target positions and speaking states. To update the speaking state, we first evaluate the incoming sound signal using cross-correlation and then calculate a likelihood from the audio information. The visual measurement is used to find an optimal association between the targets and the observed objects. The number of targets that the robot should interact with is updated from the existence probabilities and the associations. Experimental data were collected beforehand and replayed in simulation to verify the performance of the proposed method on the speaker selection problem in a cluttered environment. The algorithm was also implemented on a robotic system to demonstrate reliable interaction between the robot and speaking targets.
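The audio step described in the abstract (evaluating the incoming sound signal by cross-correlation, then computing a likelihood for a hypothesized target) might be sketched roughly as follows. This is an illustrative sketch only: the two-microphone setup, the Gaussian likelihood form, and all parameter values (`fs`, `sigma`) are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def tdoa_by_cross_correlation(sig_left, sig_right, fs):
    """Estimate the time difference of arrival (TDOA) of sig_right
    relative to sig_left from the peak of their cross-correlation.
    A positive value means sig_right lags sig_left."""
    corr = np.correlate(sig_right, sig_left, mode="full")
    lag = np.argmax(corr) - (len(sig_left) - 1)  # lag in samples
    return lag / fs                              # lag in seconds

def speaking_likelihood(sig_left, sig_right, expected_tdoa, fs, sigma=1e-4):
    """Gaussian likelihood (an assumed form) that the observed TDOA
    matches the TDOA expected for a hypothesized target position."""
    tdoa = tdoa_by_cross_correlation(sig_left, sig_right, fs)
    return np.exp(-0.5 * ((tdoa - expected_tdoa) / sigma) ** 2)

# Synthetic check: the right channel is the left delayed by 5 samples.
fs = 16000
rng = np.random.default_rng(0)
left = rng.standard_normal(1024)
right = np.roll(left, 5)

print(tdoa_by_cross_correlation(left, right, fs))  # 5 samples / 16 kHz
print(speaking_likelihood(left, right, 5 / fs, fs))  # high for a matching target
print(speaking_likelihood(left, right, 0.0, fs))     # low for a mismatched target
```

In a particle-filter setting such as the one the abstract describes, a likelihood of this kind would weight each particle's hypothesized target position and speaking state before resampling.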
URI
http://pubs.kist.re.kr/handle/201004/35832
ISSN
0098-3063
Appears in Collections:
KIST Publication > Article
Files in This Item:
There are no files associated with this item.