Audio-visual Data Fusion for Tracking the Direction of Multiple Speakers

Title
Audio-visual Data Fusion for Tracking the Direction of Multiple Speakers
Authors
Nguyen Van Quang; Choi Jong-Suk
Keywords
audio-visual data fusion; sound source localization; particle filter; speaker tracking
Issue Date
2010-10
Publisher
International Conference on Control, Automation and Systems
Citation
International Conference on Control, Automation and Systems, 1626-1630
Abstract
This paper presents a multi-speaker tracking algorithm based on audio-visual data fusion. The audio information is the direction of speakers, and the visual information is the direction of detected faces. These observations are the inputs to the tracking algorithm, which employs a particle filter framework. For multi-target tracking, we present a flexible data association and fusion scheme that can cope with the appearance or absence of either modality during tracking. Experimental results on data collected from a robot platform in a conventional office room confirm a potential application to human-robot interaction.
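The abstract's core idea, fusing audio and visual bearing observations inside a particle filter, can be illustrated with a minimal single-speaker sketch. This is not the authors' implementation: the random-walk motion model, the Gaussian likelihoods, and the parameter values (`obs_sigma`, `proc_sigma`) are illustrative assumptions. Note how an empty observation list (a missing modality) simply skips the update, mirroring the paper's claim of handling absent information.

```python
import math
import random

def wrap(a):
    """Wrap an angle (radians) to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def particle_filter_step(particles, weights, obs, obs_sigma=0.2, proc_sigma=0.05):
    """One predict-update-resample cycle for a speaker's direction.

    particles : list of candidate direction angles (radians)
    weights   : matching list of normalized particle weights
    obs       : available bearing observations this frame (audio and/or
                visual direction); may be empty if neither modality fired.
    """
    # Predict: random-walk motion model on the direction angle.
    particles = [wrap(p + random.gauss(0.0, proc_sigma)) for p in particles]

    # Update: weight each particle by the likelihood of every available
    # observation (product fusion; a missing modality just drops out).
    if obs:
        new_w = []
        for p, w in zip(particles, weights):
            lik = 1.0
            for z in obs:
                d = wrap(z - p)  # angular innovation
                lik *= math.exp(-d * d / (2 * obs_sigma ** 2))
            new_w.append(w * lik)
        total = sum(new_w) or 1.0
        weights = [w / total for w in new_w]

    # Resample (multinomial) to avoid weight degeneracy.
    particles = random.choices(particles, weights=weights, k=len(particles))
    weights = [1.0 / len(particles)] * len(particles)
    return particles, weights

def estimate(particles):
    """Circular mean of the particle set as the direction estimate."""
    return math.atan2(sum(math.sin(p) for p in particles),
                      sum(math.cos(p) for p in particles))
```

After a few frames with audio and visual bearings near the same direction, the circular mean of the particle set converges toward the speaker's direction; frames where `obs` is empty leave the belief to diffuse under the motion model alone.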
URI
http://pubs.kist.re.kr/handle/201004/38579
Appears in Collections:
KIST Publication > Conference Paper
Files in This Item:
There are no files associated with this item.