Human-robot interaction in real environments by audio-visual integration
- Authors
- Kim, Hyun-Don; Choi, Jong-Suk; Kim, Munsang
- Issue Date
- 2007-02
- Publisher
- Institute of Control, Robotics and Systems; Korean Institute of Electrical Engineers
- Citation
- International Journal of Control, Automation, and Systems, v.5, no.1, pp.61-69
- Abstract
- In this paper, we developed not only a reliable sound localization system including a VAD (Voice Activity Detection) component using three microphones, but also a face tracking system using a vision camera. Moreover, we proposed a way to integrate these three systems for human-robot interaction, so as to compensate for errors in the localization of a speaker and to effectively reject unnecessary speech or noise signals arriving from undesired directions. To verify our system's performance, we installed the proposed audio-visual system in a prototype robot, called IROBAA (Intelligent ROBot for Active Audition), and demonstrated how to integrate the audio-visual system.
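- The integration idea described in the abstract — treat a detected sound as a speaker's utterance only when the VAD reports voice activity and the audio bearing agrees with the direction of a tracked face — can be sketched roughly as below. This is a minimal illustrative sketch, not the paper's implementation; the function names, the 15° tolerance, and the decision rule are assumptions made for illustration.

```python
import math

# Hypothetical audio-visual fusion rule (illustrative only): accept a speech segment
# when voice activity is detected AND the estimated sound direction matches the
# bearing of a currently tracked face; otherwise reject it as noise from an
# undesired direction. Names and thresholds are assumptions, not the paper's values.

ANGLE_TOLERANCE_DEG = 15.0  # assumed agreement tolerance between audio and visual bearings


def angular_difference(a_deg: float, b_deg: float) -> float:
    """Smallest absolute difference between two bearings, in degrees."""
    diff = abs(a_deg - b_deg) % 360.0
    return min(diff, 360.0 - diff)


def accept_speech(voice_active: bool,
                  sound_direction_deg: float,
                  face_directions_deg: list[float]) -> bool:
    """Return True if the sound event should be treated as speech from a tracked speaker."""
    if not voice_active:
        return False  # VAD rejected the segment (silence or non-speech noise)
    # Use the face tracker to compensate sound-localization error:
    # require at least one tracked face within the angular tolerance.
    return any(angular_difference(sound_direction_deg, f) <= ANGLE_TOLERANCE_DEG
               for f in face_directions_deg)


if __name__ == "__main__":
    # Speech from 30 degrees with a tracked face at 25 degrees is accepted;
    # the same signal with no face in that direction is rejected as noise.
    print(accept_speech(True, 30.0, [25.0, 170.0]))   # True
    print(accept_speech(True, 30.0, [170.0]))         # False
```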
- Keywords
- audio-visual integration; face tracking; human-robot interaction; sound source localization; voice activity detection
- ISSN
- 1598-6446
- URI
- https://pubs.kist.re.kr/handle/201004/134688
- Appears in Collections:
- KIST Article > 2007
- Files in This Item:
There are no files associated with this item.