Full metadata record

DC Field                                  Value                                            Language
dc.contributor.author                     Quang Nguyen                                     -
dc.contributor.author                     Choi, JongSuk                                    -
dc.date.accessioned                       2024-01-20T03:33:55Z                             -
dc.date.available                         2024-01-20T03:33:55Z                             -
dc.date.created                           2022-01-10                                       -
dc.date.issued                            2016-08                                          -
dc.identifier.issn                        0921-0296                                        -
dc.identifier.uri                         https://pubs.kist.re.kr/handle/201004/123810    -
dc.description.abstract                   Robotic auditory attention mainly relies on sound source localization using a microphone array. Typically, the robot detects a sound source whenever it emits, estimates its direction, and then turns toward that direction to pay attention. In scenarios where multiple sound sources emit simultaneously, however, the robot may have difficulty selecting a single target source. This paper proposes a novel robot auditory attention system based on source distance perception (e.g., selection of the closest among the localized sources). The microphone array consists of a head-array and a base-array installed in the robot's head and base, respectively. The difficulty of attending to one of multiple sound sources is resolved by estimating a binary mask for each source based on the azimuth localization of the head-array. For each source represented by a binary mask, the elevations at the head- and base-arrays are estimated and triangulated to obtain the distance to the robot (a geometric sketch of this triangulation is given below the table). Finally, the closest source is determined and its direction is used to control the robot. Experimental results on real indoor recordings of two and three simultaneous sound sources, as well as a real-time demonstration at a robot exhibition, clearly show the benefit of the proposed system.  -
dc.language                               English                                          -
dc.publisher                              SPRINGER                                         -
dc.title                                  Selection of the Closest Sound Source for Robot Auditory Attention in Multi-source Scenarios  -
dc.type                                   Article                                          -
dc.identifier.doi                         10.1007/s10846-015-0313-0                        -
dc.description.journalClass               1                                                -
dc.identifier.bibliographicCitation       JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, v.83, no.2, pp.239 - 251  -
dc.citation.title                         JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS         -
dc.citation.volume                        83                                               -
dc.citation.number                        2                                                -
dc.citation.startPage                     239                                              -
dc.citation.endPage                       251                                              -
dc.description.isOpenAccess               N                                                -
dc.description.journalRegisteredClass     scie                                             -
dc.description.journalRegisteredClass     scopus                                           -
dc.identifier.wosid                       000379229900006                                  -
dc.identifier.scopusid                    2-s2.0-84948149305                               -
dc.relation.journalWebOfScienceCategory   Computer Science, Artificial Intelligence        -
dc.relation.journalWebOfScienceCategory   Robotics                                         -
dc.relation.journalResearchArea           Computer Science                                 -
dc.relation.journalResearchArea           Robotics                                         -
dc.type.docType                           Article                                          -
dc.subject.keywordPlus                    SOURCE LOCALIZATION                              -
dc.subject.keywordPlus                    MOBILE ROBOT                                     -
dc.subject.keywordPlus                    SPEECH                                           -
dc.subject.keywordPlus                    SEPARATION                                       -
dc.subject.keywordPlus                    TDOA                                             -
dc.subject.keywordPlus                    TRACKING                                         -
dc.subject.keywordAuthor                  Sound source localization                       -
dc.subject.keywordAuthor                  Distance estimation                             -
dc.subject.keywordAuthor                  Robotic auditory attention                      -
dc.subject.keywordAuthor                  Human-robot interaction                         -
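
The abstract describes a two-array triangulation: each array measures the elevation of a masked source, and the known vertical separation between the head- and base-arrays converts the pair of angles into a distance. The following is a minimal geometric sketch of that step, not the paper's implementation; the array heights, the elevation sign convention (positive above each array's horizontal plane), and the helper names are assumptions for illustration only.

    import math

    def triangulate_distance(theta_head_deg, theta_base_deg,
                             head_height=1.2, base_height=0.3):
        """Horizontal distance to a source from the elevations seen by two
        vertically separated arrays (heights in metres are assumed values).

        Geometry: the source height z satisfies
            z = head_height + d * tan(theta_head)
              = base_height + d * tan(theta_base),
        so  d = (head_height - base_height) / (tan(theta_base) - tan(theta_head)).
        """
        t_head = math.tan(math.radians(theta_head_deg))
        t_base = math.tan(math.radians(theta_base_deg))
        denom = t_base - t_head
        if abs(denom) < 1e-6:
            # Near-parallel rays: the triangulation is ill-conditioned.
            raise ValueError("elevation rays nearly parallel; distance unobservable")
        return (head_height - base_height) / denom

    # Hypothetical usage: two simultaneous sources, one (head, base) elevation
    # pair per binary mask, keyed by azimuth in degrees.
    sources = {
        30.0:  (-5.0, 10.0),   # azimuth 30 deg: head sees -5 deg, base sees +10 deg
        -45.0: (-15.0, 35.0),  # azimuth -45 deg
    }
    distances = {az: triangulate_distance(th, tb) for az, (th, tb) in sources.items()}
    target_azimuth = min(distances, key=distances.get)
    print(f"attend to azimuth {target_azimuth:.1f} deg "
          f"at {distances[target_azimuth]:.2f} m")

With one such estimate per masked source, taking the minimum distance reproduces the closest-source selection rule; in the actual system the elevations would come from the azimuth-masked localization of the head- and base-arrays described in the abstract.
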
Appears in Collections:
KIST Article > 2016
Files in This Item:
There are no files associated with this item.