Full metadata record

dc.contributor.author: Lee, Sangmoon
dc.contributor.author: Park, Youngjin
dc.contributor.author: Choi, Jong-Suk
dc.date.accessioned: 2024-01-20T10:30:17Z
dc.date.available: 2024-01-20T10:30:17Z
dc.date.created: 2021-09-05
dc.date.issued: 2014-03
dc.identifier.issn: 0003-682X
dc.identifier.uri: https://pubs.kist.re.kr/handle/201004/127049
dc.description.abstract: Estimating the direction of a sound source is an important technique used in various engineering fields, including intelligent robots and surveillance systems. In a household where a user's voice and noises emitted from electric appliances originate from arbitrary directions in 3-D space, robots need to recognize the directions of multiple sound sources in order to effectively interact with the user. This paper proposes an ear-based estimation (localization) system using two artificial robot ears, each consisting of a spiral-shaped pinna and two microphones, for application in humanoid robots. Four microphones are asymmetrically placed on the left and right sides of the head. The proposed localization algorithm is based on a spatially mapped generalized cross-correlation function which is transformed from the time domain to the space domain by using a measured inter-channel time difference map. For validation of the proposed localization method, two experiments (single- and multiple-source cases) were conducted using male speech. In the case of a single source, with the exception of laterally biased sources, the localization was achieved with an error of less than 10 degrees. In a multiple-source environment, one source was fixed at the front side and the other source changed its direction; from the experimental results, the error rates on the localization of the fixed and varying sources are 0% and 36.9% respectively within an error bound of 15 degrees. (C) 2013 Elsevier Ltd. All rights reserved.
dc.language: English
dc.publisher: ELSEVIER SCI LTD
dc.subject: SOURCE LOCALIZATION
dc.subject: TRACKING
dc.title: Estimation of multiple sound source directions using artificial robot ears
dc.type: Article
dc.identifier.doi: 10.1016/j.apacoust.2013.10.001
dc.description.journalClass: 1
dc.identifier.bibliographicCitation: APPLIED ACOUSTICS, v.77, pp.49 - 58
dc.citation.title: APPLIED ACOUSTICS
dc.citation.volume: 77
dc.citation.startPage: 49
dc.citation.endPage: 58
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.identifier.wosid: 000329540200007
dc.identifier.scopusid: 2-s2.0-84887302620
dc.relation.journalWebOfScienceCategory: Acoustics
dc.relation.journalResearchArea: Acoustics
dc.type.docType: Article
dc.subject.keywordPlus: SOURCE LOCALIZATION
dc.subject.keywordPlus: TRACKING
dc.subject.keywordAuthor: Artificial robot ear
dc.subject.keywordAuthor: Estimation of multiple-source directions
dc.subject.keywordAuthor: Humanoid robots
dc.subject.keywordAuthor: Spatially mapped generalized cross-correlation function
dc.subject.keywordAuthor: Human-robot interaction
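
The abstract above describes the core technique: a generalized cross-correlation (GCC) function that is mapped from the time domain onto candidate source directions using a measured inter-channel time difference (ITD) map. The snippet below is a minimal, illustrative sketch of that general idea in Python, not the authors' implementation: the helper names (`gcc_phat`, `spatially_mapped_gcc`) and the `itd_map` argument are assumptions made here for illustration, whereas in the paper the map is measured with the asymmetric, pinna-equipped four-microphone head rather than supplied as a free-standing lookup table.

```python
# Illustrative sketch (not the paper's implementation) of a spatially mapped GCC:
# GCC-PHAT curves for each microphone pair are evaluated at the inter-channel
# time differences each candidate direction would produce, and the per-direction
# scores are summed across pairs.

import numpy as np


def gcc_phat(x, y, fft_size=None):
    """Return the GCC-PHAT cross-correlation of two equal-length signals."""
    n = fft_size or 2 * len(x)
    X = np.fft.rfft(x, n=n)
    Y = np.fft.rfft(y, n=n)
    cross = X * np.conj(Y)
    cross /= np.abs(cross) + 1e-12                 # PHAT weighting
    cc = np.fft.irfft(cross, n=n)
    # Reorder so index n//2 corresponds to lag 0, covering lags -n/2 .. n/2.
    return np.concatenate((cc[-(n // 2):], cc[:n // 2 + 1]))


def spatially_mapped_gcc(frames, itd_map, fs):
    """Score candidate directions by summing GCC-PHAT values at the expected lags.

    frames  : (num_mics, num_samples) array of time-aligned microphone signals
    itd_map : dict {(i, j): array of TDOAs in seconds, one entry per candidate
              direction, defined as arrival time at mic i minus arrival at mic j}
              -- a hypothetical stand-in for the paper's measured ITD map
    fs      : sampling rate in Hz
    """
    num_dirs = len(next(iter(itd_map.values())))
    scores = np.zeros(num_dirs)
    for (i, j), tdoas in itd_map.items():
        cc = gcc_phat(frames[i], frames[j])
        center = len(cc) // 2                      # index of lag 0
        lags = np.clip(np.round(np.asarray(tdoas) * fs).astype(int) + center,
                       0, len(cc) - 1)
        scores += cc[lags]                         # map time-domain GCC into space
    return scores                                  # argmax gives the estimated direction
```

Taking the argmax of the returned scores corresponds to the single-source case; for multiple sources the paper selects more than one peak from the direction scores, which the sketch does not attempt. Using a measured ITD map, as the paper does, presumably lets the mapping reflect scattering by the head and spiral-shaped pinnae, which a simple free-field delay model would not capture.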
Appears in Collections:
KIST Article > 2014
Files in This Item:
There are no files associated with this item.