Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Nguyen, Quang | - |
dc.contributor.author | Choi, JongSuk | - |
dc.date.accessioned | 2024-01-20T03:33:55Z | - |
dc.date.available | 2024-01-20T03:33:55Z | - |
dc.date.created | 2022-01-10 | - |
dc.date.issued | 2016-08 | - |
dc.identifier.issn | 0921-0296 | - |
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/123810 | - |
dc.description.abstract | Robotic auditory attention relies mainly on sound source localization using a microphone array. Typically, the robot detects a sound source whenever it emits sound, estimates its direction, and then turns toward that direction to pay attention. However, when multiple sound sources emit simultaneously, the robot may have difficulty selecting a single target source. This paper proposes a novel robot auditory attention system based on source distance perception (e.g., selection of the closest among the localized sources). The microphone array consists of a head-array and a base-array installed in the robot's head and base, respectively. The difficulty of attending among multiple sound sources is resolved by estimating a binary mask for each source based on the azimuth localization of the head-array. For each individual source represented by a binary mask, the elevations at the head- and base-arrays are estimated and triangulated to obtain the source's distance to the robot. Finally, the closest source is determined and its direction is used to control the robot. Experimental results clearly show the benefit of the proposed system on real indoor recordings of two and three simultaneous sound sources, as well as in a real-time demonstration at a robot exhibition. | - |
dc.language | English | - |
dc.publisher | SPRINGER | - |
dc.title | Selection of the Closest Sound Source for Robot Auditory Attention in Multi-source Scenarios | - |
dc.type | Article | - |
dc.identifier.doi | 10.1007/s10846-015-0313-0 | - |
dc.description.journalClass | 1 | - |
dc.identifier.bibliographicCitation | JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, v.83, no.2, pp.239 - 251 | - |
dc.citation.title | JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS | - |
dc.citation.volume | 83 | - |
dc.citation.number | 2 | - |
dc.citation.startPage | 239 | - |
dc.citation.endPage | 251 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.identifier.wosid | 000379229900006 | - |
dc.identifier.scopusid | 2-s2.0-84948149305 | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
dc.relation.journalWebOfScienceCategory | Robotics | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Robotics | - |
dc.type.docType | Article | - |
dc.subject.keywordPlus | SOURCE LOCALIZATION | - |
dc.subject.keywordPlus | MOBILE ROBOT | - |
dc.subject.keywordPlus | SPEECH | - |
dc.subject.keywordPlus | SEPARATION | - |
dc.subject.keywordPlus | TDOA | - |
dc.subject.keywordPlus | TRACKING | - |
dc.subject.keywordAuthor | Sound source localization | - |
dc.subject.keywordAuthor | Distance estimation | - |
dc.subject.keywordAuthor | Robotic auditory attention | - |
dc.subject.keywordAuthor | Human-robot interaction | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
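The abstract above describes estimating each source's distance by triangulating the elevation angles measured at the head- and base-arrays, then attending to the closest source. The snippet below is a minimal, illustrative sketch of that idea only; it assumes (purely for illustration, not taken from the paper) that the head-array is mounted a known vertical baseline directly above the base-array and that both report the source's elevation relative to the horizontal plane. All function names and parameters here are hypothetical.

```python
import math

def source_distance(elev_head_deg, elev_base_deg, baseline_m):
    """Triangulate the horizontal distance to a sound source from the
    elevation angles estimated by two vertically separated arrays.

    Assumed geometry (illustrative only): the head-array sits baseline_m
    metres directly above the base-array, and the source is at horizontal
    distance d and height h above the base-array, so that
        tan(elev_base) = h / d
        tan(elev_head) = (h - baseline_m) / d
    Subtracting the two relations gives
        d = baseline_m / (tan(elev_base) - tan(elev_head)).
    """
    tan_base = math.tan(math.radians(elev_base_deg))
    tan_head = math.tan(math.radians(elev_head_deg))
    if tan_base - tan_head <= 0:
        raise ValueError("elevation angles inconsistent with this geometry")
    return baseline_m / (tan_base - tan_head)

def select_closest(sources, baseline_m):
    """Pick the closest of several localized sources.

    sources: list of (azimuth_deg, elev_head_deg, elev_base_deg) tuples;
    the azimuth of the winner would be used to steer the robot.
    """
    return min(sources, key=lambda s: source_distance(s[1], s[2], baseline_m))

if __name__ == "__main__":
    # Two hypothetical simultaneous sources: (azimuth, elev_head, elev_base) in degrees.
    sources = [(30.0, 5.0, 20.0), (-45.0, 10.0, 15.0)]
    closest = select_closest(sources, baseline_m=1.0)
    print("attend to azimuth:", closest[0])
```

In this toy setup the first source triangulates to roughly 3.6 m and the second to roughly 11 m, so the robot would turn toward azimuth 30 degrees; the actual paper additionally separates overlapping sources with per-source binary masks before estimating the elevations.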