Full metadata record

dc.contributor.author: Chen, Yuli
dc.contributor.author: Ma, Yide
dc.contributor.author: Kim, Dong Hwan
dc.contributor.author: Park, Sung-Kee
dc.date.accessioned: 2024-01-20T06:32:02Z
dc.date.available: 2024-01-20T06:32:02Z
dc.date.created: 2021-09-05
dc.date.issued: 2015-08
dc.identifier.issn: 2162-237X
dc.identifier.uri: https://pubs.kist.re.kr/handle/201004/125161
dc.description.abstract: In this paper, we propose a region-based object recognition (RBOR) method to identify objects in complex real-world scenes. The method first performs color image segmentation with a simplified pulse-coupled neural network (SPCNN) on both the object model image and the test image, and then conducts region-based matching between them; hence we call it SPCNN-RBOR. The SPCNN parameter values are set automatically for each object model by our previously proposed method. To reduce the effects of varying light intensity and to exploit the SPCNN's high resolution at low intensities for optimized color segmentation, a transformation integrating normalized red-green-blue (RGB) with opponent color spaces is introduced. A novel image segmentation strategy groups the pixels that fire synchronously across all transformed channels of an image. Based on the segmentation results, a series of adaptive thresholds, adjustable according to the specific object model, is employed to remove outlier region blobs, form potential clusters, and refine the clusters in test images. SPCNN-RBOR overcomes a drawback of feature-based methods, which inevitably include background information in local invariant feature descriptors when keypoints lie near object boundaries. Extensive experiments show that SPCNN-RBOR is robust to diverse complex variations, even under partial occlusion and in highly cluttered environments, and that it identifies not only textured but also less-textured objects, significantly outperforming current feature-based methods.
dc.language: English
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject: COUPLED NEURAL-NETWORKS
dc.subject: PERFORMANCE EVALUATION
dc.subject: LINKING
dc.subject: MODELS
dc.title: Region-Based Object Recognition by Color Segmentation Using a Simplified PCNN
dc.type: Article
dc.identifier.doi: 10.1109/TNNLS.2014.2351418
dc.description.journalClass: 1
dc.identifier.bibliographicCitation: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, v.26, no.8, pp.1682 - 1697
dc.citation.title: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
dc.citation.volume: 26
dc.citation.number: 8
dc.citation.startPage: 1682
dc.citation.endPage: 1697
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.identifier.wosid: 000358224200009
dc.identifier.scopusid: 2-s2.0-85027956612
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Computer Science, Hardware & Architecture
dc.relation.journalWebOfScienceCategory: Computer Science, Theory & Methods
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.type.docType: Article
dc.subject.keywordPlus: COUPLED NEURAL-NETWORKS
dc.subject.keywordPlus: PERFORMANCE EVALUATION
dc.subject.keywordPlus: LINKING
dc.subject.keywordPlus: MODELS
dc.subject.keywordAuthor: Feature-based method
dc.subject.keywordAuthor: image segmentation
dc.subject.keywordAuthor: object recognition
dc.subject.keywordAuthor: region-based matching
dc.subject.keywordAuthor: simplified pulse-coupled neural network (SPCNN)
Appears in Collections:
KIST Article > 2015
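
The abstract describes transforming the input image with normalized RGB and opponent color channels to suppress light-intensity effects before SPCNN segmentation. A minimal sketch of such a transform is shown below, using the standard chromaticity and opponent-space definitions; the paper's exact channel set and scaling are not given in this record, so these formulas are an assumption for illustration only.

```python
import numpy as np

def normalized_rgb(img):
    # img: H x W x 3 float array in [0, 1]. Normalized RGB (chromaticity
    # coordinates r = R/(R+G+B), etc.) is largely invariant to uniform
    # changes in light intensity.
    s = img.sum(axis=2, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on pure-black pixels
    return img / s

def opponent_channels(img):
    # Standard opponent color space (one common definition; the paper's
    # exact variant may differ): O1 carries red-green contrast, O2
    # yellow-blue contrast, and O3 overall intensity.
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    o1 = (r - g) / np.sqrt(2.0)
    o2 = (r + g - 2.0 * b) / np.sqrt(6.0)
    o3 = (r + g + b) / np.sqrt(3.0)
    return np.stack([o1, o2, o3], axis=-1)

# Example: transform a single pure-red pixel.
img = np.zeros((1, 1, 3))
img[0, 0] = [1.0, 0.0, 0.0]
print(normalized_rgb(img)[0, 0])
print(opponent_channels(img)[0, 0])
```

In the SPCNN-RBOR pipeline as described, channels like these would be fed to the SPCNN, and pixels firing synchronously across all transformed channels would be grouped into regions.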
Files in This Item:
There are no files associated with this item.


