Full metadata record

DC Field                                   Value
dc.contributor.author                      Kim, Minsu
dc.contributor.author                      Kim, Giseop
dc.contributor.author                      Choi, Sunwook
dc.date.accessioned                        2024-11-07T02:00:14Z
dc.date.available                          2024-11-07T02:00:14Z
dc.date.created                            2024-11-06
dc.date.issued                             2024-11
dc.identifier.issn                         2377-3766
dc.identifier.uri                          https://pubs.kist.re.kr/handle/201004/150986
dc.description.abstract                    Recent advancements in Bird's Eye View (BEV) fusion for map construction have demonstrated remarkable mapping of urban environments. However, their deep and bulky architectures incur substantial backpropagation memory and computing latency. This poses an unavoidable bottleneck for constructing high-resolution (HR) BEV maps, because the large feature maps sharply increase costs such as GPU memory consumption and computing latency; we call this the diverging training costs issue. Affected by this problem, most existing methods adopt low-resolution (LR) BEV grids and struggle to estimate the precise locations of urban scene components such as road lanes and sidewalks. Because this imprecision makes motion planning, e.g., collision avoidance, risky, the diverging training costs issue must be resolved. In this letter, we address the issue with our novel BEVRestore mechanism. Specifically, our model encodes the features of each sensor into LR BEV space and restores them to HR space, establishing a memory-efficient map constructor. To this end, we introduce a BEV restoration strategy that removes the aliasing and blocky artifacts of the up-scaled BEV features and narrows the width of the labels. Our extensive experiments show that the proposed mechanism provides a plug-and-play, memory-efficient pipeline, enabling HR map construction with a broad BEV scope. Our code will be publicly released.
dc.language                                English
dc.publisher                               Institute of Electrical and Electronics Engineers Inc.
dc.title                                   Addressing Diverging Training Costs Using BEVRestore for High-Resolution Bird's Eye View Map Construction
dc.type                                    Article
dc.identifier.doi                          10.1109/LRA.2024.3474477
dc.description.journalClass                1
dc.identifier.bibliographicCitation        IEEE Robotics and Automation Letters, v.9, no.11, pp.10700 - 10707
dc.citation.title                          IEEE Robotics and Automation Letters
dc.citation.volume                         9
dc.citation.number                         11
dc.citation.startPage                      10700
dc.citation.endPage                        10707
dc.description.isOpenAccess               N
dc.description.journalRegisteredClass      scie
dc.description.journalRegisteredClass      scopus
dc.identifier.wosid                        001339096800011
dc.identifier.scopusid                     2-s2.0-85207138888
dc.relation.journalWebOfScienceCategory    Robotics
dc.relation.journalResearchArea            Robotics
dc.type.docType                            Article
dc.subject.keywordAuthor                   Mapping
dc.subject.keywordAuthor                   range sensing
dc.subject.keywordAuthor                   sensor fusion
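
The abstract above describes encoding fused sensor features on a low-resolution (LR) BEV grid and then restoring them to high-resolution (HR) space, so the heavy computation stays on the small grid. As a rough illustration of that idea only, here is a minimal PyTorch sketch; `LRToHRRestorer`, its layer sizes, and the bilinear-upsample-plus-convolution restoration head are assumptions for illustration, not the published BEVRestore architecture.

```python
import torch
import torch.nn as nn


class LRToHRRestorer(nn.Module):
    """Illustrative sketch: encode on a low-resolution BEV grid, then
    up-scale and refine to a high-resolution grid. All layer choices
    here are assumptions, not the paper's architecture."""

    def __init__(self, channels: int = 256, scale: int = 4):
        super().__init__()
        # Heavy work happens here, on the small LR grid, which is what
        # keeps backpropagation memory and latency bounded.
        self.lr_encoder = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Plain interpolation to the HR grid; on its own it would leave
        # the aliasing and blocky artifacts the abstract mentions.
        self.upsample = nn.Upsample(scale_factor=scale, mode="bilinear",
                                    align_corners=False)
        # Light restoration head that refines the up-scaled features so
        # thin map elements (lanes, sidewalks) localize sharply.
        self.restore = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, lr_bev: torch.Tensor) -> torch.Tensor:
        x = self.lr_encoder(lr_bev)   # (B, C, H, W) on the LR grid
        x = self.upsample(x)          # (B, C, s*H, s*W) on the HR grid
        return self.restore(x)        # refined HR BEV features


if __name__ == "__main__":
    fused = torch.randn(1, 256, 50, 50)   # fused LR BEV features
    hr = LRToHRRestorer()(fused)
    print(hr.shape)                        # torch.Size([1, 256, 200, 200])
```

The point of this pipeline shape is the cost asymmetry: every convolution before the up-sampling step sees a grid that is scale-squared times smaller than the HR output, so most of the training memory scales with the LR size rather than the HR size.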
Appears in Collections:
KIST Article > 2024
Files in This Item:
There are no files associated with this item.
