Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | 류제경 | - |
| dc.contributor.author | 김지엽 | - |
| dc.contributor.author | 박찬욱 | - |
| dc.contributor.author | 김해윤 | - |
| dc.contributor.author | 이득희 | - |
| dc.contributor.author | 한경원 | - |
| dc.date.accessioned | 2026-03-04T07:00:12Z | - |
| dc.date.available | 2026-03-04T07:00:12Z | - |
| dc.date.created | 2026-02-12 | - |
| dc.date.issued | 2026-02-05 | - |
| dc.identifier.issn | 1975-6291 | - |
| dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/154396 | - |
| dc.description.abstract | Microneedles offer promising capabilities not only for minimally invasive drug delivery but also as effective bio-tissue grippers. However, achieving strong tissue fixation while minimizing tissue damage during insertion remains a significant challenge. In this study, we propose a novel microneedle geometry optimized for 3D printing, designed to maximize the Pull-Out-to-Penetration Ratio through a machine-learning-based optimization framework combined with finite element analysis. Experimental results show that the optimized geometry achieves a six-fold improvement in the objective metric relative to conventional conical designs, demonstrating enhanced tissue fixation while simultaneously reducing insertion-induced damage. This approach highlights the potential for customizable, low-pain microneedle designs across a broad range of biomedical applications. | - |
| dc.language | Korean | - |
| dc.publisher | 한국로봇학회 (Korea Robotics Society) | - |
| dc.title | Development of a Machine-Learning-Driven Microneedle Design Methodology for Biological Tissue Grippers | - |
| dc.type | Conference | - |
| dc.description.journalClass | 2 | - |
| dc.identifier.bibliographicCitation | 제21회 한국로봇 종합학술대회 (21st Korea Robotics Society Annual Conference) | - |
| dc.citation.title | 제21회 한국로봇 종합학술대회 (21st Korea Robotics Society Annual Conference) | - |
| dc.citation.conferencePlace | KO | - |
| dc.citation.conferencePlace | 알펜시아 컨벤션센터 (Alpensia Convention Center) | - |
| dc.citation.conferenceDate | 2026-02-04 | - |
| dc.relation.isPartOf | 제21회 한국로봇 종합학술대회 논문집 (Proceedings of the 21st Korea Robotics Society Annual Conference) | - |
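The abstract describes an optimization loop that searches microneedle geometries to maximize the Pull-Out-to-Penetration Ratio (POP), with finite element analysis supplying the objective values. The record does not specify the paper's actual machine-learning algorithm, geometry parameters, or bounds, so the sketch below is purely illustrative: a random-search baseline over three hypothetical geometry parameters, with a toy analytic function standing in for the FEA evaluation.

```python
import random

def simulated_pop_ratio(tip_angle_deg, shaft_radius_mm, barb_depth_mm):
    """Toy stand-in for an FEA evaluation of the Pull-Out-to-Penetration
    Ratio (pull-out force / penetration force). In the paper's framework
    this step would be a finite element simulation of tissue insertion."""
    # Assumed trends: sharper/thinner needles penetrate more easily;
    # moderate barbing increases pull-out resistance.
    penetration_force = 0.5 + 0.02 * tip_angle_deg + 2.0 * shaft_radius_mm
    pull_out_force = 1.0 + 8.0 * barb_depth_mm * (1.0 - barb_depth_mm)
    return pull_out_force / penetration_force

def optimize_geometry(n_iter=2000, seed=0):
    """Random-search baseline over the (hypothetical) design space.
    The paper uses a machine-learning-based optimizer; this loop only
    illustrates the evaluate-and-keep-best structure of such a search."""
    rng = random.Random(seed)
    best_params, best_pop = None, float("-inf")
    for _ in range(n_iter):
        params = (
            rng.uniform(10.0, 60.0),   # tip half-angle (deg), assumed bounds
            rng.uniform(0.05, 0.30),   # shaft radius (mm), assumed bounds
            rng.uniform(0.0, 1.0),     # barb depth (mm), assumed bounds
        )
        pop = simulated_pop_ratio(*params)
        if pop > best_pop:
            best_params, best_pop = params, pop
    return best_params, best_pop

if __name__ == "__main__":
    params, pop = optimize_geometry()
    # Compare against a plain conical needle (no barb) as a baseline,
    # mirroring the paper's comparison to conventional conical designs.
    baseline = simulated_pop_ratio(30.0, 0.15, 0.0)
    print(f"optimized POP: {pop:.2f}  vs conical baseline: {baseline:.2f}")
```

All parameter names, bounds, and the objective surrogate are assumptions for illustration; the study's reported six-fold improvement comes from its own FEA-driven, machine-learning-based framework, not from this baseline.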
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.