Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jung, Dae-Hyun | - |
dc.contributor.author | Kim, Cheoul Young | - |
dc.contributor.author | Lee, Taek Sung | - |
dc.contributor.author | Park, Soo Hyun | - |
dc.date.accessioned | 2024-01-19T12:00:54Z | - |
dc.date.available | 2024-01-19T12:00:54Z | - |
dc.date.created | 2022-06-30 | - |
dc.date.issued | 2022-06 | - |
dc.identifier.issn | 1746-4811 | - |
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/115131 | - |
dc.description.abstract | Background: On tomato plants, the flowering truss is a cluster of smaller stems where flowers and fruit develop, while the growing truss is the most extended part of the stem. Because the state of the growing truss reacts sensitively to the surrounding environment, it is essential to control its growth in the early stages. With the recent development of information and artificial intelligence technology in agriculture, a previous study developed a robot-based method for real-time image acquisition and evaluation. We therefore used image processing to locate the growing truss and extract growth information. Among the available vision algorithms, CycleGAN was used to generate and translate unpaired images using generated training images. In this study, we developed a robot-based system for simultaneously acquiring RGB and depth images of the growing truss of the tomato plant. Results: Segmentation performance on approximately 35 samples was compared using false-negative (FN) and false-positive (FP) rates. For the depth-camera images, the FN and FP values were 17.55 ± 3.01% and 17.76 ± 3.55%, respectively; for the CycleGAN algorithm, they were 19.24 ± 1.45% and 18.24 ± 1.54%, respectively. When segmentation was performed on the depth images and on the CycleGAN-converted images, the mean intersection over union (mIoU) was 63.56 ± 8.44% and 69.25 ± 4.42%, respectively, indicating that the CycleGAN algorithm can identify the desired growing truss of the tomato plant with high precision (a sketch of how such metrics can be computed appears after the record below). Conclusions: The on-site feasibility of the CycleGAN-based image extraction technique was confirmed by driving the image-scanning robot in a straight line through a tomato greenhouse. In the future, the proposed approach is expected to support vision-based scanning of tomato growth indicators in greenhouses using an unmanned robot platform. | - |
dc.language | English | - |
dc.publisher | BioMed Central | - |
dc.title | Depth image conversion model based on CycleGAN for growing tomato truss identification | - |
dc.type | Article | - |
dc.identifier.doi | 10.1186/s13007-022-00911-0 | - |
dc.description.journalClass | 1 | - |
dc.identifier.bibliographicCitation | Plant Methods, v.18, no.1 | - |
dc.citation.title | Plant Methods | - |
dc.citation.volume | 18 | - |
dc.citation.number | 1 | - |
dc.description.isOpenAccess | Y | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.identifier.wosid | 000812520800001 | - |
dc.relation.journalWebOfScienceCategory | Biochemical Research Methods | - |
dc.relation.journalWebOfScienceCategory | Plant Sciences | - |
dc.relation.journalResearchArea | Biochemistry & Molecular Biology | - |
dc.relation.journalResearchArea | Plant Sciences | - |
dc.type.docType | Article | - |
dc.subject.keywordAuthor | Generative adversarial networks | - |
dc.subject.keywordAuthor | Convolutional neural network | - |
dc.subject.keywordAuthor | Robot platform | - |
dc.subject.keywordAuthor | Deep learning | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
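The abstract evaluates depth-image and CycleGAN-based segmentation using FN, FP, and mIoU values. The sketch below is a minimal illustration of how such metrics can be computed from binary masks; it is not taken from the paper, and the normalisation of the FN/FP rates (here relative to the ground-truth truss area) as well as the `segmentation_metrics` helper and the toy masks are assumptions for illustration only.

```python
import numpy as np


def segmentation_metrics(pred, truth):
    """Return (FN rate, FP rate, IoU) for a pair of binary segmentation masks.

    pred, truth: arrays of the same shape; truthy values mark truss pixels.
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)

    tp = np.logical_and(pred, truth).sum()      # correctly detected truss pixels
    fp = np.logical_and(pred, ~truth).sum()     # background labelled as truss
    fn = np.logical_and(~pred, truth).sum()     # truss pixels that were missed

    # Assumption: rates are normalised by the ground-truth truss area.
    truss_area = tp + fn
    fn_rate = fn / truss_area if truss_area else 0.0
    fp_rate = fp / truss_area if truss_area else 0.0
    union = tp + fp + fn
    iou = tp / union if union else 0.0
    return fn_rate, fp_rate, iou


if __name__ == "__main__":
    # Toy example: mIoU over a sample set is the mean of per-image IoU values.
    rng = np.random.default_rng(0)
    ious = []
    for _ in range(5):
        truth = rng.random((64, 64)) > 0.7                       # synthetic ground truth
        pred = np.logical_xor(truth, rng.random((64, 64)) > 0.9)  # noisy prediction
        fn_r, fp_r, iou = segmentation_metrics(pred, truth)
        ious.append(iou)
    print(f"mIoU over {len(ious)} samples: {np.mean(ious):.3f}")
```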