Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lee, Jae-Hun | - |
dc.contributor.author | Kim, Jae-Yoon | - |
dc.contributor.author | Ryu, Kanghyun | - |
dc.contributor.author | Al-masni, Mohammed A. | - |
dc.contributor.author | Kim, Tae Hyung | - |
dc.contributor.author | Han, Dongyeob | - |
dc.contributor.author | Kim, Hyun Gi | - |
dc.contributor.author | Kim, Dong-Hyun | - |
dc.date.accessioned | 2024-03-07T02:30:14Z | - |
dc.date.available | 2024-03-07T02:30:14Z | - |
dc.date.created | 2024-03-07 | - |
dc.date.issued | 2024-06 | - |
dc.identifier.issn | 0740-3194 | - |
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/149399 | - |
dc.description.abstract | Purpose: We introduced a novel reconstruction network, the jointly unrolled cross-domain optimization-based spatio-temporal reconstruction network (JUST-Net), aimed at accelerating 3D multi-echo gradient-echo (mGRE) data acquisition and improving the quality of the resulting myelin water imaging (MWI) maps. Methods: An unrolled cross-domain spatio-temporal reconstruction network was designed. The main idea is to combine frequency-domain and spatio-temporal image feature representations and to sequentially apply convolution layers in both domains. The k-space subnetwork utilizes shared information from adjacent frames, whereas the image subnetwork applies separate convolutions in the spatial and temporal dimensions. The proposed reconstruction network was evaluated for both retrospectively and prospectively accelerated acquisitions. Furthermore, it was assessed in simulation studies and real-world cases with k-space corruptions to evaluate its potential for motion artifact reduction. Results: The proposed JUST-Net enabled highly reproducible and accelerated 3D mGRE acquisition for whole-brain MWI, reducing the acquisition time from 15:23 min (fully sampled) to 2:22 min, with a reconstruction time of under 3 min. The normalized root mean squared error of the reconstructed mGRE images increased by less than 4.0%, and the correlation coefficients for MWI exceeded 0.68 when compared with the fully sampled reference. Additionally, the proposed method demonstrated a mitigating effect in both simulated and clinical motion-corrupted cases. Conclusion: The proposed JUST-Net demonstrated the capability to achieve high acceleration factors for 3D mGRE-based MWI, which is expected to facilitate widespread clinical application of MWI. | - |
dc.language | English | - |
dc.publisher | John Wiley & Sons Inc. | - |
dc.title | JUST-Net: Jointly unrolled cross-domain optimization based spatio-temporal reconstruction network for accelerated 3D myelin water imaging | - |
dc.type | Article | - |
dc.identifier.doi | 10.1002/mrm.30021 | - |
dc.description.journalClass | 1 | - |
dc.identifier.bibliographicCitation | Magnetic Resonance in Medicine, v.91, no.6, pp.2483 - 2497 | - |
dc.citation.title | Magnetic Resonance in Medicine | - |
dc.citation.volume | 91 | - |
dc.citation.number | 6 | - |
dc.citation.startPage | 2483 | - |
dc.citation.endPage | 2497 | - |
dc.description.isOpenAccess | Y | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.identifier.wosid | 001161079700001 | - |
dc.identifier.scopusid | 2-s2.0-85185149555 | - |
dc.relation.journalWebOfScienceCategory | Radiology, Nuclear Medicine & Medical Imaging | - |
dc.relation.journalResearchArea | Radiology, Nuclear Medicine & Medical Imaging | - |
dc.type.docType | Article | - |
dc.subject.keywordPlus | FRACTION | - |
dc.subject.keywordPlus | BRAIN | - |
dc.subject.keywordPlus | MODEL | - |
dc.subject.keywordPlus | SENSE | - |
dc.subject.keywordAuthor | accelerating 3D multi-echo gradient-echo | - |
dc.subject.keywordAuthor | deep learning | - |
dc.subject.keywordAuthor | myelin water imaging | - |
dc.subject.keywordAuthor | prospectively accelerated acquisition | - |
dc.subject.keywordAuthor | reproducibility test | - |
dc.subject.keywordAuthor | spatio-temporal reconstruction | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.