Full metadata record

DC Field | Value | Language
dc.contributor.author | Rueckert, Tobias | -
dc.contributor.author | Rauber, David | -
dc.contributor.author | Maerkl, Raphaela | -
dc.contributor.author | Klausmann, Leonard | -
dc.contributor.author | Yildiran, Suemeyye R. | -
dc.contributor.author | Gutbrod, Max | -
dc.contributor.author | Nunes, Danilo Weber | -
dc.contributor.author | Moreno, Alvaro Fernandez | -
dc.contributor.author | Luengo, Imanol | -
dc.contributor.author | Stoyanov, Danail | -
dc.contributor.author | Toussaint, Nicolas | -
dc.contributor.author | Cho, Enki | -
dc.contributor.author | Kim, Hyeon Bae | -
dc.contributor.author | Choo, Oh Sung | -
dc.contributor.author | Kim, Ka Young | -
dc.contributor.author | Kim, Seong Tae | -
dc.contributor.author | Arantes, Goncalo | -
dc.contributor.author | Song, Kehan | -
dc.contributor.author | Zhu, Jianjun | -
dc.contributor.author | Xiong, Junchen | -
dc.contributor.author | Lin, Tingyi | -
dc.contributor.author | Kikuchik, Shunsuke | -
dc.contributor.author | Matsuzakik, Hiroki | -
dc.contributor.author | Kouno, Atsushi | -
dc.contributor.author | Manesco, Joao Renato Ribeiro | -
dc.contributor.author | Papa, Joao Paulo | -
dc.contributor.author | Choi, Tae-Min | -
dc.contributor.author | Jeong, Tae Kyeong | -
dc.contributor.author | Park, Juyoun | -
dc.contributor.author | Alabi, Oluwatosin | -
dc.contributor.author | Wei, Meng | -
dc.contributor.author | Vercauteren, Tom | -
dc.contributor.author | Wu, Runzhi | -
dc.contributor.author | Xu, Mengya | -
dc.contributor.author | Wang, An | -
dc.contributor.author | Bai, Long | -
dc.contributor.author | Ren, Hongliang | -
dc.contributor.author | Yamlahip, Amine | -
dc.contributor.author | Hennighausen, Jakob | -
dc.contributor.author | Maier-Hein, Lena | -
dc.contributor.author | Kondo, Satoshi | -
dc.contributor.author | Kasai, Satoshi | -
dc.contributor.author | Hirasawa, Kousuke | -
dc.contributor.author | Yang, Shu | -
dc.contributor.author | Wang, Yihui | -
dc.contributor.author | Chen, Hao | -
dc.contributor.author | Rodriguez, Santiago | -
dc.contributor.author | Aparicio, Nicolas | -
dc.contributor.author | Manrique, Leonardo | -
dc.contributor.author | Lyons, Juan Camilo | -
dc.contributor.author | Hosie, Olivia | -
dc.contributor.author | Ayobi, Nicolas | -
dc.contributor.author | Arbelaez, Pablo | -
dc.contributor.author | Li, Yiping | -
dc.contributor.author | Al Khalil, Yasmina | -
dc.contributor.author | Nasirihaghighi, Sahar | -
dc.contributor.author | Speidel, Stefanie | -
dc.contributor.author | Rueckert, Daniel | -
dc.contributor.author | Feussner, Hubertus | -
dc.contributor.author | Wilhelm, Dirk | -
dc.contributor.author | Palm, Christoph | -
dc.date.accessioned | 2026-03-27T07:00:24Z | -
dc.date.available | 2026-03-27T07:00:24Z | -
dc.date.created | 2026-03-24 | -
dc.date.issued | 2026-03 | -
dc.identifier.issn | 1361-8415 | -
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/154511 | -
dc.description.abstract | Reliable recognition and localization of surgical instruments in endoscopic video recordings are foundational for a wide range of applications in computer- and robot-assisted minimally invasive surgery (RAMIS), including surgical training, skill assessment, and autonomous assistance. However, robust performance under real-world conditions remains a significant challenge. Incorporating surgical context – such as the current procedural phase – has emerged as a promising strategy to improve robustness and interpretability. To address these challenges, we organized the Surgical Procedure Phase, Keypoint, and Instrument Recognition (PhaKIR) sub-challenge as part of the Endoscopic Vision (EndoVis) challenge at MICCAI 2024. We introduced a novel, multi-center dataset comprising thirteen full-length laparoscopic cholecystectomy videos collected from three distinct medical institutions, with unified annotations for three interrelated tasks: surgical phase recognition, instrument keypoint estimation, and instrument instance segmentation. Unlike existing datasets, ours enables joint investigation of instrument localization and procedural context within the same data while supporting the integration of temporal information across entire procedures. We report results and findings in accordance with the BIAS guidelines for biomedical image analysis challenges. The PhaKIR sub-challenge advances the field by providing a unique benchmark for developing temporally aware, context-driven methods in RAMIS and offers a high-quality resource to support future research in surgical scene understanding. | -
dc.language | English | -
dc.publisher | Elsevier BV | -
dc.title | Comparative validation of surgical phase recognition, instrument keypoint estimation, and instrument instance segmentation in endoscopy: Results of the PhaKIR 2024 challenge | -
dc.type | Article | -
dc.identifier.doi | 10.1016/j.media.2026.103945 | -
dc.description.journalClass | 1 | -
dc.identifier.bibliographicCitation | Medical Image Analysis, v.109 | -
dc.citation.title | Medical Image Analysis | -
dc.citation.volume | 109 | -
dc.description.isOpenAccess | Y | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
dc.identifier.wosid | 001692802700001 | -
dc.identifier.scopusid | 2-s2.0-105027941376 | -
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | -
dc.relation.journalWebOfScienceCategory | Computer Science, Interdisciplinary Applications | -
dc.relation.journalWebOfScienceCategory | Engineering, Biomedical | -
dc.relation.journalWebOfScienceCategory | Radiology, Nuclear Medicine & Medical Imaging | -
dc.relation.journalResearchArea | Computer Science | -
dc.relation.journalResearchArea | Engineering | -
dc.relation.journalResearchArea | Radiology, Nuclear Medicine & Medical Imaging | -
dc.type.docType | Article | -
dc.subject.keywordPlus | AUGMENTED REALITY | -
dc.subject.keywordAuthor | Instrument keypoint estimation | -
dc.subject.keywordAuthor | Instrument instance segmentation | -
dc.subject.keywordAuthor | Robot-assisted surgery | -
dc.subject.keywordAuthor | Surgical phase recognition | -
Appears in Collections:
KIST Article > 2026