Background and Objective: Deep anterior lamellar keratoplasty (DALK) is a widely used treatment for corneal diseases and requires accurate, evenly spaced stitch positions during suturing. Augmented Reality (AR) navigation systems show promising potential for enhancing the stitching process, and a clear, unoccluded view of the corneal region helps surgeons better plan stitch positions.
Methods: In this work, we present a joint-learning, iterative network for AR-based suturing navigation. The network improves inpainting performance under severe occlusion during suturing, and it provides both original instrument masks and inpainted corneal masks along with the inpainted frames. It is built on feature reuse, iterative modules, and mask-propagation structures to greatly reduce computational cost. To enable end-to-end training, we also propose a novel dataset-synthesis method that constructs a dataset of occluded and unoccluded image pairs together with mask and optical-flow annotations. In addition, we develop a novel pipeline based on grid propagation and the inpainted optical-flow outputs to produce clear and stable inpainted frames.
Results: On the synthetic datasets, our framework reaches a better trade-off between performance and computational efficiency than recent state-of-the-art inpainting networks. Our Iter-S model achieves a mean endpoint error (mEPE) of 1.69, a peak signal-to-noise ratio (PSNR) of 36.86, and a structural similarity index measure (SSIM) of 0.976, with a low inpainting inference time of 16.26 ms. Built on Iter-S, our novel AR navigation system runs at an average of 35.14 ms per frame (about 28 FPS).
Conclusions: The iterative modules progressively refine the outputs and, through the choice of iteration count, offer a favorable trade-off between visual performance and real-time computational efficiency. Our AR navigation framework provides stable and accurate tracking with well-inpainted results in real time under severe occlusion, demonstrating its potential for guiding surgeons' stitching operations in corneal surgeries.
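For reference, the mEPE and PSNR figures quoted in the Results are standard flow-accuracy and image-quality metrics. The sketch below shows how they are conventionally computed; the function names are illustrative and not taken from the paper, and SSIM is omitted for brevity.

```python
import numpy as np

def psnr(ref, pred, max_val=255.0):
    """Peak signal-to-noise ratio between a reference and a predicted image."""
    mse = np.mean((ref.astype(np.float64) - pred.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def mean_epe(flow_gt, flow_pred):
    """Mean endpoint error: average Euclidean distance between ground-truth
    and predicted flow vectors. Inputs have shape (H, W, 2) holding per-pixel
    (dx, dy) displacements."""
    return float(np.mean(np.linalg.norm(flow_gt - flow_pred, axis=-1)))

# Toy check: a uniform +10 intensity offset on an 8-bit image
ref = np.zeros((64, 64), dtype=np.uint8)
pred = np.full((64, 64), 10, dtype=np.uint8)
print(round(psnr(ref, pred), 2))  # 20*log10(255/10) ≈ 28.13

# Toy check: predicted flow off by (3, 4) px everywhere → EPE = 5
gt = np.zeros((64, 64, 2))
print(mean_epe(gt, gt + np.array([3.0, 4.0])))  # 5.0
```

Lower mEPE means more accurate inpainted optical flow; higher PSNR and SSIM mean the inpainted frames are closer to the unoccluded ground truth.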
Funding:
National Natural Science Foundation of China [62272017, 62172437]; Beijing Natural Science Foundation, China [L232065, L232135]; Beijing Municipal Science & Technology Commission, China [Z221100007422005]
Language:
English
WOS:
PubmedID:
CAS Journal Ranking:
Publication-year [2025] edition:
Major category: Medicine (Tier 2)
Subcategories: Computer Science, Interdisciplinary Applications (Tier 2); Computer Science, Theory & Methods (Tier 2); Engineering, Biomedical (Tier 2); Medicine, Informatics (Tier 3)
Latest [2025] edition:
Major category: Medicine (Tier 2)
Subcategories: Computer Science, Interdisciplinary Applications (Tier 2); Computer Science, Theory & Methods (Tier 2); Engineering, Biomedical (Tier 2); Medicine, Informatics (Tier 3)
JCR Quartiles:
Publication-year [2023] edition:
Q1 Computer Science, Interdisciplinary Applications; Q1 Computer Science, Theory & Methods; Q1 Engineering, Biomedical; Q1 Medical Informatics
Latest [2024] edition:
Q1 Computer Science, Theory & Methods; Q1 Medical Informatics; Q2 Computer Science, Interdisciplinary Applications; Q2 Engineering, Biomedical
First author's affiliation: [1] Beihang Univ, Beijing Adv Innovat Ctr Biomed Engn, State Key Lab Virtual Real Technol & Syst, Beijing 100191, Peoples R China
Corresponding author:
Recommended citation (GB/T 7714):
Liu Weimin, Pan Junjun, Jia Liyun, et al. Real-time tracking and inpainting network with joint learning iterative modules for AR-based DALK surgical navigation[J]. COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2025, 272. doi:10.1016/j.cmpb.2025.109068.
APA:
Liu, W., Pan, J., Jia, L., Rao, S., & Zang, J. (2025). Real-time tracking and inpainting network with joint learning iterative modules for AR-based DALK surgical navigation. Computer Methods and Programs in Biomedicine, 272.
MLA:
Liu, Weimin, et al. "Real-time tracking and inpainting network with joint learning iterative modules for AR-based DALK surgical navigation." Computer Methods and Programs in Biomedicine, vol. 272, 2025.