
BP-Net: Boundary and perfusion feature guided dual-modality ultrasound video analysis network for fibrous cap integrity assessment

Record Details

Resource type:
WOS category:
PubMed category:

Indexed in: SCIE

Affiliations: [1]School of Information Science and Technology, Fudan University, Shanghai, China. [2]Department of Ultrasound, Tongren Hospital, Shanghai Jiao Tong University, Shanghai, China.
Source:
ISSN:

Keywords: Carotid plaque; Fibrous cap integrity; Ultrasound video classification; Transformer; Deep learning

Abstract:
Ultrasonography is one of the main imaging methods for monitoring and diagnosing atherosclerosis due to its non-invasiveness and low cost. Automatic differentiation of carotid plaque fibrous cap integrity from multi-modal ultrasound videos has significant diagnostic and prognostic value for patients with cardiovascular and cerebrovascular disease. However, the task faces several challenges, including high variation in plaque location and shape, the absence of an analysis mechanism focused on the fibrous cap, and the lack of an effective mechanism to capture the relevance among multi-modal data for feature fusion and selection. To overcome these challenges, we propose a new target boundary and perfusion feature guided video analysis network (BP-Net) based on conventional B-mode ultrasound and contrast-enhanced ultrasound videos for assessing the integrity of the fibrous cap. Building on our previously proposed plaque auto-tracking network, BP-Net further introduces a plaque edge attention module and a reverse mechanism to focus the dual-video analysis on the fibrous cap of plaques. Moreover, to fully exploit the rich information on the fibrous cap and inside/outside the plaque, we propose a feature fusion module for the B-mode and contrast-enhanced videos that filters out the most valuable features for fibrous cap integrity assessment. Finally, multi-head convolution attention is proposed and embedded into a transformer-based network, which captures semantic features and global context information to obtain an accurate evaluation of fibrous cap integrity. The experimental results demonstrate that the proposed method has high accuracy and generalizability, with an accuracy of 92.35% and an AUC of 0.935, outperforming state-of-the-art deep-learning-based methods. A series of comprehensive ablation studies confirms the effectiveness of each proposed component and shows great potential for clinical application. Copyright © 2023. Published by Elsevier Ltd.
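To illustrate the "multi-head convolution attention" idea mentioned in the abstract, the sketch below shows a minimal PyTorch module under stated assumptions: queries, keys, and values are produced by a depth-wise convolution over a frame-level feature map (a hypothetical design choice, not the authors' published implementation), and standard multi-head scaled dot-product attention then mixes local semantic features with global context. All class, parameter, and variable names here are illustrative only.

# Minimal sketch of a multi-head convolutional attention block (assumed design).
import torch
import torch.nn as nn

class MultiHeadConvAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, kernel_size: int = 3):
        super().__init__()
        assert dim % num_heads == 0, "dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        padding = kernel_size // 2
        # Depth-wise convolution injects local spatial context into Q/K/V
        # (hypothetical choice; BP-Net's actual projections may differ).
        self.to_qkv = nn.Conv2d(dim, dim * 3, kernel_size,
                                padding=padding, groups=dim)
        self.proj = nn.Conv2d(dim, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) features from one video frame
        b, c, h, w = x.shape
        q, k, v = self.to_qkv(x).chunk(3, dim=1)

        def split_heads(t):
            # (b, c, h, w) -> (b, heads, tokens, head_dim)
            return t.reshape(b, self.num_heads, self.head_dim, h * w).transpose(-2, -1)

        q, k, v = map(split_heads, (q, k, v))
        attn = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
        attn = attn.softmax(dim=-1)          # global attention over all positions
        out = attn @ v                        # (b, heads, tokens, head_dim)
        out = out.transpose(-2, -1).reshape(b, c, h, w)
        return self.proj(out)

if __name__ == "__main__":
    block = MultiHeadConvAttention(dim=64, num_heads=4)
    frame_features = torch.randn(2, 64, 28, 28)   # dummy B-mode/CEUS feature maps
    print(block(frame_features).shape)             # torch.Size([2, 64, 28, 28])

In a dual-modality setting such as the one the abstract describes, a block like this could be applied to fused B-mode and contrast-enhanced features before a transformer-style classification head; the fusion strategy itself is not specified in the abstract and is not assumed here.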

Funding:
Language:
Times cited:
WOS:
PubMed ID:
CAS (Chinese Academy of Sciences) journal ranking:
Year of publication [2022] edition:
Major category | Tier 2 Engineering & Technology
Subcategories | Tier 2 Engineering: Biomedical; Tier 2 Nuclear Medicine
Latest [2023] edition:
Major category | Tier 2 Medicine
Subcategories | Tier 2 Engineering: Biomedical; Tier 2 Nuclear Medicine
JCR quartile:
Year of publication [2021] edition:
Q1 ENGINEERING, BIOMEDICAL Q1 RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING
Latest [2023] edition:
Q1 ENGINEERING, BIOMEDICAL Q1 RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING

Impact factor: Latest [2023 edition] | Latest 5-year average | At publication [2021 edition] | 5-year average at publication | Year before publication [2020 edition] | Year after publication [2022 edition]

First author:
First author's affiliation: [1]School of Information Science and Technology, Fudan University, Shanghai, China.
Co-first authors:
Corresponding author:
Recommended citation format (GB/T 7714):
APA:
MLA:

