
Domain knowledge-guided adversarial adaptive fusion of hybrid breast ultrasound data

Literature Details

Resource type:
PubMed system:
Affiliations: [1]School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu, 610031, Sichuan, China [2]Tangshan Research Institute, Southwest Jiaotong University, Tangshan, 063002, Hebei, China [3]Third People's Hospital of Chengdu, Affiliated Hospital of Southwest Jiaotong University, Chengdu, 610031, Sichuan, China [4]Engineering Research Center of Sustainable Urban Intelligent Transportation, Ministry of Education, China [5]Manufacturing Industry Chains Collaboration and Information Support Technology Key Laboratory of Sichuan Province, Chengdu, 610031, Sichuan, China
Source:
ISSN:

Keywords: Breast cancer diagnosis; Contrast-enhanced ultrasound; B-mode ultrasound; Time intensity curve; Hybrid modal adaptive fusion

Abstract:
Contrast-enhanced ultrasound (CEUS), which provides more detailed microvascular information about the tumor, is routinely used by radiologists in clinical diagnosis alongside B-mode ultrasound (B-mode US). However, automatically analyzing breast CEUS is challenging because CEUS videos differ from natural videos (e.g., sports or action videos): a CEUS video contains no positional displacement. Additionally, most existing methods rarely use the Time Intensity Curve (TIC) information of CEUS or non-imaging clinical (NIC) data. To address these issues, we propose a novel breast cancer diagnosis framework that learns the complementarity and correlation across hybrid modal data, including CEUS, B-mode US, and NIC data, through an adversarial adaptive fusion method. Furthermore, to fully exploit the CEUS information, the proposed method, inspired by radiologists' clinical workflow, first extracts the TIC parameters of CEUS. Then, we select a clip from the CEUS video using a frame screening strategy and finally extract spatio-temporal features from these clips through a critical frame attention network. To our knowledge, this is the first AI system to use TIC parameters, NIC data, and ultrasound imaging in diagnosis. We validated our method on a dataset collected from 554 patients. The experimental results demonstrate the excellent performance of the proposed method: it achieves an accuracy of 87.73%, which is nearly 5% higher than that of uni-modal approaches. Copyright © 2023 Elsevier Ltd. All rights reserved.
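To illustrate the kind of TIC parameters the abstract refers to, the following is a minimal, hypothetical sketch of extracting a Time Intensity Curve from a CEUS clip and summarizing it with common perfusion descriptors (peak intensity, time to peak, wash-in slope, area under the curve). The ROI, frame rate, and chosen descriptors are illustrative assumptions, not the paper's exact parameterization.

```python
# Hypothetical TIC parameter extraction from a CEUS clip (illustrative only;
# not the authors' implementation).
import numpy as np

def tic_parameters(frames: np.ndarray, roi: tuple, fps: float = 10.0) -> dict:
    """frames: (T, H, W) grayscale CEUS clip; roi: (y0, y1, x0, x1) tumor region."""
    y0, y1, x0, x1 = roi
    # Mean contrast intensity inside the ROI for every frame -> raw TIC.
    tic = frames[:, y0:y1, x0:x1].mean(axis=(1, 2))
    t = np.arange(len(tic)) / fps

    peak_idx = int(np.argmax(tic))
    peak_intensity = float(tic[peak_idx])          # PI: peak intensity
    time_to_peak = float(t[peak_idx])              # TTP: time to peak
    baseline = float(tic[:max(1, peak_idx // 4)].mean())
    # Wash-in slope: average rise rate from baseline to peak.
    wash_in_slope = (peak_intensity - baseline) / max(time_to_peak, 1e-6)
    area_under_curve = float(np.trapz(tic, t))     # AUC over the clip

    return {
        "peak_intensity": peak_intensity,
        "time_to_peak": time_to_peak,
        "wash_in_slope": wash_in_slope,
        "auc": area_under_curve,
    }

# Example usage with a synthetic 60-frame clip and a central ROI.
clip = np.random.rand(60, 128, 128).astype(np.float32)
print(tic_parameters(clip, roi=(32, 96, 32, 96)))
```

In the paper's framework, descriptors of this kind would be one of the hybrid inputs fused with B-mode US imaging features and NIC data.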

Funding:
Language: English
PubMed ID:
CAS (Chinese Academy of Sciences) Journal Ranking:
Publication-year [2023] edition:
Major category | Tier 2: Medicine
Subcategories | Tier 1: Biology; Tier 1: Mathematical & Computational Biology; Tier 2: Computer Science, Interdisciplinary Applications; Tier 2: Engineering, Biomedical
Latest [2023] edition:
Major category | Tier 2: Medicine
Subcategories | Tier 1: Biology; Tier 1: Mathematical & Computational Biology; Tier 2: Computer Science, Interdisciplinary Applications; Tier 2: Engineering, Biomedical
First author:
First author affiliations: [1]School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu, 610031, Sichuan, China [4]Engineering Research Center of Sustainable Urban Intelligent Transportation, Ministry of Education, China [5]Manufacturing Industry Chains Collaboration and Information Support Technology Key Laboratory of Sichuan Province, Chengdu, 610031, Sichuan, China
Corresponding author:
Corresponding author affiliations: [1]School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu, 610031, Sichuan, China [4]Engineering Research Center of Sustainable Urban Intelligent Transportation, Ministry of Education, China [5]Manufacturing Industry Chains Collaboration and Information Support Technology Key Laboratory of Sichuan Province, Chengdu, 610031, Sichuan, China
Recommended citation format (GB/T 7714):
APA:
MLA:
