Advanced Ultrasound in Diagnosis and Therapy ›› 2023, Vol. 7 ›› Issue (2): 91-113.doi: 10.37015/AUDT.2023.230012
• Review Articles •
Changyan Wang, BSa,b, Haobo Chen, MSa,b, Jieyi Liu, BSa,b, Changchun Li, BSa,b, Weiwei Jiao, BSa,b, Qihui Guo, BSa,b, Qi Zhang, PhDa,b,*()
Received: 2023-03-28
Revised: 2023-04-07
Accepted: 2023-04-22
Online: 2023-06-30
Published: 2023-04-27
Contact: Qi Zhang, PhD, E-mail: zhangq@t.shu.edu.cn
Changyan Wang, BS, Haobo Chen, MS, Jieyi Liu, BS, Changchun Li, BS, Weiwei Jiao, BS, Qihui Guo, BS, Qi Zhang, PhD. Deep Learning on Ultrasound Imaging for Breast Cancer Diagnosis and Treatment: Current Applications and Future Perspectives. Advanced Ultrasound in Diagnosis and Therapy, 2023, 7(2): 91-113.
Table 1
Deep learning for despeckling in ultrasound
References | Datasets | Tasks | DL Models | Results |
---|---|---|---|---|
Latif et al., 2020 | 100 benign and 150 malignant | Despeckling and classification of benign and malignant breast tumors | CNN | Denoising improved classification accuracy from 84% to 88% |
Priyanka et al., 2020 | 500 images (kidney and breast) | Despeckling | Residual learning network | Naturalness image quality evaluator = 4.19 |
Feng et al., 2020 | 676 breast, 157 liver and 100 spinal US images | Despeckling | VGGNet | Naturalness image quality evaluator = 8.65 |
Hee et al., 2022 | 7 patients | Despeckling | Wavelet-based GAN | Equivalent number of looks = 190.87; Contrast-to-noise ratio = 31.68 |
Table 2
Deep learning for breast tumor localization in ultrasound
Ultrasound Modalities | References | Datasets | Tasks | DL Models | Results |
---|---|---|---|---|---|
B-mode | Liu et al., 2022 | 1603 cases with 17321 images | Detection | CNN | Sensitivity = 0.797; Specificity = 0.962; F1-score = 0.742 |
B-mode | Wang et al., 2022 | Dataset 1: 163 images (109 benign and 54 malignant); Dataset 2: 780 images (437 benign, 210 malignant and 133 normal); Dataset 3: 562 images | Detection | An anchor-free network | Benign: Precision = 0.917, Recall = 0.980; Malignant: Precision = 0.888, Recall = 0.963 |
B-mode | Meng et al., 2023 | 3759 patients with 7040 images | Detection | Dual global attention neural network | Mean average precision = 0.836 |
ABUS | Wang et al., 2020 | 194 patients with 293 tumors and 70 patients without diseases | Detection | CNN | Sensitivity = 0.91; False positives per slice = 1.92 |
ABUS | Zhang et al., 2021 | 70 ABUS tumor volumes from 124 female patients | Detection | Bayesian YOLO based on Monte Carlo dropout | Sensitivity = 0.88; False positives per slice = 0.19 |
ABUS | Zhang et al., 2022 | 741 cases with 2538 volumes | Detection | YOLO | Detection rate = 0.78 |
ABUS | Malekmohammadi et al., 2023 | 60 volumes from 43 patients | Detection | Bi-ConvLSTM | Sensitivity = 82% |
B-mode | Vakanski et al., 2020 | 510 images | Segmentation | U-Net with attention blocks | DSC = 0.91; JI = 0.84 |
B-mode | Yan et al., 2022 | 316 images (154 malignant and 162 benign) | Segmentation | Attention-enhanced U-Net with hybrid dilated convolution | Mean IoU = 0.82; Accuracy = 0.96; Recall = 0.80 |
B-mode | Zhai et al., 2022 | Dataset 1: 647 cases; Dataset 2: 200 cases; Dataset 3: 320 cases; Dataset 4: 1805 cases | Segmentation | Asymmetric semi-supervised GAN | IoU = 0.68; Accuracy = 0.95; DSC = 0.81 |
B-mode | Punn et al., 2022 | Dataset 1: 780 images (437 benign, 210 malignant and 133 normal); Dataset 2: 562 images | Segmentation | Residual cross-spatial attention guided inception U-Net | DSC = 0.935; Mean IoU = 0.904 |
B-mode | Chen et al., 2022 | 780 images (437 benign, 210 malignant and 133 normal) | Segmentation | Bidirectional aware guidance network | Accuracy = 0.93; JI = 0.60; Precision = 0.76; Recall = 0.77; Specificity = 0.96; DSC = 0.70 |
B-mode | Iqbal et al., 2023 | Dataset 1: 811 images; Dataset 2: 42 images | Detection | Swin Transformer | F1-score = 0.771; JI = 0.597 |
B-mode | Yang et al., 2023 | Dataset 1: 163 images; Dataset 2: 780 images | Segmentation | CSwin-PNet | Dataset 1: IoU = 0.786, DSC = 0.872; Dataset 2: IoU = 0.751, DSC = 0.837 |
B-mode | Chen et al., 2023 | Dataset 1: 133 normal, 437 benign, and 210 malignant; Dataset 2: 110 benign and 53 malignant | Segmentation | Refinement residual convolutional network | JI = 0.72; Precision = 0.79; Recall = 0.83; Specificity = 0.98; DSC = 0.79 |
ABUS | Cao et al., 2020 | 107 patients with 170 volumes | Segmentation | Uncertainty-aware temporal ensembling | JI = 0.64; DSC = 0.74; Hausdorff distance = 3.81 mm |
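The segmentation rows above report overlap metrics — the Dice similarity coefficient (DSC) and the Jaccard index (JI, also reported as IoU). As a reminder of how these are conventionally computed, here is a minimal sketch using toy binary masks (the arrays are illustrative, not taken from any cited study):

```python
import numpy as np

def dice_and_jaccard(pred: np.ndarray, truth: np.ndarray) -> tuple[float, float]:
    """DSC = 2|A∩B| / (|A| + |B|); JI (IoU) = |A∩B| / |A∪B| for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    dsc = 2.0 * inter / (pred.sum() + truth.sum())
    ji = inter / np.logical_or(pred, truth).sum()
    return float(dsc), float(ji)

# Toy 1-D masks: 3 predicted pixels, 3 true pixels, 2 overlapping
pred = np.array([1, 1, 1, 0, 0])
truth = np.array([0, 1, 1, 1, 0])
dsc, ji = dice_and_jaccard(pred, truth)  # DSC = 4/6 ≈ 0.667, JI = 2/4 = 0.5
```

Note that DSC is always at least as large as JI for the same masks (DSC = 2·JI/(1+JI)), which is why the DSC values in the table sit above the corresponding IoU values.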
Table 3
Deep learning for breast cancer diagnosis in ultrasound
Ultrasound Modalities | References | Datasets | Tasks | DL Models | Results |
---|---|---|---|---|---|
B-mode | Han et al., 2017 | 5151 cases with 7408 images (4254 benign and 3154 malignant) | Classification of benign and malignant breast tumors | GoogLeNet | AUC = 0.9; Accuracy = 0.9; Sensitivity = 0.86; Specificity = 0.96 |
B-mode & color Doppler | Shen et al., 2021 | 5,442,907 B-mode and color Doppler images | Identify breast cancer | Deep neural network with attention module | AUC = 0.962 |
Color Doppler | Qi et al., 2019 | 8000 images from 2047 patients | Task 1: Differentiate non-malignant and malignant; Task 2: Recognize solid nodules | CNN with multi-scale kernels and skip connections | Task 1: Accuracy = 0.945, Sensitivity = 0.957, Specificity = 0.939; Task 2: Accuracy = 0.901, Sensitivity = 0.935, Specificity = 0.832 |
B-mode | Cao et al., 2020 | 1976 images (1890 with one lesion, 80 with two lesions, and 6 with more than two lesions) | Classification of benign and malignant breast tumors | Noise filter network | Accuracy = 0.73; Precision = 0.69; Recall = 0.80; F1-score = 0.74 |
B-mode | Cao et al., 2021 | 935 images (473 benign and 462 malignant) | Classification of benign and malignant breast tumors | AlexNet | Accuracy = 75.8%; Precision = 73.0%; Recall = 80.1%; F1-score = 0.764 |
B-mode | Huang et al., 2019 | 531 cases of Category 3, 443 of Category 4A, 376 of Category 4B, 565 of Category 4C, and 323 of Category 5 | Classify breast tumors into five categories (“3”, “4A”, “4B”, “4C”, “5”) | CNN | Accuracy: 0.998 (“3”); 0.940 (“4A”); 0.734 (“4B”); 0.922 (“4C”); 0.876 (“5”) |
B-mode & SWE | Li et al., 2021 | 599 images from 91 patients (64 benign and 27 malignant) | Classification of benign and malignant breast tumors | DenseNet | AUC = 0.866 |
B-mode & CEUS | Yang et al., 2020 | 268 samples (146 malignant and 122 benign) | Identify breast cancer | Temporal sequence dual-branch network | Accuracy = 90.2% |
B-mode & CEUS | Chen et al., 2021 | 221 breast lesions in 217 patients | Identify breast cancer | 3D CNN | Accuracy = 86.3%; Sensitivity = 97.2% |
B-mode & CEUS | Gong et al., 2022 | 1070 tumors (507 malignant and 563 benign) | Classification of benign and malignant breast tumors | ResNet | AUC = 0.925; Accuracy = 0.897 |
B-mode & EUS | Yao et al., 2023 | 4580 cases (2226 malignant and 2354 benign) | Classification of benign and malignant breast tumors | GAN | AUC: 0.751 (real EUS); 0.767 (virtual EUS) |
B-mode | Wang et al., 2018 | 650 US images | Segmentation and classification of benign and malignant breast tumors | Multi-feature guided CNN | Recall = 0.97; F-score = 0.968; Precision = 0.965 |
B-mode | Xie et al., 2018 | 1418 normal and 1182 cancerous samples | Segmentation and diagnosis of breast cancer | ResNet and Mask R-CNN | Precision = 98.72%; Recall = 98.05% |
B-mode | Singh et al., 2019 | 150 malignant and 100 benign tumors | Segmentation and classification of benign and malignant breast tumors | cGAN | Accuracy = 85%; Precision = 81%; Recall = 92% |
B-mode | Luo et al., 2022 | Dataset 1: 292 images (160 benign and 132 malignant); Dataset 2: 1702 images (786 benign and 916 malignant) | Classification of benign and malignant breast tumors | DCNN | Accuracy = 90.78%; Sensitivity = 91.18%; Specificity = 90.44%; F1-score = 91.46%; AUC = 0.9549 |
B-mode | Xiao et al., 2018 | 2058 breast masses from 1422 patients (1370 benign and 688 malignant) | Discriminating benign cysts from malignant masses | Inception, ResNet, and Xception | Accuracy = 89.44%; AUC = 93% |
B-mode | Gheflati et al., 2022 | Dataset 1: 133 normal, 437 malignant, and 210 benign; Dataset 2: 110 benign and 53 malignant | Classification of benign and malignant breast tumors | ViT | Accuracy = 86.7%; AUC = 95% |
B-mode & strain elastography | Misra et al., 2022 | 85 patients (42 benign and 43 malignant; 80% for training and 20% for validation) | Classification of benign and malignant breast tumors | AlexNet and ResNet | Accuracy = 90% |
ABUS | Xiang et al., 2021 | 396 patients with 444 tumors (226 malignant and 218 benign) | Classification of benign and malignant breast tumors | U-Net and residual-capsule neural network | Accuracy = 84.9%; Sensitivity = 87.2%; Specificity = 82.6%; AUC = 0.9122 |
ABUS | Zhuang et al., 2021 | 214 ABUS sequences (86 malignant and 128 benign) | Classification of benign and malignant breast tumors | Shallowly dilated convolutional branch network and VGG | Accuracy = 0.9286; Recall = 0.8824; Precision = 0.9375; F1-score = 0.9091; AUC = 0.9721 |
ABUS | Malekmohammadi et al., 2023 | 60 volumes from 43 patients, including 42 malignant and 13 benign | Classification of benign and malignant breast tumors | Convolutional BiLSTM | Precision = 84%; Recall = 84%; Accuracy = 93%; F1-score = 84%; AUC = 97% |
ABUS | Zhou et al., 2021 | 170 volumes from 107 patients | Tumor classification and segmentation | CNN | Accuracy = 0.741; Precision = 0.826; F1-score = 0.811 |
B-mode | Gu et al., 2023 | 396 patients (212 LNM, 184 non-LNM, 134 pCR, 262 non-pCR); 88 patients (255 LNM, 33 non-LNM, 134 pCR, 262 non-pCR) | Predict axillary LNM after NAC | DLR nomogram | AUC = 0.863; Specificity = 0.818; Negative predictive value = 0.872 |
B-mode | Guo et al., 2020 | 3049 images from 937 patients | Predict the risk of SLN and non-SLN metastasis | DLR | Sensitivity: 0.984 (SLNs); 0.984 (non-SLNs) |
B-mode | Ozaki et al., 2022 | 628 images from 253 patients | Distinguish metastatic ALNs | CNN | AUC = 0.966; Sensitivity = 0.94; Specificity = 0.88 |
B-mode | Coronado-Gutierrez et al., 2019 | 118 images from 105 patients | Diagnose ALN involvement | Fisher vector CNN | Accuracy = 0.864; Sensitivity = 0.849; Specificity = 0.877 |
B-mode | Sun et al., 2022 | 338 images from 169 patients | Predict axillary LNM | CNN | AUC = 0.72; Accuracy = 0.726; Sensitivity = 0.655; Specificity = 0.789 |
B-mode | Zhou et al., 2020 | 974 images from 756 patients and 81 images from 78 patients | Predict clinically negative axillary LNM | CNN (Inception) | AUC = 0.89; Sensitivity = 0.85; Specificity = 0.73 |
B-mode & SWE | Zheng et al., 2020 | 584 patients with 584 malignant breast lesions | Predict ALN status preoperatively in patients with early-stage breast cancer | DLR based on ResNet | AUC = 0.902; Sensitivity = 0.816 |
B-mode | Lee et al., 2020 | 153 malignant tumor images (59 axillary LNM and 94 non-metastasis) | Predict axillary LNM status from breast cancer US | Mask R-CNN and CNN | Accuracy = 0.771; Sensitivity = 0.661; Specificity = 0.840; AUC = 0.759 |
B-mode | Sun et al., 2020 | 2395 images from 479 patients | Predict axillary LNM | DenseNet | AUC = 0.95 |
B-mode | Jiang et al., 2021 | 4828 images from 1275 patients | Assessment of breast cancer molecular subtypes | DCNN | Accuracy ranging from 87.94% to 98.83% per subcategory; positive predictive value of 93.29% in discriminating luminal from non-luminal disease |
B-mode | Boulenger et al., 2023 | 831 images from 145 patients | Identification of triple-negative breast cancer | VGG | AUC = 0.86; Accuracy = 85%; Sensitivity = 86%; Specificity = 86%; F1-score = 0.74 |
B-mode | Zhang et al., 2021 | 2,822,917 images | Predict molecular subtypes | DCNN | AUC: 0.864 (triple-negative subtype); 0.811 (HER2(+) subtype); 0.837 (HR(+) subtype) |
B-mode | Li et al., 2022 | Dataset 1: 1012 breast cancer patients with 2284 images; Dataset 2: 117 breast cancer patients with 153 images | Classify molecular subtypes of breast cancer | DCNN | Luminal A vs. non-luminal A: AUC = 0.818 (Dataset 1), 0.686 (Dataset 2); triple-negative detection: AUC = 0.712 (Dataset 1) |
Table 4
Deep learning for breast cancer response evaluation and outcome prediction in ultrasound
Ultrasound Modalities | References | Datasets | Tasks | DL Models | Results |
---|---|---|---|---|---|
B-mode | Gu et al., 2021 | 168 women with 168 lesions | Classification of non-responding and responding patients | DLR | DLR-2: AUC = 0.812; DLR-4: AUC = 0.937, Specificity = 0.905 |
B-mode | Wu et al., 2022 | 801 patients | Tumor segmentation and classification of pathological complete response | U-Net | AUC = 0.927 |
B-mode | Taleghamar et al., 2022 | 181 patients | Classification of non-responding and responding patients | ResNet | Accuracy = 0.88; AUC = 0.86 |
B-mode & SWE | Zhang et al., 2020 | 584 women with 584 malignant breast lesions | Classification of ALN status | ResNet | Disease-free axilla vs. axillary metastasis: AUC = 0.902; low vs. heavy metastatic burden of axillary disease: AUC = 0.905 |
B-mode | Zhou et al., 2020 | Dataset 1: 756 patients with 974 images; Dataset 2: 78 patients with 81 images | Classification of ALN status | Inception-ResNet | AUC = 0.89; Sensitivity = 0.85; Specificity = 0.73 |
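Tables 3 and 4 report classification performance as accuracy, sensitivity (recall), specificity, precision, and F1-score, all of which derive from the same confusion-matrix counts. A minimal sketch of those definitions, using made-up counts purely for illustration:

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict[str, float]:
    """Standard diagnostic metrics from confusion-matrix counts.

    tp/fn: malignant cases correctly/incorrectly classified;
    tn/fp: benign cases correctly/incorrectly classified.
    """
    sensitivity = tp / (tp + fn)          # recall: malignant cases detected
    specificity = tn / (tn + fp)          # benign cases correctly cleared
    precision = tp / (tp + fp)            # positive predictive value
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "accuracy": accuracy, "f1": f1}

# Hypothetical test set: 100 malignant, 100 benign lesions
m = diagnostic_metrics(tp=80, fp=10, tn=90, fn=20)
# sensitivity = 0.80, specificity = 0.90, accuracy = 0.85
```

AUC, by contrast, is threshold-independent: it summarizes the trade-off between sensitivity and specificity across all operating points, which is why many of the studies above report it alongside the thresholded metrics.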
[1] | Breast cancer. World Health Organization. Available at: https://www.who.int/news-room/fact-sheets/detail/breast-cancer. (Accessed: 26 March 2021). |
[2] | Cancer fact sheets. World Health Organization. Available at: https://gco.iarc.fr/today/fact-sheets-cancers. (Accessed: 2020). |
[3] |
Ciritsis A, Rossi C, Eberhard M, Marcon M, Becker AS, Boss A. Automatic classification of ultrasound breast lesions using a deep convolutional neural network mimicking human decision-making. Eur Radiol 2019; 29:5458-5468.
doi: 10.1007/s00330-019-06118-7 pmid: 30927100 |
[4] | Triche BL, Nelson JT Jr, McGill NS, Porter KK, Sanyal R, Tessler FN, et al. Recognizing and minimizing artifacts at CT, MRI, US, and molecular imaging. Radiographics 2019; 39:1017-1018. |
[5] | Eng KA, Abadeh A, Ligocki C, Lee YK, Moineddin R, Adams-Webber T, et al. Acute appendicitis: a meta-analysis of the diagnostic accuracy of US, CT, and MRI as second-line imaging tests after an initial US. Radiology 2018; 288:717-727. |
[6] | Moriwaki Y, Otani J, Okuda J, Niwano T. Gallbladder torsion: US, CT, and MRI findings. J Gastrointest Surg 2019; 23:1077-1079. |
[7] |
Sarker IH. Deep learning: a comprehensive overview on techniques, taxonomy, applications and research directions. SN Comput Sci 2021; 2:420.
doi: 10.1007/s42979-021-00815-1 |
[8] |
Fujioka T, Kubota K, Mori M, Kikuchi Y, Katsuta L, Kasahara M, et al. Distinction between benign and malignant breast masses at breast ultrasound using deep learning method with convolutional neural network. Jpn J Radiol 2019; 37:466-472.
doi: 10.1007/s11604-019-00831-5 pmid: 30888570 |
[9] |
Tanaka H, Chiu SW, Watanabe T, Kaoku S, Yamaguchi T. Computer-aided diagnosis system for breast ultrasound images using deep learning. Phys Med Biol 2019; 64:235013.
doi: 10.1088/1361-6560/ab5093 |
[10] |
Cao Z, Duan L, Yang G, Yue T, Chen Q. An experimental study on breast lesion detection and classification from ultrasound images using deep learning architectures. BMC Med Imaging 2019; 19:51.
doi: 10.1186/s12880-019-0349-x pmid: 31262255 |
[11] |
Byra M, Galperin M, Ojeda-Fournier H, Olson L, O'Boyle M, Comstock C, et al. Breast mass classification in sonography with transfer learning using a deep convolutional neural network and color conversion. Med Phys 2019; 46:746-755.
doi: 10.1002/mp.13361 pmid: 30589947 |
[12] |
Park HJ, Kim SM, La Yun B, Jang M, Kim B, Jang JY, et al. A computer-aided diagnosis system using artificial intelligence for the diagnosis and characterization of breast masses on ultrasound: added value for the inexperienced breast radiologist. Medicine (Baltimore) 2019; 98:e14146.
doi: 10.1097/MD.0000000000014146 |
[13] |
Han S, Kang HK, Jeong JY, Park MH, Kim W, Bang WC, et al. A deep learning framework for supporting the classification of breast lesions in ultrasound images. Phys Med Biol 2017; 62:7714-7728.
doi: 10.1088/1361-6560/aa82ec pmid: 28753132 |
[14] | Aloysius N, Geetha M, Ieee. A review on deep convolutional neural networks. IEEE International Conference on Communication and Signal Processing (ICCSP) 2017:588-592. |
[15] |
Zhou DX. Theory of deep convolutional neural networks: downsampling. Neural Networks 2020; 124:319-327.
doi: 10.1016/j.neunet.2020.01.018 |
[16] |
Anwar SM, Majid M, Qayyum A, Awais M, Alnowami M, Khan MK. Medical image analysis using convolutional neural networks: a review. J Med Syst 2018; 42:226.
doi: 10.1007/s10916-018-1088-1 pmid: 30298337 |
[17] | Yadav SS, Jadhav SM. Deep convolutional neural network based medical image classification for disease diagnosis. Journal of Big Data 2019;6. |
[18] | He KM, Zhang XY, Ren SQ, Sun J. Deep residual learning for image recognition. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2016:770-778. |
[19] | Huang G, Liu Z, Van Der Maaten L, Weinberger KQ. Densely connected convolutional networks. 30th IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2017:2261-2269. |
[20] | Howard AG, Zhu M, Chen B, Kalenichenko D, Wang W, et al. MobileNets: efficient convolutional neural networks for mobile vision applications. ArXiv.org 2017. |
[21] | Tan M, Le Q V. EfficientNet: rethinking model scaling for convolutional neural networks. 36th International Conference on Machine Learning (ICML) 2019. |
[22] | Girshick R, Donahue J, Darrell T, Malik J. Rich feature hierarchies for accurate object detection and semantic segmentation. 27th IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2014:580-587. |
[23] | Redmon J, Divvala S, Girshick R, Farhadi A. You Only Look Once: unified, real-time object detection. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2016:779-788. |
[24] | Liu W, Anguelov D, Erhan D, Szegedy C, Reed S, Fu CY, et al. SSD: single shot multibox detector. 14th European Conference on Computer Vision (ECCV) 2016: 21-37. |
[25] | Tan M, Pang R, Le Q V. EfficientDet: scalable and efficient object detection. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2020: 10778-10787. |
[26] | Zhou X, Wang D, Krähenbühl P. Objects as points. ArXiv.org 2019. |
[27] | Ronneberger O, Fischer P, Brox T. U-Net: convolutional networks for biomedical image segmentation. 18th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2015: 234-241. |
[28] | Oktay O, Schlemper J, Le Folgoc L, Lee M, Heinrich M, et al. Attention U-Net: learning where to look for the pancreas. ArXiv.org 2018. |
[29] | Zhou Z, Siddiquee MMR, Tajbakhsh N, Liang J. UNet++: a nested U-Net architecture for medical image segmentation. Deep Learn Med Image Anal Multimodal Learn Clin Decis Support 2018; 11045:3-11. |
[30] | Chen LC, Zhu Y, Papandreou G, Schroff F, Adam H. Encoder-decoder with atrous separable convolution for semantic image segmentation. ArXiv.org 2018. |
[31] |
Isensee F, Jaeger PF, Kohl SAA, Petersen J, Maier-Hein KH. nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation. Nat Methods 2021; 18:203-211.
doi: 10.1038/s41592-020-01008-z pmid: 33288961 |
[32] |
Hochreiter S, Schmidhuber J. Long short-term memory. Neural computation 1997; 9:1735-1780.
doi: 10.1162/neco.1997.9.8.1735 pmid: 9377276 |
[33] | Chung JY, Gulcehre C, Cho K, Bengio Y. Empirical evaluation of gated recurrent neural networks on sequence modeling. ArXiv.org 2014. |
[34] | Chung J, Kastner K, Dinh L, Goel K, Courville A, Bengio Y. A recurrent latent variable model for sequential data. ArXiv.org 2016. |
[35] | Sutskever I, Vinyals O, Le Q V. Sequence to sequence learning with neural networks. 28th Conference on Neural Information Processing Systems (NIPS) 2014. |
[36] |
Huang R, Ying Q, Lin Z, Zheng Z, Tan L, Tang G, et al. Extracting keyframes of breast ultrasound video using deep reinforcement learning. Med Image Anal 2022; 80:102490.
doi: 10.1016/j.media.2022.102490 |
[37] | Schmiedt K, Simion G, Căleanu C D. Preliminary results on contrast enhanced ultrasound video stream diagnosis using deep neural architectures. International Symposium on Electronics and Telecommunications (ISETC) 2022:1-4. |
[38] | Goodfellow I, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, et al. Generative adversarial networks. Advances in Neural Information Processing Systems 2014;3. |
[39] | Mirza M, Osindero S. Conditional generative adversarial nets. ArXiv.org 2014. |
[40] | Zhu JY, Park T, Isola P, A Efros Alexei. Unpaired image-to-image translation using cycle-consistent adversarial networks. IEEE International Conference on Computer Vision (ICCV) 2017:2242-2251. |
[41] | Arjovsky M, Chintala S, Bottou L. Wasserstein generative adversarial networks. Proceedings of the 34th International Conference on Machine Learning (ICML) 2017; 70:214-223. |
[42] |
Karras T, Laine S, Aila T. A style-based generator architecture for generative adversarial networks. IEEE Trans Pattern Anal Mach Intell 2021; 43:4217-4228.
doi: 10.1109/TPAMI.2020.2970919 |
[43] |
Zhou T, Li Q, Lu H, Cheng Q, Zhang XX. GAN review: models and medical image fusion applications. Information Fusion 2023; 91:134-148
doi: 10.1016/j.inffus.2022.10.017 |
[44] |
Chen Y, Yang X-H, Wei Z, Heidari Ali Asghar, Zheng N, Li ZC, et al. Generative adversarial networks in medical image augmentation: a review. Computers in Biology and Medicine 2022; 144:105382.
doi: 10.1016/j.compbiomed.2022.105382 |
[45] |
Pang T, Wong J HD, Ng WL, Chan CS. Semi-supervised GAN-based radiomics model for data augmentation in breast ultrasound mass classification. Computer Methods and Programs in Biomedicine 2021; 203:106018.
doi: 10.1016/j.cmpb.2021.106018 |
[46] | Bentaieb A, Hamarneh G. Adversarial stain transfer for histopathology image analysis. TMI 2018; 37:792-802. |
[47] |
Yao Z, Luo T, Dong Y, Jia X, Deng Y, Wu G, et al. Virtual elastography ultrasound via generative adversarial network for breast cancer diagnosis. Nat Commun 2023; 14:788.
doi: 10.1038/s41467-023-36102-1 |
[48] |
Sagheer SVM, George SN. A review on medical image denoising algorithms. Biomedical Signal Processing and Control 2020; 61: 102036.
doi: 10.1016/j.bspc.2020.102036 |
[49] |
Zhou Z, Wang Y, Guo Y, Qi Y, Yu J. Image quality improvement of hand-held ultrasound devices with a two-stage generative adversarial network. IEEE Transactions on Biomedical Engineering 2020; 67: 298-311.
doi: 10.1109/TBME.10 |
[50] | Dosovitskiy A, Beyer L, Kolesnikov A, Weissenborn D, Zhai X, Unterthiner T, et al. An image is worth 16x16 words: transformers for image recognition at scale. ArXiv.org 2021. |
[51] | Touvron H, Cord M, Douze M, Massa F, Sablayrolles A, Jégou H. Training data-efficient image transformers & distillation through attention. ArXiv.org 2021. |
[52] | Chen C-F R, Fan Q, Panda R. CrossViT: cross-attention multi-scale vision transformer for image classification. IEEE/CVF International Conference on Computer Vision (ICCV) 2021: 347-356. |
[53] | Han K, Xiao A, Wu E, Guo J, Xu C, Wang Y. Transformer in transformer. ArXiv.org 2021. |
[54] | Liu Z, Lin Y, Cao Y, Hu H, Wei Y, Zhang Z, et al. Swin transformer: hierarchical vision transformer using shifted windows. IEEE/CVF International Conference on Computer Vision (ICCV) 2021: 9992-10002. |
[55] | Dong X, Bao J, Zhang W, Yu N, Yuan L, Chen D, et al. CSWin transformer: a general vision transformer backbone with cross-shaped windows. The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings 2022. |
[56] | Chen J, Lu Y, Yu Q, Luo X, Adeli E, Wang Y, et al. TransUNet: transformers make strong encoders for medical image segmentation. ArXiv.org 2021. |
[57] | Cao H, Wang Y, Chen J, Jiang D, Zhang X, Tian Q, et al. Swin-Unet: unet-like pure transformer for medical image segmentation. Computer Vision - ECCV 2022 Workshops. Cham: Springer Nature Switzerland 2023:205-218. |
[58] | Zhou HY, Guo J, Zhang Y, Yu L, Wang L, Yu Y. nnFormer: interleaved transformer for volumetric segmentation. ArXiv.org 2022. |
[59] | Huang X, Deng Z, Li D, Yuan X. MISSFormer: an effective medical image segmentation transformer. ArXiv.org 2021. |
[60] | Wang J, Gou C, Wu Q, Feng H, Han J, Ding E, et al. RTFormer: efficient design for real-time semantic segmentation with transformer. ArXiv.org 2022. |
[61] | Li J, Zheng Q, Li M, Liu P, Wang Q, Sun LT. Rethinking breast lesion segmentation in ultrasound: a new video dataset and a baseline network. 25th International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) 2022: 391-400. |
[62] | Lin Z, Huang R, Ni D, Wu JY, Luo BM. Masked video modeling with correlation-aware contrastive learning for breast cancer diagnosis in ultrasound. 1st International Workshop on Resource-Efficient Medical Image Analysis (REMIA) / 25th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2022:105-114. |
[63] | Latif G, Al Anezi F Y, Butt M O, Alghazo J. Ultrasound image despeckling and detection of breast cancer using deep CNN. RIVF International Conference on Computing and Communication Technologies (RIVF) 2020:210-214. |
[64] | Zhou Y, Chen H, Li Y, Cao XY, Wang S, Shen DG. Cross-model attention-guided tumor segmentation for 3D automated breast ultrasound (ABUS) images. IEEE Journal of Biomedical and Health Informatics 2022; 26:301-311. |
[65] | Ilesanmi AE, Idowu OP, Chaumrattanakul U, Makhanov SS. Multiscale hybrid algorithm for pre-processing of ultrasound images. Biomedical Signal Processing and Control 2021; 66:102396. |
[66] |
Duarte-Salazar CA, Castro-Ospina AE, Becerra MA, Delgado-Trejos E. Speckle noise reduction in ultrasound images for improving the metrological evaluation of biomedical applications: an overview. IEEE Access 2020; 8:15983-15999.
doi: 10.1109/Access.6287639 |
[67] |
Kokil P, Sudharson S. Despeckling of clinical ultrasound images using deep residual learning. Computer Methods and Programs in Biomedicine 2020; 194:105477.
doi: 10.1016/j.cmpb.2020.105477 |
[68] |
Feng X, Huang Q, Li X. Ultrasound image de-speckling by a hybrid deep network with transferred filtering and structural prior. Neurocomputing 2020; 414:346-355.
doi: 10.1016/j.neucom.2020.09.002 |
[69] |
Khor HG, Ning G, Zhang X, Liao H. Ultrasound speckle reduction using wavelet-based generative adversarial network. IEEE Journal of Biomedical and Health Informatics 2022; 26:3080-3091.
doi: 10.1109/JBHI.2022.3144628 |
[70] |
Zhang K, Zuo W, Chen Y, Meng D, Zhang L. Beyond a gaussian denoiser: residual learning of deep CNN for image denoising. IEEE Transactions on Image Processing 2017; 26:3142-3155.
doi: 10.1109/TIP.2017.2662206 pmid: 28166495 |
[71] |
Mittal A, Soundararajan R, Bovik AC. Making a “Completely Blind” image quality analyzer. IEEE Signal Processing Letters 2013; 20:209-212.
doi: 10.1109/LSP.2012.2227726 |
[72] | Li H, Wang M. Very deep convolutional network for large-scale image recognition. Jisuanji Xitong Yingyong = Computer Systems and Applications 2021:330. |
[73] | Dey R, Bhattacharjee D, Nasipuri M. Image denoising using generative adversarial network. Intelligent Computing: Image Processing Based Applications 2020:73-90. |
[74] |
Al-Dhabyani W, Gomaa M, Khaled H, Fahmy A. Dataset of breast ultrasound images. Data in Brief 2020; 28:104863.
doi: 10.1016/j.dib.2019.104863 |
[75] |
Zhang Q, Han H, Ji C, Yu J, Wang Y, Wang W. Gabor-based anisotropic diffusion for speckle noise reduction in medical ultrasonography. Journal of the Optical Society of America A, Optics, Image Science, and Vision 2014; 31:1273-1283.
doi: 10.1364/JOSAA.31.001273 |
[76] |
Zhang Z, Li Y, Wu W, Chen H, Cheng L, Wang S. Tumor detection using deep learning method in automated breast ultrasound. Biomedical Signal Processing and Control 2021; 68:102677.
doi: 10.1016/j.bspc.2021.102677 |
[77] |
Jiang H, Diao Z, Shi T, Zhou Y, Wang F, Hu W, et al. A review of deep learning-based multiple-lesion recognition from medical images: classification, detection and segmentation. Computers in Biology and Medicine 2023; 157:106726.
doi: 10.1016/j.compbiomed.2023.106726 |
[78] | Quan MY, Huang YX, Wang CY, Zhang Q, Cai C, Zhou SC. Deep learning radiomics model based on breast ultrasound video to predict HER2 expression status. Frontiers in Endocrinology 2023. |
[79] |
Malekmohammadi A, Barekatrezaei S, Kozegar E, Soryani M. Mass detection in automated 3-D breast ultrasound using a patch Bi-ConvLSTM network. Ultrasonics 2023; 129:106891.
doi: 10.1016/j.ultras.2022.106891 |
[80] |
Meng H, Liu X, Niu J, Wang Y, Liao J, Li Q, et al. DGANet: a dual global attention neural network for breast lesion detection in ultrasound images. Ultrasound in medicine & biology 2023; 49:31-44.
doi: 10.1016/j.ultrasmedbio.2022.07.006 |
[81] | Liu G, Tan J, Yang H, Li Y, Sun X, Wu J, et al. Breast ultrasound tumor detection based on active learning and deep learning. Artificial Intelligence and Robotics 2022:1-10. |
[82] |
Wang Y, Yao Y. Breast lesion detection using an anchor-free network from ultrasound images with segmentation-based enhancement. Scientific Reports 2022; 12:14720.
doi: 10.1038/s41598-022-18747-y pmid: 36042216 |
[83] |
Zhang J, Tao X, Jiang Y, Wu X, Yan D, Xue W, et al. Application of convolution neural network algorithm based on multicenter ABUS images in breast lesion detection. Frontiers in Oncology 2022; 12: 938413.
doi: 10.3389/fonc.2022.938413 |
[84] |
Wang F, Liu X, Yuan N, Qian B, Ruan L, Yin C, et al. Study on automatic detection and classification of breast nodule using deep convolutional neural network system. Journal of Thoracic Disease 2020; 12:4690-4701.
doi: 10.21037/jtd-19-3013 pmid: 33145042 |
[85] |
Yan Y, Liu Y, Wu Y, Zhang H, Zhang Y, Meng L. Accurate segmentation of breast tumors using AE U-net with HDC model in ultrasound images. Biomedical Signal Processing and Control 2022; 72:103299.
doi: 10.1016/j.bspc.2021.103299 |
[86] |
Iqbal A, Sharif M. BTS-ST: Swin transformer network for segmentation and classification of multimodality breast cancer images. Knowledge-based systems 2023; 267:110393.
doi: 10.1016/j.knosys.2023.110393 |
[87] |
Yang H, Yang D. CSwin-PNet: A CNN-Swin transformer combined pyramid network for breast lesion segmentation in ultrasound images. Expert systems with applications 2023; 213:119024.
doi: 10.1016/j.eswa.2022.119024 |
[88] | Zhai D, Hu B, Gong X, Zou H, Luo J. ASS-GAN: asymmetric semi-supervised GAN for breast ultrasound image segmentation. Neurocomputing (Amsterdam) 2022; 493:204-216. |
[89] |
Cao X, Chen H, Li Y, Peng Y, Wang S, Cheng L. Uncertainty aware temporal-ensembling model for semi-supervised ABUS mass segmentation. IEEE Transactions on Medical Imaging 2021; 40:431-443.
doi: 10.1109/TMI.2020.3029161 pmid: 33021936 |
[90] |
Vakanski A, Xian M, Freer P E. Attention-Enriched deep learning model for breast tumor segmentation in ultrasound images. Ultrasound Med Biol 2020; 46:2819-2833.
doi: S0301-5629(20)30287-8 pmid: 32709519 |
[91] | Punn N S, Agarwal S. RCA-IUnet: a residual cross-spatial attention-guided inception U-Net model for tumor segmentation in breast ultrasound imaging. Machine Vision and Applications 2022;33. |
[92] |
Chen G, Dai Y, Zhang J. RRCNet: refinement residual convolutional network for breast ultrasound images segmentation. Engineering applications of artificial intelligence 2023; 117:105601.
doi: 10.1016/j.engappai.2022.105601 |
[93] | Chen G, Liu Y, Dai Y, Zhang J, Cui L, Yin X. BAGNet: bidirectional aware guidance network for malignant breast lesions segmentation. 7th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS) 2022:112-116. |
[94] |
Mckinney S M, Sieniek M, Godbole V, Godwin J, Antropova N, Ashrafian H. International evaluation of an AI system for breast cancer screening. Nature 2020; 577:89-94.
doi: 10.1038/s41586-019-1799-6 |
[95] | Wang X, Moriakov N, Gao Y, Zhang T, Han L, Mann R.M. Artificial intelligence in breast imaging. Breast Imaging 2022:435-453. |
[96] |
Ahmed M, Douek M. Is axillary ultrasound imaging necessary for all patients with breast cancer? British Journal of Surgery 2018; 105: 930-932.
doi: 10.1002/bjs.10784 |
[97] Sun Q, Lin X, Zhao Y, Li L, Yan K, Liang D, et al. Deep learning vs. radiomics for predicting axillary lymph node metastasis of breast cancer using ultrasound images: don't forget the peritumoral region. Front Oncol 2020; 10:53.
[98] Sun YS, Zhao Z, Yang ZN, Xu F, Lu HJ, Zhu ZY, et al. Risk factors and preventions of breast cancer. Int J Biol Sci 2017; 13:1387-1397. doi: 10.7150/ijbs.21635
[99] Lei YM, Yin M, Yu MH, Yu J, Zeng SE, Lv WZ, et al. Artificial intelligence in medical imaging of the breast. Front Oncol 2021; 11:600557. doi: 10.3389/fonc.2021.600557
[100] Louro J, Posso M, Hilton Boon M, Román M, Domingo L, Castells X, et al. A systematic review and quality assessment of individualised breast cancer risk prediction models. Br J Cancer 2019; 121:76-85. doi: 10.1038/s41416-019-0476-8
[101] Le EPV, Wang Y, Huang Y, Hickman S, Gilbert FJ. Artificial intelligence in breast imaging. Clin Radiol 2019; 74:357-366. pmid: 30898381
[102] Brentnall AR, Harkness EF, Astley SM, Donnelly LS, Stavrinos P, Sampson S, et al. Mammographic density adds accuracy to both the Tyrer-Cuzick and Gail breast cancer risk models in a prospective UK screening cohort. Breast Cancer Res 2015; 17:147. doi: 10.1186/s13058-015-0653-5 pmid: 26627479
[103] Li H, Giger ML, Huynh BQ, Antropova NO. Deep learning in breast cancer risk assessment: evaluation of convolutional neural networks on a clinical dataset of full-field digital mammograms. J Med Imaging (Bellingham) 2017; 4:041304.
[104] Dembrower K, Liu Y, Azizpour H, Eklund M, Smith K, Lindholm P, et al. Comparison of a deep learning risk score and standard mammographic density score for breast cancer risk prediction. Radiology 2020; 294:265-272. doi: 10.1148/radiol.2019190872 pmid: 31845842
[105] Yala A, Lehman C, Schuster T, Portnoi T, Barzilay R. A deep learning mammography-based model for improved breast cancer risk prediction. Radiology 2019; 292:60-66. doi: 10.1148/radiol.2019182716 pmid: 31063083
[106] Qian X, Pei J, Zheng H, Xie X, Yan L, Zhang H, et al. Prospective assessment of breast cancer risk from multimodal multiview ultrasound images via clinically applicable deep learning. Nat Biomed Eng 2021; 5:522-532. doi: 10.1038/s41551-021-00711-2 pmid: 33875840
[107] Liu Y, Azizpour H, Strand F, Smith K. Decoupling inherent risk and early cancer signs in image-based breast cancer risk models. Medical Image Computing and Computer Assisted Intervention - MICCAI 2020:230-240.
[108] Arefan D, Mohamed AA, Berg WA, Zuley ML, Sumkin JH, Wu S. Deep learning modeling using normal mammograms for predicting breast cancer risk. Med Phys 2020; 47:110-118. doi: 10.1002/mp.13886 pmid: 31667873
[109] Kim J, Kim HJ, Kim C, Kim WH. Artificial intelligence in breast ultrasonography. Ultrasonography 2021; 40:183-190. doi: 10.14366/usg.20117 pmid: 33430577
[110] Shen Y, Shamout FE, Oliver JR, Witowski J, Kannan K, Park J, et al. Artificial intelligence system reduces false-positive findings in the interpretation of breast ultrasound exams. Nat Commun 2021; 12:5645. doi: 10.1038/s41467-021-26023-2 pmid: 34561440
[111] Qi X, Zhang L, Chen Y, Pi Y, Chen Y, Lv Q, et al. Automated diagnosis of breast ultrasonography images using deep neural networks. Med Image Anal 2019; 52:185-198. pmid: 30594771
[112] Cao Z, Yang G, Chen Q, Chen X, Lv F. Breast tumor classification through learning from noisy labeled ultrasound images. Med Phys 2020; 47:1048-1057. doi: 10.1002/mp.13966 pmid: 31837239
[113] Cao Z, Yang G, Li X, Chen Q, Wu J. Multitask classification method based on label correction for breast tumor ultrasound images. Neural Processing Letters 2021; 53:1453-1468. doi: 10.1007/s11063-021-10455-4
[114] Huang Y, Han L, Dou H, Luo H, Yuan Z, Liu Q, et al. Two-stage CNNs for computerized BI-RADS categorization in breast ultrasound images. Biomed Eng Online 2019; 18:8. doi: 10.1186/s12938-019-0626-5 pmid: 30678680
[115] Li C, Li J, Tan T, Chen K, Xu Y, Wu R. Application of ultrasonic dual-mode artificially intelligent architecture in assisting radiologists with different diagnostic levels on breast masses classification. Diagn Interv Radiol 2021; 27:315-322. doi: 10.5152/dir.2021.20018 pmid: 34003119
[116] Yang Z, Gong X, Guo Y, Liu W. A temporal sequence dual-branch network for classifying hybrid ultrasound data of breast cancer. IEEE Access 2020; 8:82688-82699.
[117] Chen C, Wang Y, Niu J, Liu X, Li Q, Gong X. Domain knowledge powered deep learning for breast cancer diagnosis based on contrast-enhanced ultrasound videos. IEEE Trans Med Imaging 2021; 40:2439-2451. doi: 10.1109/TMI.2021.3078370
[118] Gong X, Zhao X, Fan L, Li T, Guo Y, Luo J. BUS-net: a bimodal ultrasound network for breast cancer diagnosis. International Journal of Machine Learning and Cybernetics 2022; 13:3311-3328. doi: 10.1007/s13042-022-01596-6
[119] Wang P, Patel VM, Hacihaliloglu I. Simultaneous segmentation and classification of bone surfaces from ultrasound using a multi-feature guided CNN. Medical Image Computing and Computer Assisted Intervention - MICCAI 2018:134-142.
[120] Xie X, Shi F, Niu J, Tang X. Breast ultrasound image classification and segmentation using convolutional neural networks. Advances in Multimedia Information Processing - PCM 2018:200-211.
[121] Singh VK, Rashwan HA, Abdel-Nasser M, Md Mostafa KS, Akram F, Pandey N, et al. An efficient solution for breast tumor segmentation and classification in ultrasound images using deep adversarial learning. arXiv 2019.
[122] Luo Y, Huang Q, Li X. Segmentation information with attention integration for classification of breast tumor in ultrasound image. Pattern Recognition 2022; 124:108427. doi: 10.1016/j.patcog.2021.108427
[123] Xiao T, Liu L, Li K, Qin W, Yu S, Li Z. Comparison of transferred deep neural networks in ultrasonic breast masses discrimination. Biomed Res Int 2018; 2018:4605191.
[124] Gheflati B, Rivaz H. Vision transformers for classification of breast ultrasound images. Annu Int Conf IEEE Eng Med Biol Soc 2022:480-483.
[125] Misra S, Jeon S, Managuli R, Lee S, Kim G, Yoon C, et al. Bi-Modal transfer learning for classifying breast cancers via combined B-Mode and ultrasound strain imaging. IEEE Trans Ultrason Ferroelectr Freq Control 2022; 69:222-232. doi: 10.1109/TUFFC.2021.3119251
[126] Xiang H, Huang YS, Lee CH, Chang Chien TY, Lee CK, Liu L, et al. 3-D Res-CapsNet convolutional neural network on automated breast ultrasound tumor diagnosis. Eur J Radiol 2021; 138:109608. doi: 10.1016/j.ejrad.2021.109608
[127] Zhuang Z, Ding W, Zhuang S, Joseph Raj AN, Wang J, Zhou W, et al. Tumor classification in automated breast ultrasound (ABUS) based on a modified extracting feature network. Comput Med Imaging Graph 2021; 90:101925. doi: 10.1016/j.compmedimag.2021.101925
[128] Zhou Y, Chen H, Li Y, Liu Q, Xu X, Wang S, et al. Multi-task learning for segmentation and classification of tumors in 3D automated breast ultrasound images. Med Image Anal 2021; 70:101918. doi: 10.1016/j.media.2020.101918
[129] Gu J, Tong T, Xu D, Cheng F, Fang C, He C, et al. Deep learning radiomics of ultrasonography for comprehensively predicting tumor and axillary lymph node status after neoadjuvant chemotherapy in breast cancer patients: a multicenter study. Cancer 2023; 129:356-366.
[130] Guo X, Liu Z, Sun C, Zhang L, Wang Y, Li Z, et al. Deep learning radiomics of ultrasonography: identifying the risk of axillary non-sentinel lymph node involvement in primary breast cancer. EBioMedicine 2020; 60:103018. doi: 10.1016/j.ebiom.2020.103018
[131] Ozaki J, Fujioka T, Yamaga E, Hayashi A, Kujiraoka Y, Imokawa T, et al. Deep learning method with a convolutional neural network for image classification of normal and metastatic axillary lymph nodes on breast ultrasonography. Jpn J Radiol 2022; 40:814-822. doi: 10.1007/s11604-022-01261-6
[132] Coronado-Gutiérrez D, Santamaría G, Ganau S, Bargalló X, Orlando S, Oliva-Brañas ME, et al. Quantitative ultrasound image analysis of axillary lymph nodes to diagnose metastatic involvement in breast cancer. Ultrasound Med Biol 2019; 45:2932-2941. pmid: 31444031
[133] Sun S, Mutasa S, Liu MZ, Nemer J, Sun M, Siddique M, et al. Deep learning prediction of axillary lymph node status using ultrasound images. Comput Biol Med 2022; 143:105250. doi: 10.1016/j.compbiomed.2022.105250
[134] Zhou LQ, Wu XL, Huang SY, Wu GG, Ye HR, Wei Q, et al. Lymph node metastasis prediction from primary breast cancer US images using deep learning. Radiology 2020; 294:19-28.
[135] Zheng X, Yao Z, Huang Y, Yu Y, Wang Y, Liu Y, et al. Deep learning radiomics can predict axillary lymph node status in early-stage breast cancer. Nat Commun 2020; 11:1236. doi: 10.1038/s41467-020-15027-z pmid: 32144248
[136] Lee Y-W, Shih C-C, Chang R-F. Axillary lymph node metastasis status prediction in ultrasound image using convolution neural network. 15th International Workshop on Breast Imaging (IWBI) 2020.
[137] Jiang M, Zhang D, Tang SC, Luo XM, Chuan ZR, Lv WZ, et al. Deep learning with convolutional neural network in the assessment of breast cancer molecular subtypes based on US images: a multicenter retrospective study. Eur Radiol 2021; 31:3673-3682.
[138] Boulenger A, Luo Y, Zhang C, Zhao C, Gao Y, Xiao M, et al. Deep learning-based system for automatic prediction of triple-negative breast cancer from ultrasound images. Med Biol Eng Comput 2023; 61:567-578. doi: 10.1007/s11517-022-02728-4
[139] Zhang X, Li H, Wang C, Cheng W, Zhu Y, Li D, et al. Evaluating the accuracy of breast cancer and molecular subtype diagnosis by ultrasound image deep learning model. Front Oncol 2021; 11:623506. doi: 10.3389/fonc.2021.623506
[140] Li C, Huang H, Chen Y, Shao S, Chen J, Wu R, et al. Preoperative non-invasive prediction of breast cancer molecular subtypes with a deep convolutional neural network on ultrasound images. Front Oncol 2022; 12:848790. doi: 10.3389/fonc.2022.848790
[141] Tran D, Bourdev L, Fergus R, Torresani L, Paluri M. Learning spatiotemporal features with 3D convolutional networks. IEEE International Conference on Computer Vision 2015:4489-4497.
[142] He K, Gkioxari G, Dollar P, Girshick R. Mask R-CNN. IEEE Trans Pattern Anal Mach Intell 2020; 42:386-397.
[143] Yosinski J, Clune J, Bengio Y, Lipson H. How transferable are features in deep neural networks? arXiv 2014.
[144] Szegedy C, Liu W, Jia Y, Sermanet P, Reed S, Anguelov D, et al. Going deeper with convolutions. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2015:1-9.
[145] Szegedy C, Vanhoucke V, Ioffe S, Shlens J, Wojna Z. Rethinking the inception architecture for computer vision. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2016:2818-2826.
[146] Chollet F. Xception: deep learning with depthwise separable convolutions. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2017:1800.
[147] Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. Communications of the ACM 2017; 60:84-90. doi: 10.1145/3065386
[148] Shin SY, Lee S, Yun ID, Kim SM, Lee KM. Joint weakly and semi-supervised deep learning for localization and classification of masses in breast ultrasound images. IEEE Trans Med Imaging 2019; 38:762-774. doi: 10.1109/TMI.2018.2872031
[149] Kim C, Kim WH, Kim HJ, Lee JH, Kim KW, Park YM, et al. Weakly-supervised US breast tumor characterization and localization with a box convolution network. Conference on Medical Imaging - Computer-Aided Diagnosis 2020.
[150] Szegedy C, Liu W, Jia Y, Sermanet P, Reed S, Anguelov D, et al. Going deeper with convolutions. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2015:1-9.
[151] Milletari F, Navab N, Ahmadi S-A. V-Net: fully convolutional neural networks for volumetric medical image segmentation. 4th IEEE International Conference on 3D Vision (3DV) 2016:565-571.
[152] Li J, Wang SR, Li QL, Zhu T, Zhu PS, Chen M, et al. Diagnostic value of multiple ultrasound diagnostic techniques for axillary lymph node metastases in breast cancer: a systematic analysis and network meta-analysis. Front Oncol 2023; 12:1043185. doi: 10.3389/fonc.2022.1043185
[153] Chang JM, Leung JWT, Moy L, Ha SM, Moon WK. Axillary nodal evaluation in breast cancer: state of the art. Radiology 2020; 295:500-515. doi: 10.1148/radiol.2020192534 pmid: 32315268
[154] Brunetti N, Calabrese M, Martinoli C, Tagliafico AS. Artificial intelligence in breast ultrasound: from diagnosis to prognosis-a rapid review. Diagnostics (Basel) 2022; 13:58.
[155] Gu J, Tong T, He C, Xu M, Yang X, Tian J, et al. Deep learning radiomics of ultrasonography can predict response to neoadjuvant chemotherapy in breast cancer at an early stage of treatment: a prospective study. Eur Radiol 2022; 32:2099-2109. doi: 10.1007/s00330-021-08293-y
[156] Wu L, Ye W, Liu Y, Chen D, Wang Y, Cui Y, et al. An integrated deep learning model for the prediction of pathological complete response to neoadjuvant chemotherapy with serial ultrasonography in breast cancer patients: a multicentre, retrospective study. Breast Cancer Res 2022; 24:81. doi: 10.1186/s13058-022-01580-6 pmid: 36414984
[157] Taleghamar H, Jalalifar SA, Czarnota GJ, Sadeghi-Naini A. Deep learning of quantitative ultrasound multi-parametric images at pre-treatment to predict breast cancer response to chemotherapy. Sci Rep 2022; 12:2244. doi: 10.1038/s41598-022-06100-2 pmid: 35145158
[158] Tjoa E, Guan C. A survey on explainable artificial intelligence (XAI): toward medical XAI. IEEE Transactions on Neural Networks and Learning Systems 2021; 32:4793-4813. doi: 10.1109/TNNLS.2020.3027314 pmid: 33079674
[159] Burrell J. How the machine 'thinks': understanding opacity in machine learning algorithms. Big Data & Society 2016; 3:2053951715622512.
[160] Ting FF, Tan YJ, Sim KS. Convolutional neural network improvement for breast cancer classification. Expert Systems with Applications 2019; 120:103-115. doi: 10.1016/j.eswa.2018.11.008
[161] Lbachir IA, Daoudi I, Tallal S. Automatic computer-aided diagnosis system for mass detection and classification in mammography. Multimedia Tools and Applications 2021; 80:9493-9525. doi: 10.1007/s11042-020-09991-3
[162] Cadario R, Longoni C, Morewedge CK. Understanding, explaining, and utilizing medical artificial intelligence. Nat Hum Behav 2021; 5:1636-1642. doi: 10.1038/s41562-021-01146-0 pmid: 34183800
[163] Guan C, Wang X, Zhang Q, Chen R, He D, Xie X. Towards a deep and unified understanding of deep neural models in NLP. 36th International Conference on Machine Learning (ICML) 2019.
[164] Bang S, Xie P, Lee H, Wu W, Xing E. Explaining a black-box using deep variational information bottleneck approach. arXiv 2019.
[165] Shorten C, Khoshgoftaar TM. A survey on image data augmentation for deep learning. Journal of Big Data 2019; 6:1-48. doi: 10.1186/s40537-018-0162-3
[166] Zhuang F, Qi Z, Duan K, Xi D, Zhu Y, Zhu H, et al. A comprehensive survey on transfer learning. Proceedings of the IEEE 2021; 109:43-76.
[167] Mahoro E, Akhloufi MA. Applying deep learning for breast cancer detection in radiology. Current Oncology 2022; 29:8767-8793. doi: 10.3390/curroncol29110690 pmid: 36421343
[168] Pourasad Y, Zarouri E, Salemizadeh Parizi M, Salih Mohammed A. Presentation of novel architecture for diagnosis and identifying breast cancer location based on ultrasound images using machine learning. Diagnostics (Basel) 2021; 11:1870.
[169] Ding W, Wang J, Zhou W, Zhou S, Chang C, Shi J. Joint localization and classification of breast cancer in B-mode ultrasound imaging via collaborative learning with elastography. IEEE J Biomed Health Inform 2022; 26:4474-4485. doi: 10.1109/JBHI.2022.3186933