AUTHOR=Liu Yang, Lian Lijin, Zhang Ersi, Xu Lulu, Xiao Chufan, Zhong Xiaoyun, Li Fang, Jiang Bin, Dong Yuhan, Ma Lan, Huang Qiming, Xu Ming, Zhang Yongbing, Yu Dongmei, Yan Chenggang, Qin Peiwu TITLE=Mixed-UNet: Refined class activation mapping for weakly-supervised semantic segmentation with multi-scale inference JOURNAL=Frontiers in Computer Science VOLUME=4 YEAR=2022 URL=https://www.frontiersin.org/journals/computer-science/articles/10.3389/fcomp.2022.1036934 DOI=10.3389/fcomp.2022.1036934 ISSN=2624-9898 ABSTRACT=

Deep learning techniques have shown great potential in medical image processing, particularly through accurate and reliable segmentation of magnetic resonance imaging (MRI) and computed tomography (CT) scans, which enables the localization and diagnosis of lesions. However, training these segmentation models requires a large number of manually annotated pixel-level labels, which are time-consuming and labor-intensive to produce, in contrast to image-level labels, which are much easier to obtain. Weakly-supervised semantic segmentation models that use image-level labels as supervision address this problem, since they can significantly reduce human annotation effort. Most advanced solutions exploit class activation mapping (CAM); however, original CAMs rarely capture the precise boundaries of lesions. In this study, we propose a multi-scale inference strategy that refines CAMs by reducing the detail loss incurred by single-scale reasoning. For segmentation, we develop a novel model named Mixed-UNet, which has two parallel branches in the decoding phase; the final prediction is obtained by fusing the features extracted from the two branches. We evaluate Mixed-UNet against several prevalent deep learning-based segmentation approaches on a dataset collected from a local hospital and on public datasets. The validation results demonstrate that our model surpasses available methods under the same supervision level in segmenting various lesions from brain imaging.
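As a rough illustration of the multi-scale inference idea described in the abstract, the sketch below computes CAMs on several rescaled copies of the input, resizes them back to the original resolution, and averages them. This is a minimal PyTorch sketch under assumed interfaces: `model.cam` is a hypothetical method returning raw class activation maps, and the scale set is illustrative, not the paper's configuration.

```python
import torch
import torch.nn.functional as F

def multi_scale_cam(model, image, scales=(0.5, 1.0, 1.5, 2.0)):
    # image: (N, 3, H, W); model.cam is a hypothetical method that
    # returns raw class activation maps of shape (N, C, h, w).
    _, _, H, W = image.shape
    fused = None
    for s in scales:
        x = F.interpolate(image, scale_factor=s, mode="bilinear",
                          align_corners=False)
        with torch.no_grad():
            cam = F.relu(model.cam(x))              # keep positive evidence only
        cam = F.interpolate(cam, size=(H, W), mode="bilinear",
                            align_corners=False)    # back to input resolution
        fused = cam if fused is None else fused + cam
    fused = fused / len(scales)                     # average across scales
    # Min-max normalize each class map to [0, 1] so the refined CAM can
    # be thresholded into pseudo-masks for training the segmentation model.
    lo = fused.flatten(2).min(-1).values[..., None, None]
    hi = fused.flatten(2).max(-1).values[..., None, None]
    return (fused - lo) / (hi - lo + 1e-8)
```

Averaging over scales is one common way to combine the coarse localization of small inputs with the finer detail recovered at larger inputs; the paper's exact fusion rule may differ.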
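Similarly, a minimal sketch of the dual-branch decoding idea: two parallel decoder branches process the same feature map, and their outputs are fused by concatenation followed by a 1x1 convolution. The layer choices (a plain 3x3 branch and a dilated 3x3 branch) and the fusion operator are assumptions for illustration; the actual Mixed-UNet architecture may differ.

```python
import torch
import torch.nn as nn

class DualBranchDecoderBlock(nn.Module):
    # Hypothetical decoder block with two parallel branches whose
    # extracted features are fused into a single output.
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.branch_a = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))
        self.branch_b = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=2, dilation=2),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))
        self.fuse = nn.Conv2d(2 * out_ch, out_ch, kernel_size=1)

    def forward(self, x):
        a, b = self.branch_a(x), self.branch_b(x)
        return self.fuse(torch.cat([a, b], dim=1))  # fuse the two branches

# Usage: block = DualBranchDecoderBlock(128, 64)
#        y = block(torch.randn(1, 128, 32, 32))    # y: (1, 64, 32, 32)
```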