ZHANG Zhe, WANG Bilin, YU Zhezhou, ZHAO Fengzhi. Attention Guided Enhancement Network for Weakly Supervised Semantic Segmentation[J]. Chinese Journal of Electronics, 2023, 32(4): 896-907. DOI: 10.23919/cje.2021.00.230

Attention Guided Enhancement Network for Weakly Supervised Semantic Segmentation

  • Weakly supervised semantic segmentation using only image-level labels is important because it removes the need for expensive pixel-level annotations. Most cutting-edge methods adopt two-step solutions that learn to produce pseudo-ground-truth from image-level labels alone and then train an off-the-shelf fully supervised semantic segmentation network with these pseudo labels. Although these methods have made significant progress, they also increase model and training complexity. In this paper, we propose a one-step approach for weakly supervised image semantic segmentation, the attention guided enhancement network (AGEN), which produces pseudo-pixel-level labels under the supervision of image-level labels and trains the network to generate segmentation masks in an end-to-end manner. Specifically, we employ class activation maps (CAM) produced by different layers of the classification branch to guide the segmentation branch in learning spatial and semantic information. However, the CAM produced by lower layers captures more complete object regions but contains considerable noise. Thus, a self-attention module is proposed to adaptively enhance object regions and suppress irrelevant regions, further boosting segmentation performance. Experiments on the Pascal VOC 2012 dataset demonstrate that AGEN outperforms state-of-the-art weakly supervised semantic segmentation methods that rely exclusively on image-level labels.