DyCON: Dynamic Uncertainty-aware Consistency and Contrastive Learning for Semi-supervised Medical Image Segmentation

CVPR 2025


Maregu Assefa, Muzammal Naseer, Iyyakutti Iyappan Ganapathi, Syed Sadaf Ali, Mohamed L Seghier, Naoufel Werghi

Department of Computer Science & C2PS*, Khalifa University of Science and Technology, Abu Dhabi, UAE

Code | arXiv



Abstract


Semi-supervised learning in medical image segmentation leverages unlabeled data to reduce annotation burdens through consistency learning. However, current methods struggle with class imbalance and high uncertainty from pathology variations, leading to inaccurate segmentation in 3D medical images. To address these challenges, we present DyCON, a Dynamic Uncertainty-aware Consistency and Contrastive Learning framework that enhances the generalization of consistency methods with two complementary losses: Uncertainty-aware Consistency Loss (UnCL) and Focal Entropy-aware Contrastive Loss (FeCL). UnCL enforces global consistency by dynamically weighting the contribution of each voxel to the consistency loss based on its uncertainty, preserving high-uncertainty regions instead of filtering them out. Initially, UnCL prioritizes learning from uncertain voxels with lower penalties, encouraging the model to explore challenging regions. As training progresses, the penalty shifts toward confident voxels to refine predictions and ensure global consistency. Meanwhile, FeCL enhances local feature discrimination in imbalanced regions by introducing dual focal mechanisms and adaptive confidence adjustments into the contrastive principle. These mechanisms jointly prioritize hard positives and negatives while focusing on uncertain sample pairs, effectively capturing subtle lesion variations under class imbalance. Extensive evaluations on four diverse medical image segmentation datasets (ISLES'22, BraTS'19, LA, Pancreas) show DyCON's superior performance against SOTA methods.



The Proposed Method


...


This paper introduces DyCON, a semi-supervised learning framework addressing two critical challenges in medical image segmentation: extreme class imbalance and high uncertainty caused by pathological variability. DyCON integrates two complementary losses: Uncertainty-aware Consistency Loss (UnCL), which dynamically balances learning focus between uncertain and confident regions through entropy-based weighting, and Focal Entropy-aware Contrastive Loss (FeCL), which refines local feature discrimination by prioritizing challenging positive and negative samples under conditions of class imbalance. Together, these mechanisms ensure robust segmentation across diverse medical datasets, significantly surpassing existing state-of-the-art semi-supervised methods.
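The two losses can be sketched in PyTorch as follows. This is a minimal illustrative simplification, not the paper's exact equations: it assumes an exponential entropy-weighting scheme for UnCL and a focal-modulated InfoNCE for FeCL, and the function names (`uncl`, `fecl`) and hyperparameters (`beta`, `gamma`, `tau`) are placeholders of our choosing.

```python
import torch
import torch.nn.functional as F


def uncl(student_logits, teacher_logits, beta=1.0):
    """Entropy-weighted voxel-wise consistency (illustrative sketch of UnCL).

    High-entropy (uncertain) voxels are down-weighted in the squared-error
    term rather than filtered out; annealing `beta` over training shifts the
    penalty toward confident voxels. The additive entropy term discourages the
    model from trivially shrinking the loss by inflating its uncertainty.
    """
    student_prob = torch.softmax(student_logits, dim=1)
    teacher_prob = torch.softmax(teacher_logits, dim=1)
    # Per-voxel predictive entropy of the teacher, summed over classes.
    entropy = -(teacher_prob * torch.log(teacher_prob + 1e-8)).sum(dim=1)
    # Per-voxel squared error between student and teacher probabilities.
    mse = ((student_prob - teacher_prob) ** 2).sum(dim=1)
    weight = torch.exp(-beta * entropy)  # uncertain voxels -> smaller weight
    return (weight * mse + beta * entropy).mean()


def fecl(anchor, positive, negatives, gamma=2.0, tau=0.1):
    """Focal contrastive term (illustrative sketch of FeCL's focal idea).

    A standard InfoNCE loss whose per-sample contribution is modulated by
    (1 - p)^gamma, where p is the probability assigned to the positive:
    easy pairs (p near 1) are suppressed, hard positives dominate.
    """
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    n = F.normalize(negatives, dim=-1)
    pos_sim = (a * p).sum(dim=-1) / tau          # (B,)
    neg_sim = a @ n.t() / tau                    # (B, K)
    logits = torch.cat([pos_sim.unsqueeze(1), neg_sim], dim=1)
    log_prob = pos_sim - torch.logsumexp(logits, dim=1)  # log p(positive)
    p_pos = torch.exp(log_prob)
    return (-((1.0 - p_pos) ** gamma) * log_prob).mean()
```

In this sketch the anchor/positive/negative embeddings would come from patch-level features of the student and teacher networks; the paper additionally conditions the focal weights on sample-pair uncertainty, which is omitted here for brevity.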


Results


Extensive experiments on diverse medical segmentation datasets demonstrate DyCON's superior generalization and substantial improvements over state-of-the-art methods, particularly in segmenting scattered and challenging lesions.

DyCON Performance Comparison

Figure 1: Quantitative comparison of DyCON integrated into Mean-Teacher (MT) and Co-Training (CT) frameworks
using 10% labeled data on ISLES'22 and BraTS'19 datasets.




DyCON Segmentation Results

Figure 2: Qualitative segmentation results showing DyCON's improved accuracy in identifying lesions and tumors compared to existing methods.




DyCON Uncertainty Maps

Figure 3: Evolution of uncertainty maps generated by UnCL loss, demonstrating progressive refinement of predictions over training epochs.





Citation


If you use the findings of this research in your work, please cite our paper:
@inproceedings{assefa2025dycon,
      title={DyCON: Dynamic Uncertainty-aware Consistency and Contrastive Learning for Semi-supervised Medical Image Segmentation},
      author={Assefa, Maregu and Naseer, Muzammal and Ganapathi, Iyyakutti Iyappan and Ali, Syed Sadaf and Seghier, Mohamed L and Werghi, Naoufel},
      booktitle={CVPR},
      year={2025}
    }


Contact


For any inquiries regarding DyCON, please contact Maregu Assefa at maregu.habtie@ku.ac.ae.