TY - JOUR
T1 - MetaSeg: Content-Aware Meta-Net for Omni-Supervised Semantic Segmentation
T2 - IEEE Transactions on Neural Networks and Learning Systems
AU - Jiang, Shenwang
AU - Li, Jianan
AU - Wang, Ying
AU - Wu, Wenxuan
AU - Zhang, Jizhou
AU - Huang, Bo
AU - Xu, Tingfa
N1 - Publisher Copyright: IEEE
PY - 2023
Y1 - 2023
N2 - Noisy labels, which inevitably exist in pseudo-segmentation labels generated from weak object-level annotations, severely hamper model optimization for semantic segmentation. Previous works often rely on massive handcrafted losses and carefully tuned hyperparameters to resist noise, and thus suffer from poor generalization capability and high model complexity. Inspired by recent advances in meta-learning, we argue that rather than passively struggling to tolerate the noise hidden behind clean labels, a more feasible solution is to actively identify the noisy regions so that they can simply be ignored during model optimization. With this in mind, this work presents a novel meta-learning-based semantic segmentation method, MetaSeg, which comprises a primary content-aware meta-net (CAM-Net) serving as a noise indicator for an arbitrary segmentation model counterpart. Specifically, CAM-Net learns to generate pixel-wise weights that suppress noisy regions with incorrect pseudo-labels while highlighting clean ones, by exploiting hybrid strengthened features from image content, thereby providing straightforward and reliable guidance for optimizing the segmentation model. Moreover, to break the barrier of time-consuming training when applying meta-learning to common large segmentation models, we further present a new decoupled training strategy that optimizes different model layers in a divide-and-conquer manner. Extensive experiments on object, medical, remote sensing, and human segmentation show that our method achieves superior performance approaching that of fully supervised settings, paving a promising new way for omni-supervised semantic segmentation.
KW - Annotations
KW - Computational modeling
KW - Label noise
KW - Mathematical models
KW - Noise measurement
KW - Semantic segmentation
KW - Semantics
KW - Training
KW - meta-learning
KW - omni-supervised segmentation
UR - http://www.scopus.com/inward/record.url?scp=85153370750&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2023.3263335
DO - 10.1109/TNNLS.2023.3263335
M3 - Article
AN - SCOPUS:85153370750
SN - 2162-237X
SP - 1
EP - 13
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
ER -