Bi-Level Optimization for Semi-Supervised Learning with Pseudo-Labeling

AAAI 2025
Abstract

Semi-supervised learning (SSL) is a fundamental task in machine learning, empowering models to extract valuable insights from datasets with limited labeled samples and a large amount of unlabeled data. Pseudo-labeling is a widely used SSL approach that generates pseudo-labels for unlabeled data and leverages them as ground-truth labels during training; however, traditional pseudo-labeling techniques often produce low-quality pseudo-labels, which in turn degrades overall model performance. In this paper, we propose a novel Bi-level Optimization method for Pseudo-label Learning (BOPL) to boost semi-supervised training. It treats pseudo-labels as latent variables and optimizes the model parameters and pseudo-labels jointly within a bi-level optimization framework. By enabling direct optimization over the pseudo-labels towards maximizing the prediction model's performance, the method is expected to produce high-quality pseudo-labels. To evaluate the effectiveness of the proposed approach, we conduct extensive experiments on multiple SSL benchmarks. The experimental results show that BOPL outperforms state-of-the-art SSL techniques.
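The abstract describes treating pseudo-labels as latent variables that are optimized jointly with the model parameters in a bi-level loop. The paper's exact algorithm is not given here, but the general idea can be sketched with a toy alternating approximation: an inner loop fits model parameters on labeled plus pseudo-labeled data, and an outer step updates the soft pseudo-labels (here via a crude sharpening surrogate rather than a true hypergradient). All names, the logistic-regression model, and the sharpening rule below are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: two Gaussian blobs in 2-D; only a few points carry labels.
n_lab, n_unl = 10, 200
X_lab = np.vstack([rng.normal(-2, 1, (n_lab // 2, 2)),
                   rng.normal(2, 1, (n_lab // 2, 2))])
y_lab = np.array([0] * (n_lab // 2) + [1] * (n_lab // 2), dtype=float)
X_unl = np.vstack([rng.normal(-2, 1, (n_unl // 2, 2)),
                   rng.normal(2, 1, (n_unl // 2, 2))])

w = np.zeros(2)               # model parameters (logistic regression)
q = np.full(n_unl, 0.5)       # soft pseudo-labels, treated as latent variables

def grad(w, X, y):
    # Gradient of mean binary cross-entropy for logistic regression.
    return X.T @ (sigmoid(X @ w) - y) / len(y)

for outer in range(20):
    # Inner problem: fit parameters on labeled + pseudo-labeled data
    # (pseudo-labeled term down-weighted by 0.5, an arbitrary choice here).
    for _ in range(50):
        g = grad(w, X_lab, y_lab) + 0.5 * grad(w, X_unl, q)
        w -= 0.5 * g
    # Outer step (approximate): move pseudo-labels toward sharpened model
    # predictions -- a surrogate for directly optimizing them against
    # labeled-set performance, which would require a hypergradient.
    p = sigmoid(X_unl @ w)
    q = p ** 2 / (p ** 2 + (1 - p) ** 2)

# True classes of the unlabeled points are known only because the data is synthetic.
acc = np.mean((sigmoid(X_unl @ w) > 0.5) == (np.arange(n_unl) >= n_unl // 2))
print(f"unlabeled accuracy: {acc:.2f}")
```

A full bi-level treatment would differentiate the labeled-set loss through the inner optimization to obtain an update direction for `q`; the sharpening step above is only a cheap stand-in for that outer gradient.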
