AUTHOR=Liu Na, Wang Jiaqi, Zhu Yongtong, Wan Lihong, Li Qingdu TITLE=Improving imbalance classification via ensemble learning based on two-stage learning JOURNAL=Frontiers in Computational Neuroscience VOLUME=17 YEAR=2024 URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2023.1296897 DOI=10.3389/fncom.2023.1296897 ISSN=1662-5188 ABSTRACT=

The excellent performance of deep neural networks on image classification tasks depends on large-scale, high-quality datasets. However, datasets collected from the real world are typically biased in their distribution, which leads to a sharp decline in model performance, mainly because an imbalanced distribution results in prior shift and covariate shift. Recent studies have typically used a two-stage learning method consisting of two rebalancing strategies to solve these problems, but some combinations of rebalancing strategies damage the representational ability of the network. In addition, two-stage learning is of little help in addressing covariate shift. To solve these two issues, we first propose a sample logit-aware reweighting method (SLA), which not only corrects the weights of hard majority-class samples and minority-class samples but also integrates with logit adjustment to form a stable two-stage learning strategy. Second, to solve the covariate shift problem, and inspired by ensemble learning, we propose a multi-domain expert specialization model, which achieves a more comprehensive decision by averaging the classification results of experts from multiple domains. Finally, we combine SLA and logit adjustment into a two-stage learning method and apply our model to the CIFAR-LT and ImageNet-LT datasets. Our experimental results show excellent performance compared with state-of-the-art methods.
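The abstract combines two standard building blocks: logit adjustment to counteract the prior shift of a long-tailed class distribution, and an ensemble of experts whose outputs are averaged for the final decision. The sketch below is a minimal, hypothetical illustration of only these two generic components; the SLA reweighting itself is not shown, and all names (`ExpertEnsemble`, `logit_adjusted_loss`, `tau`, `class_priors`) are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExpertEnsemble(nn.Module):
    """Hypothetical multi-expert head: each expert is a linear classifier
    over shared backbone features; the ensemble decision is the average
    of the per-expert logits."""
    def __init__(self, feat_dim: int, num_classes: int, num_experts: int = 3):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Linear(feat_dim, num_classes) for _ in range(num_experts)
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Average the per-expert logits to obtain the ensemble decision.
        logits = torch.stack([expert(features) for expert in self.experts], dim=0)
        return logits.mean(dim=0)

def logit_adjusted_loss(logits, targets, class_priors, tau: float = 1.0):
    """Standard logit adjustment: add tau * log(prior) to the logits before
    cross-entropy, which compensates for the skewed class prior of a
    long-tailed training set."""
    adjusted = logits + tau * torch.log(class_priors).unsqueeze(0)
    return F.cross_entropy(adjusted, targets)

# Usage sketch: backbone features -> expert ensemble -> logit-adjusted loss.
feat_dim, num_classes, batch = 128, 10, 32
features = torch.randn(batch, feat_dim)            # stand-in for backbone output
targets = torch.randint(0, num_classes, (batch,))
class_priors = torch.full((num_classes,), 1.0 / num_classes)  # in practice, empirical class frequencies

head = ExpertEnsemble(feat_dim, num_classes)
loss = logit_adjusted_loss(head(features), targets, class_priors)
loss.backward()
```

In a two-stage schedule of the kind the abstract describes, a reweighted loss would typically drive representation learning in the first stage, with logit adjustment applied when the classifier is refined in the second stage; how SLA assigns the per-sample weights is specific to the paper and not reproduced here.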