RSG: A Simple yet Effective Module for Learning Imbalanced Datasets
Jianfeng Wang, Thomas Lukasiewicz, Xiaolin Hu, Jianfei Cai, and Zhenghua Xu
Imbalanced datasets are common in practice and pose a great challenge for training deep neural networks that generalize well to infrequent classes. In this work, we propose a new rare-class sample generator (RSG) to mitigate this problem. RSG generates new samples for rare classes during training and has, in particular, the following advantages: (1) it is convenient to use and highly versatile, because it can be easily integrated into any kind of convolutional neural network and works well when combined with different loss functions, and (2) it is only used during the training phase, so it imposes no additional burden on deep neural networks during the testing phase. Furthermore, we propose a new loss function, the maximized vector loss, to optimize RSG. In extensive experimental evaluations, we verify the effectiveness of RSG. Moreover, by leveraging RSG, we obtain new state-of-the-art results on three public benchmarks, namely, Imbalanced CIFAR, Places-LT, and iNaturalist 2018.
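To make the overall pattern concrete, the following is a minimal sketch of training-only rare-class augmentation as described at a high level in the abstract: extra samples are synthesized for rare classes during training and discarded at test time. The specific synthesis rule used here (interpolating a rare-class sample toward its class center plus Gaussian noise) is an illustrative assumption, not the paper's actual RSG mechanism, and all function and parameter names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_rare_samples(feats, labels, rare_classes, n_new=2, noise=0.05):
    """Training-only augmentation: synthesize extra feature vectors for
    rare classes. The synthesis rule (class-center interpolation plus
    Gaussian noise) is an illustrative stand-in, not the paper's RSG."""
    new_feats, new_labels = [], []
    for c in rare_classes:
        class_feats = feats[labels == c]
        if len(class_feats) == 0:
            continue
        center = class_feats.mean(axis=0)
        for _ in range(n_new):
            # Start from a real rare-class sample and pull it toward the
            # class center, then perturb it slightly.
            base = class_feats[rng.integers(len(class_feats))]
            alpha = rng.uniform(0.5, 1.0)
            sample = alpha * base + (1 - alpha) * center
            sample += noise * rng.standard_normal(sample.shape)
            new_feats.append(sample)
            new_labels.append(c)
    return np.array(new_feats), np.array(new_labels)

# Toy batch: class 0 is frequent (8 samples), class 1 is rare (2 samples).
feats = rng.standard_normal((10, 4))
labels = np.array([0] * 8 + [1] * 2)
extra_x, extra_y = generate_rare_samples(feats, labels, rare_classes=[1])
print(extra_x.shape, extra_y)  # (2, 4) [1 1]
```

At test time no such generation step runs, which matches the abstract's point that RSG adds no inference-time overhead.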