BBW: a batch balance wrapper for training deep neural networks on extremely imbalanced datasets with few minority samples

Jingzhao Hu, Hao Zhang, Yang Liu, Richard Sutcliffe, Jun Feng

Research output: Contribution to journal › Article › peer-review

Abstract

In recent years, Deep Neural Networks (DNNs) have achieved excellent performance on many tasks, but it remains very difficult to train good models on imbalanced datasets. Creating balanced batches, either by down-sampling the majority class or by up-sampling the minority class, can solve the problem in certain cases; however, either approach may lead to an unstable learning process and overfitting. In this paper, we propose the Batch Balance Wrapper (BBW), a novel framework which can adapt a general DNN so that it trains well on extremely imbalanced datasets with few minority samples. In BBW, two extra network layers are added at the start of a DNN. These layers prevent overfitting to the minority samples and improve the expressiveness of their sample distribution. Furthermore, Batch Balance (BB), a class-based sampling algorithm, is proposed to ensure that the samples in each batch remain balanced throughout the learning process. We test BBW on three well-known extremely imbalanced datasets with few minority samples. The maximum imbalance ratio reaches 1167:1, with only 16 positive samples. Compared with existing approaches, BBW achieves better classification performance. In addition, BBW-wrapped DNNs train 16.39 times faster than unwrapped DNNs. Moreover, BBW does not require data preprocessing or additional hyper-parameter tuning, operations that may require additional processing time. The experiments show that BBW can be applied to common applications involving extremely imbalanced data with few minority samples, such as the classification of EEG signals, medical images, and so on.
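The core idea of class-based balanced sampling, drawing an equal number of samples per class for every batch and up-sampling minority classes with replacement, can be sketched as follows. This is an illustrative sketch only, not the authors' implementation of BB; the function name `batch_balance_sampler` and its parameters are hypothetical.

```python
import random

def batch_balance_sampler(labels, batch_size, rng=None):
    """Yield lists of dataset indices, with equal samples per class in each batch.

    Minority classes are drawn with replacement, so every batch stays
    balanced even when one class has only a handful of samples.
    """
    rng = rng or random.Random(0)
    # Group dataset indices by class label
    by_class = {}
    for idx, y in enumerate(labels):
        by_class.setdefault(y, []).append(idx)
    classes = sorted(by_class)
    per_class = batch_size // len(classes)
    # One "epoch" covers the largest class roughly once
    n_batches = max(len(v) for v in by_class.values()) // per_class
    for _ in range(n_batches):
        batch = []
        for c in classes:
            # Sampling with replacement up-samples small classes automatically
            batch.extend(rng.choices(by_class[c], k=per_class))
        rng.shuffle(batch)
        yield batch

# Example: 10 negatives and a single positive; every batch of 4
# still contains 2 samples of each class.
labels = [0] * 10 + [1]
batches = list(batch_balance_sampler(labels, batch_size=4))
```

In this toy run, index 10 (the lone positive) appears twice in every batch, which mirrors the failure mode the abstract warns about: naive up-sampling repeats the same few minority samples, which is why BBW adds the extra layers to reduce overfitting to them.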

Original language: English
Pages (from-to): 6723-6738
Number of pages: 16
Journal: Applied Intelligence
Volume: 52
Issue number: 6
DOIs
Publication status: Published - Apr 2022
Externally published: Yes

Keywords

  • Batch balance wrapper framework
  • Deep learning
  • Deep neural networks
  • Imbalanced dataset

