Incremental Focal ENsemble for multi-class Imbalanced Learning (FENIL)

Article Type:
Research/Original Article (accredited journal ranking)
Abstract:

Convolutional neural networks (CNNs) are among the most popular machine learning models for data classification. Despite their significant success, they do not produce acceptable results when working with imbalanced data. Imbalanced learning is one of the most challenging issues in machine learning: in such problems, the samples of one or more classes greatly outnumber the others, or the costs of misclassification differ across classes, whereas CNNs assume that class distributions and misclassification costs are equal. Ensemble methods are a popular approach to imbalanced data sets; by combining several base estimators they can achieve high accuracy and, compared with a single estimator, improve the reliability of the model. In this research, we introduce an ensemble learning method for convolutional neural networks that uses a cascade of CNNs to handle imbalanced data. Each CNN is trained with the focal loss function, in which the gamma parameter controls the relative importance of hard and easy samples. CNNi+1 gives less importance to easy samples than CNNi does; this is achieved by increasing gamma step by step (γi+1 > γi). In the proposed FENIL ensemble network (Incremental Focal Ensemble method for multi-class Imbalanced Learning), the weights of the training data for CNNi+1 are determined by the classification results of the previous network, CNNi. The combination of all CNNs is used to classify new data. We applied the proposed FENIL network to several benchmark data sets. The results show that FENIL not only achieves much higher accuracy and F1-score (18.63 and 19.61 points higher, respectively) than non-deep methods such as decision-tree AdaBoost, but also obtains better results than other common deep methods for imbalanced learning.
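
As a hedged illustration of the method described in the abstract, the sketch below shows, in PyTorch (a framework the abstract does not specify), a multi-class focal loss whose gamma parameter down-weights easy samples, a cascade loop that trains one CNN per increasing gamma value while re-weighting the samples the previous member misclassified, and an averaging rule that combines all members at prediction time. The names FocalLoss, compute_error_weights, train_cascade, and ensemble_predict, the gamma schedule, and the re-weighting rule are illustrative assumptions, not the paper's implementation.

# A minimal sketch, not the authors' code; see the hedging note above.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FocalLoss(nn.Module):
    """Multi-class focal loss: FL(p_t) = -(1 - p_t)^gamma * log(p_t)."""

    def __init__(self, gamma: float = 2.0):
        super().__init__()
        self.gamma = gamma

    def forward(self, logits, targets, sample_weights=None):
        # log-probability of the true class for every sample
        log_pt = F.log_softmax(logits, dim=1).gather(1, targets.unsqueeze(1)).squeeze(1)
        pt = log_pt.exp()
        loss = -((1.0 - pt) ** self.gamma) * log_pt  # easy samples (p_t near 1) are down-weighted
        if sample_weights is not None:
            loss = loss * sample_weights  # per-sample weights set from the previous member's errors
        return loss.mean()


def compute_error_weights(model, loader, device, boost=2.0):
    # Give a higher weight to every sample this member misclassifies
    # (the exact re-weighting rule is not given in the abstract; boost=2.0 is an assumption).
    weights = torch.ones(len(loader.dataset))
    model.eval()
    with torch.no_grad():
        for x, y, idx in loader:
            pred = model(x.to(device)).argmax(dim=1).cpu()
            weights[idx[pred != y]] = boost
    model.train()
    return weights


def train_cascade(make_cnn, loader, gammas=(0.5, 1.0, 2.0, 5.0), epochs=5, device="cpu"):
    # Train one CNN per gamma value; each successive member uses a larger gamma
    # and sample weights derived from the mistakes of the member before it.
    members, sample_weights = [], None
    for gamma in gammas:
        model = make_cnn().to(device)
        criterion = FocalLoss(gamma=gamma)
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        for _ in range(epochs):
            for x, y, idx in loader:  # loader is assumed to yield (image, label, sample index)
                x, y = x.to(device), y.to(device)
                w = None if sample_weights is None else sample_weights[idx].to(device)
                optimizer.zero_grad()
                criterion(model(x), y, w).backward()
                optimizer.step()
        sample_weights = compute_error_weights(model, loader, device)
        members.append(model)
    return members


def ensemble_predict(members, x):
    # Combine all cascade members by averaging their softmax outputs.
    with torch.no_grad():
        probs = torch.stack([F.softmax(m(x), dim=1) for m in members]).mean(dim=0)
    return probs.argmax(dim=1)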

Language:
Persian
Published:
Journal of Command and Control Communications Computer Intelligence, Volume:6 Issue: 2, 2022
Pages:
60 to 77
https://www.magiran.com/p2579558  
Authors
  • Shirazi, Hossein
    Corresponding Author
    Professor, Faculty of Electrical and Computer, Malek-Ashtar University of Technology, Tehran, Iran
  • Dadashtabar Ahmadi, Kourosh
    Author
    Assistant Professor, AI, Malek-Ashtar University of Technology, Tehran, Iran