Publication: Three-Layer Stacked Generalization Architecture With Simulated Annealing for Optimum Results in Data Mining
Type
Article
Date
2021-07-01
Publisher
IGI Global
Abstract
Combining different machine learning models into a single prediction model usually improves
the performance of data analysis. Stacking ensembles are one such approach for building a high-performance classifier that can be applied to various data mining contexts. This study proposes
an enhanced stacking ensemble that combines several machine learning algorithms with two layers of
meta-classification to address the limitations of existing stacking architectures, and uses a simulated
annealing algorithm to optimize the classifier configuration for the best prediction
accuracy. The proposed method significantly outperformed three conventional two-layer stacking
ensembles built with the same meta-classifiers used in the proposed architecture. These
assessments were statistically significant at the 95% confidence level. The novel stacking ensemble
also outperformed existing ensemble methods, namely AdaBoost, gradient boosting,
XGBoost, and bagging classifiers.
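The abstract describes using simulated annealing to search for the classifier configuration that maximizes prediction accuracy. The sketch below illustrates that general idea only; it is not the paper's implementation. The learner names, the toy scoring function, and the cooling schedule are all illustrative assumptions, with a stand-in for the cross-validated accuracy a real three-layer stack would return.

```python
import math
import random

random.seed(0)

# Hypothetical pool of base learners; names are illustrative, not the paper's.
BASE_LEARNERS = ["dt", "nb", "knn", "svm", "lr"]

def evaluate(config):
    """Stand-in for the cross-validated accuracy of a stacked ensemble
    built from `config` (a frozenset of base-learner names). A real
    implementation would train the three-layer stack and return its
    validation accuracy."""
    score = 0.6 + 0.05 * len(config)       # toy scoring: larger is better
    if "dt" in config and "nb" in config:  # pretend these two complement
        score += 0.04
    return min(score, 0.95)

def neighbor(config):
    """Flip one learner in or out of the ensemble."""
    pick = random.choice(BASE_LEARNERS)
    new = set(config)
    new.symmetric_difference_update({pick})
    return frozenset(new) if new else config  # keep at least one learner

def simulated_annealing(steps=500, t0=1.0, cooling=0.99):
    current = frozenset(random.sample(BASE_LEARNERS, 2))
    cur_score = evaluate(current)
    best, best_score = current, cur_score
    t = t0
    for _ in range(steps):
        cand = neighbor(current)
        cand_score = evaluate(cand)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability so the search can escape local optima.
        if cand_score >= cur_score or random.random() < math.exp(
            (cand_score - cur_score) / t
        ):
            current, cur_score = cand, cand_score
            if cur_score > best_score:
                best, best_score = current, cur_score
        t *= cooling  # geometric cooling schedule
    return best, best_score

best_config, best_acc = simulated_annealing()
print(sorted(best_config), round(best_acc, 3))
```

In the paper's setting, `evaluate` would be the expensive step (training and validating the full stacked generalizer), which is exactly why a lightweight metaheuristic such as simulated annealing is attractive for exploring the configuration space.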
Keywords
Classifier, Ensemble, Hyperparameter, Simulated Annealing, Stacked Generalization
Citation
Kasthuriarachchi, K. T., & Liyanage, S. R. (2021). Three-Layer Stacked Generalization Architecture With Simulated Annealing for Optimum Results in Data Mining. International Journal of Artificial Intelligence and Machine Learning (IJAIML), 11(2), 1-27. http://doi.org/10.4018/IJAIML.20210701.oa10
