Faculty of Computing
Permanent URI for this community: https://rda.sliit.lk/handle/123456789/4202
Search Results (2 results)
Publication (Open Access): Three-Layer Super Learner Ensemble with Hyperparameter Optimization to Improve the Performance of Machine Learning Models (Faculty of Technology, USJ, 2021-03-13) Kasthuriarachchi, K. T. S.; Liyanage, S. R.

Combining different machine learning models into a super learner can lead to improved predictions in many domains. The super learner ensemble discussed in this study collates several machine learning models and aims to enhance performance by considering both the accuracy of the final meta-model and the prediction time. An algorithm is proposed to rate machine learning models derived by combining base classifiers voted with different weights; it is named the Log Loss Weighted Super Learner Model (LLWSL). Based on the voted weights, the optimal model is selected and the derived machine learning method is identified. The meta-learner of the super learner then uses these models after tuning their hyperparameters. Execution time and model accuracy were evaluated by running the LLWSL algorithm on two separate datasets inside LMSSLIITD, extracted from the educational industry. According to the outcome of the evaluation, the proposed LLWSL algorithm shows a significant improvement, making it suitable for machine learning tasks that demand better performance.

Publication (Open Access): Three-Layer Stacked Generalization Architecture With Simulated Annealing for Optimum Results in Data Mining (IGI Global, 2021-07-01) Kasthuriarachchi, K. T. S.; Liyanage, S. R.

Combining different machine learning models into a single prediction model usually improves the performance of data analysis. Stacking ensembles are one such approach to building a high-performance classifier that can be applied to various data mining contexts.
This study proposes an enhanced stacking ensemble that collates several machine learning algorithms with two layers of meta-classification, addressing the limitations of the existing stacking architecture by using a simulated annealing algorithm to optimize the classifier configuration for the best prediction accuracy. The proposed method significantly outperformed three general two-layer stacking ensembles built with the same meta-classifiers used in the proposed architecture; these results were statistically validated at the 95% confidence level. The novel stacking ensemble also outperformed the existing ensemble methods AdaBoost, gradient boosting, the XGBoost classifier, and bagging classifiers.
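The core idea of the second paper, using simulated annealing to search over stacking-classifier configurations, can be sketched in pure Python. The search space, score table, and function names below are illustrative stand-ins (the paper's actual base learners, meta-classifiers, and datasets are not reproduced); in practice `evaluate` would return a cross-validated accuracy of the assembled stacking ensemble.

```python
import math
import random

# Hypothetical search space: one choice per base-learner slot plus a
# meta-learner. These names and options are illustrative only.
SEARCH_SPACE = {
    "base_1": ["dt", "knn", "nb"],
    "base_2": ["svm", "rf", "lr"],
    "meta": ["lr", "gb"],
}


def evaluate(config):
    """Stand-in for the cross-validated accuracy of a stacked ensemble.

    A real implementation would fit the configured stacking classifier
    and score it; here a fixed toy table keeps the sketch self-contained.
    """
    score_table = {"dt": 0.70, "knn": 0.72, "nb": 0.68,
                   "svm": 0.74, "rf": 0.78, "lr": 0.73, "gb": 0.76}
    base = (score_table[config["base_1"]] + score_table[config["base_2"]]) / 2
    return 0.6 * base + 0.4 * score_table[config["meta"]]


def neighbour(config):
    """Propose a neighbouring configuration by mutating one random slot."""
    new = dict(config)
    slot = random.choice(list(SEARCH_SPACE))
    new[slot] = random.choice(SEARCH_SPACE[slot])
    return new


def simulated_annealing(steps=500, t0=1.0, cooling=0.99):
    """Anneal over configurations, returning the best one found."""
    current = {slot: random.choice(opts) for slot, opts in SEARCH_SPACE.items()}
    score = evaluate(current)
    best, best_score = current, score
    temp = t0
    for _ in range(steps):
        cand = neighbour(current)
        cand_score = evaluate(cand)
        # Always accept improvements; accept worse configurations with a
        # Boltzmann probability that shrinks as the temperature cools.
        if cand_score >= score or random.random() < math.exp((cand_score - score) / temp):
            current, score = cand, cand_score
            if score > best_score:
                best, best_score = current, score
        temp *= cooling
    return best, best_score
```

The cooling schedule lets the search escape local optima early on while converging to a near-optimal configuration late in the run, which is the property that motivates annealing over a plain greedy search of ensemble configurations.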
