
International Journal of Soft Computing

ISSN (Print): 1816-9503

Improving Classification Performance by Using Feature Selection with Resampling

Raya Ismail, Sherihan Abuelenin and Ahmed Aboelfetouh
Page: 255-269 | Received 21 Sep 2022, Published online: 21 Sep 2022


Abstract

Feature selection methods aim to identify the most relevant features for classification and can be categorized as either subset selection (wrapper) methods or ranking (filter) methods. The main purpose of this study is to show that a feature selection preprocessing step can enhance classifier performance by eliminating redundant features. The proposed method consists of three stages: the first refines the sample space by resample filtering, the second reduces the feature space by applying a subset evaluation algorithm and the third measures the goodness of the resulting set of features using different classifiers. Two experiments were carried out on data sets from the UCI repository. The proposed method is evaluated by measuring the accuracy, number of selected features, precision, recall, F-measure, ROC area, time to build the model, error rate and relative absolute error. Tests are performed on two main types of classifiers: Naïve Bayes and its variants (NBTree, NBNet), and J48 together with other tree classifiers (Random Forest, BFTree).
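The following is a minimal sketch of the three-stage pipeline described in the abstract, not the authors' implementation. It assumes scikit-learn stands in for the tools used in the paper: a bootstrap resample approximates the resample filter, mutual-information ranking stands in for the subset evaluation algorithm, and GaussianNB and DecisionTreeClassifier stand in for Naïve Bayes and J48.

```python
# Sketch of: (1) resample the sample space, (2) reduce the feature space,
# (3) evaluate the selected features with different classifiers.
# Assumptions: scikit-learn substitutes for the paper's tooling; the
# breast-cancer data set is used as a placeholder UCI-style data set.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)
from sklearn.utils import resample

# Stage 1: refine the sample space by resampling (bootstrap with replacement).
X, y = load_breast_cancer(return_X_y=True)
X, y = resample(X, y, replace=True, n_samples=len(y), random_state=42)

# Stage 2: reduce the feature space (here: keep the 10 highest-scoring features).
selector = SelectKBest(mutual_info_classif, k=10)
X_sel = selector.fit_transform(X, y)

# Stage 3: measure the goodness of the selected features with different classifiers.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=42)
for clf in (GaussianNB(), DecisionTreeClassifier(random_state=42)):
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print(type(clf).__name__,
          "acc=%.3f" % accuracy_score(y_te, pred),
          "prec=%.3f" % precision_score(y_te, pred),
          "rec=%.3f" % recall_score(y_te, pred),
          "f1=%.3f" % f1_score(y_te, pred),
          "auc=%.3f" % roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```

In this sketch the same evaluation metrics listed in the abstract (accuracy, precision, recall, F-measure, ROC area) are reported per classifier; the specific feature scorer, number of retained features and data set are illustrative choices only.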


How to cite this article:

Raya Ismail, Sherihan Abuelenin and Ahmed Aboelfetouh. Improving Classification Performance by Using Feature Selection with Resampling.
DOI: https://doi.org/10.36478/ijscomp.2016.255.269
URL: https://www.makhillpublications.co/view-article/1816-9503/ijscomp.2016.255.269