Feature Selection with Mutual Information for Regression Problems

Muhammad Aliyu, Sulaiman and Jane, Labadin (2015) Feature Selection with Mutual Information for Regression Problems. In: 2015 9th International Conference on IT in Asia (CITA) : Transforming Big Data into Knowledge, 4-5 August 2015, Kuching, Sarawak Malaysia.


Abstract

Selecting relevant features for machine learning modeling improves the performance of the learning methods. Mutual information (MI) is a well-established criterion for selecting feature subsets from an input dataset that has a nonlinear relationship with the target attribute. However, MI-based feature selection suffers from the following limitations: the MI estimator depends on smoothing parameters; greedy feature-selection methods lack theoretically justified stopping criteria; and, although in theory MI can be used for both classification and regression problems, in practice its formulation is more often limited to classification problems. This paper investigates a proposed improvement addressing these three limitations of the mutual information estimator, through the use of resampling techniques and a formulation of mutual information based on differential entropy for regression problems.
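To make the limitations concrete, the following is a minimal, illustrative sketch (not the authors' method) of MI-based feature ranking for a regression target, using a simple histogram (binned) MI estimator in plain numpy. The `bins` argument is the smoothing parameter the abstract refers to, and the fixed subset size `k` stands in for the ad hoc stopping rules noted above; all function names and the synthetic data are hypothetical.

```python
import numpy as np

def mi_histogram(x, y, bins=10):
    """Histogram (binned) plug-in estimate of mutual information I(x; y).
    `bins` is a smoothing parameter: the estimate changes with the bin count,
    which is one of the limitations discussed in the abstract."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                    # joint distribution estimate
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    nz = pxy > 0                                 # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_by_mi(X, y, k, bins=10):
    """Rank features by univariate MI with the regression target y and keep
    the top k. Stopping after a fixed k is an ad hoc rule -- the abstract
    notes that such greedy schemes lack a theoretically justified criterion."""
    scores = [mi_histogram(X[:, j], y, bins) for j in range(X.shape[1])]
    return sorted(np.argsort(scores)[::-1][:k].tolist())

# Synthetic regression data: features 0 and 2 are informative, 1 is noise.
rng = np.random.default_rng(0)
n = 2000
x0 = rng.normal(size=n)                 # relevant, nonlinear effect on y
x1 = rng.normal(size=n)                 # irrelevant noise feature
x2 = rng.normal(size=n)                 # relevant, linear effect on y
X = np.column_stack([x0, x1, x2])
y = np.sin(x0) + x2 + 0.1 * rng.normal(size=n)

print(select_by_mi(X, y, k=2))          # expected to keep the informative pair
```

A kernel density or k-nearest-neighbour estimator could replace the histogram here, but each simply trades one smoothing parameter (bin count) for another (bandwidth or k), which is why the paper turns to resampling and a differential-entropy formulation instead.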

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Mutual Information; Feature Selection; Regression Problems, unimas, university, universiti, Borneo, Malaysia, Sarawak, Kuching, Samarahan, ipta, education, research, Universiti Malaysia Sarawak
Subjects: T Technology > T Technology (General)
Divisions: Academic Faculties, Institutes and Centres > Faculty of Computer Science and Information Technology
Depositing User: Karen Kornalius
Date Deposited: 08 Sep 2016 19:31
Last Modified: 14 Feb 2017 05:47
URI: http://ir.unimas.my/id/eprint/13447
