Feature selection is the process of selecting a subset of relevant features for building robust learning models. In this paper, we develop a data dimensionality reduction approach based on the sparsified singular value decomposition (SSVD) technique to identify and remove trivial features before applying any advanced feature selection algorithm. First, we investigate how SSVD can be used to identify and remove nonessential features so as to improve feature selection performance. Second, we analyze the approach's application limitations and computational complexity. Finally, we conduct a set of experiments; the empirical results show that applying feature selection techniques to data from which the nonessential features have been removed by this dimensionality reduction approach generally yields better performance with significantly reduced computing time.
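The abstract does not specify the exact SSVD procedure, but the idea of screening out nonessential features before feature selection can be sketched as follows. This is a minimal illustration, assuming "sparsification" means hard-thresholding small loadings in the leading right singular vectors; the function name `ssvd_screen` and all parameter choices are hypothetical, not taken from the paper.

```python
# Hedged sketch of SSVD-style feature screening (assumed interpretation,
# not the paper's exact algorithm): sparsify the top-k right singular
# vectors by hard-thresholding, then drop features with all-zero loadings.
import numpy as np

def ssvd_screen(X, k=2, threshold=0.1):
    """Return indices of features retained after SSVD-style screening.

    X         : (n_samples, n_features) data matrix
    k         : number of leading singular vectors to keep
    threshold : loadings with absolute value below this are zeroed
    """
    # Center the data so the SVD reflects variance structure.
    Xc = X - X.mean(axis=0)
    # Economy-size SVD: rows of Vt are right singular vectors (feature loadings).
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:k].T.copy()               # (n_features, k) loading matrix
    V[np.abs(V) < threshold] = 0.0    # sparsification step
    # A feature is "nonessential" if it has zero loading in every kept component.
    scores = np.abs(V).sum(axis=1)
    return np.nonzero(scores > 0)[0]

rng = np.random.default_rng(0)
signal = rng.normal(size=(100, 3))        # 3 informative features
noise = 0.01 * rng.normal(size=(100, 5))  # 5 near-constant trivial features
X = np.hstack([signal, noise])
kept = ssvd_screen(X, k=3, threshold=0.1)
print(kept)
```

On this synthetic data the near-constant noise columns receive negligible loadings in the leading components and are screened out, leaving only the informative features for a downstream feature selection algorithm.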
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Number of pages: 8
State: Published - Sep 3 2014
Event: 2014 International Joint Conference on Neural Networks, IJCNN 2014 - Beijing, China
Duration: Jul 6 2014 → Jul 11 2014
Bibliographical note: Publisher Copyright © 2014 IEEE.
ASJC Scopus subject areas
- Artificial Intelligence