GRD Journals | Global Research and Development Journal for Engineering | International Conference on Innovations in Engineering and Technology (ICIET) - 2016 | July 2016
e-ISSN: 2455-5703
Hybrid Dimensionality Reduction Method Using Kaiser Component Analysis and Independent Component Analysis
1K. Arunasakthi 2J. Sulthan Alikhan
1,2Assistant Professor
1,2K.L.N. College of Engineering, Pottapalayam, Sivagangai 630612, India

Abstract
Dimensionality reduction is the process of extracting the most relevant information from data. Conventional dimensionality reduction methods fall into two categories: standalone methods and hybrid methods. A standalone method reduces the dimensions based on a single criterion, whereas a hybrid method combines two or more criteria. In this paper, we propose a new hybrid dimensionality reduction method using Kaiser Component Analysis (KCA) and Independent Component Analysis (ICA). Kaiser Component Analysis extracts uncorrelated components, while Independent Component Analysis maximizes the independence among the data. The hybrid method combining KCA and ICA therefore achieves both uncorrelatedness and independence in the extracted information, and the reduced data is applied to SVM classification. The result improves the accuracy of the classification.

Keyword- Dimensionality Reduction, Kaiser Component Analysis, Independent Component Analysis, Standalone and Hybrid Methods, SVM Classification
__________________________________________________________________________________________________
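To make the proposed pipeline concrete, the sketch below illustrates one way the hybrid scheme described above could be realized, assuming that Kaiser Component Analysis is implemented as PCA whose components are retained by the Kaiser eigenvalue-greater-than-one rule, followed by FastICA and an SVM classifier from scikit-learn. The dataset and parameter choices are placeholders for illustration and are not taken from the paper.

```python
# Minimal sketch of a KCA + ICA + SVM pipeline (illustrative only).
# Assumption: "Kaiser Component Analysis" is treated here as PCA whose
# components are retained by the Kaiser rule (eigenvalue > 1 on
# standardized data); the dataset is a stand-in, not the one used in
# the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA, FastICA
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Standardize so that the PCA eigenvalues refer to the correlation matrix.
scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Step 1: KCA -- keep the principal components whose eigenvalue exceeds 1.
pca = PCA().fit(X_train_s)
k = int(np.sum(pca.explained_variance_ > 1.0))  # Kaiser criterion
Z_train = pca.transform(X_train_s)[:, :k]
Z_test = pca.transform(X_test_s)[:, :k]

# Step 2: ICA -- rotate the retained components toward statistical independence.
ica = FastICA(n_components=k, random_state=0, max_iter=1000)
S_train = ica.fit_transform(Z_train)
S_test = ica.transform(Z_test)

# Step 3: SVM classification on the reduced, independent features.
clf = SVC(kernel="rbf").fit(S_train, y_train)
print("Reduced dimensionality:", k)
print("Test accuracy:", accuracy_score(y_test, clf.predict(S_test)))
```

In this sketch the number of retained components k is fixed entirely by the Kaiser rule, so no dimensionality parameter needs to be tuned by hand; the ICA step then operates only on the uncorrelated components that survive that rule.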
I. INTRODUCTION

A. Dimensionality Reduction
Dimensionality reduction is the process of extracting the essential information from data. High-dimensional data can be represented in a more condensed form with much lower dimensionality, which both improves classification accuracy and reduces computational complexity. Due to the increasing demand for high-dimensional data analysis in applications such as electrocardiogram signal analysis and content-based image retrieval, dimensionality reduction has become a viable means of providing a robust data representation in a relatively low-dimensional space [2]. It is an important pre-processing step in many applications of data mining, machine learning, and pattern recognition because of the so-called curse of dimensionality.

B. Need for Dimensionality Reduction
High-dimensional datasets present many mathematical challenges. One of the problems with such datasets is that, in many cases, not all of the measured variables are important for understanding the problem. The main purpose of feature selection is to reduce the number of features used in classification while maintaining acceptable classification accuracy: less discriminatory features are eliminated, leaving a subset of the original features that retains sufficient information to discriminate well among the classes. Feature extraction is a more general approach in which the original set of features is transformed to produce a new set of features. In mathematical terms, the problem we investigate can be stated as follows: given the p-dimensional random variable X = (x1, x2, …, xp), find a lower-dimensional representation of it, S = (s1, s2, …, sk) with k < p, that captures the content of the original data according to some criterion.

C. Dimensionality Reduction Methods
Dimensionality reduction reduces the number of variables in order to improve the performance of classification. High-dimensional data is a major problem in many applications because it increases complexity and execution time. Conventionally, dimensionality reduction methods are categorized into two groups: standalone methods and hybrid methods. In a standalone method, dimensionality reduction is carried out using a single criterion, whereas in a hybrid approach it is achieved based on two or more criteria. A number of techniques are available for dimensionality reduction, and each technique reduces the dimensions of the data based on a particular criterion. In recent years, Principal Component Analysis (PCA) and Linear Discriminant Analysis