Improved Robust PCA Component Analysis with Correntropy Criteria and Expectation Maximization
INTRODUCTION
Principal Component Analysis (PCA) is a linear transform that plays a major role in image processing. It is widely used to represent high-dimensional data in pattern recognition and, more broadly, in signal and image processing as a technique for data compression and dimensionality reduction. However, PCA suffers from certain limitations. It assumes that the directions with the largest variance are the most informative. It considers only orthogonal transformations (rotations) of the original variables. Moreover, PCA is based entirely on the mean vector and the covariance matrix of the data, so distributions other than the multivariate normal are not fully characterized by the statistics it uses.
1.1 PCA – Application
The number of principal components is less than or equal to the number of original variables. The principal components are guaranteed to be independent only if the data set is jointly normally distributed. PCA is sensitive to the relative scaling of the original variables. It is mostly used as a tool in exploratory data analysis and for building predictive models. PCA can be computed by an eigenvalue decomposition of the data covariance matrix or by a singular value decomposition (SVD) of the data matrix, usually after mean-centering the data for each attribute.
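As a minimal sketch of the SVD route just described (the function name and the random data are illustrative, not from the original report):

```python
import numpy as np

def pca_svd(X, n_components):
    """PCA via singular value decomposition of the mean-centered data matrix.

    X: (n_samples, n_features) data matrix.
    Returns the component scores and the principal directions.
    """
    # Mean-center each attribute (column), as the text describes.
    X_centered = X - X.mean(axis=0)

    # Economy-size SVD: X_centered = U @ diag(S) @ Vt.
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

    # Rows of Vt are the principal directions (eigenvectors of the
    # data covariance matrix); U * S are the projected scores.
    components = Vt[:n_components]
    scores = U[:, :n_components] * S[:n_components]
    return scores, components

# Illustrative usage on random data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
scores, components = pca_svd(X, n_components=2)
```

The eigendecomposition route gives the same directions, since the right singular vectors of the centered data matrix are the eigenvectors of its covariance matrix.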
L1-norm PCA
Different methods have been proposed to reduce the negative effect of outliers. L1-norm PCA was derived by applying maximum-likelihood estimation to the input data, and it has been applied in many areas where principal component analysis is used. It is an alternative to traditional L2-norm PCA because it provides robustness in the presence of outliers. Heuristic estimation methods are also applied to improve efficiency: they reduce the running time while preserving accuracy. Convex methods have been applied to detect outliers, and non-convex M-estimators have been applied to learn robust representations of color images. Although the above three approaches are robust, they share a limitation: they are not rotationally invariant, which is a fundamental property expected of learning algorithms. Hence rotationally invariant PCA algorithms have been developed. A sketch of one common L1-norm PCA iteration is given below.
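The report does not specify an algorithm here; one commonly cited L1-norm PCA variant is Kwak's PCA-L1 fixed-point iteration, which finds a direction maximizing the L1 dispersion of the projections rather than the L2 variance. A minimal sketch, assuming mean-centered input:

```python
import numpy as np

def pca_l1_first_component(X, n_iter=100, tol=1e-8, seed=0):
    """First L1-dispersion principal direction via the fixed-point
    iteration of Kwak's PCA-L1 (one L1-norm PCA heuristic).

    X: (n_samples, n_features) data matrix, assumed mean-centered.
    Returns a unit-norm direction w maximizing sum_i |w . x_i|.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        # Flip each sample so its projection onto w is non-negative,
        # then re-aggregate and renormalize.
        signs = np.sign(X @ w)
        signs[signs == 0] = 1.0
        w_new = X.T @ signs
        w_new /= np.linalg.norm(w_new)
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w
```

Because each sample contributes through its sign rather than its squared magnitude, a single large outlier cannot dominate the direction the way it does under the L2 criterion.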
Φ-PCA
Φ-PCA formulates the objective function as a twice-differentiable convex function, which can be optimized by Newton's method. Newton's method is an iterative technique for finding the roots of an equation; it can also be used to find critical points of a differentiable function, which are the zeros of its derivative. Starting from an initial guess, it generates a sequence of successively better approximations. Φ-PCA needs to calculate the Hessian matrix, the square matrix of second-order partial derivatives of a function, which describes the local curvature of a function of many variables.
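A minimal sketch of the Newton iteration described above, applied to an illustrative quadratic objective (the function names and the test objective are assumptions for demonstration, not the Φ-PCA objective itself):

```python
import numpy as np

def newton_minimize(grad, hess, x0, n_iter=50, tol=1e-10):
    """Newton's method for a twice-differentiable objective:
    repeatedly solve H(x) d = -g(x) and step x <- x + d.

    grad, hess: callables returning the gradient vector and Hessian matrix.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:       # critical point: gradient ~ 0
            break
        d = np.linalg.solve(hess(x), -g)  # Newton direction
        x = x + d
    return x

# Illustrative usage: minimize f(x, y) = (x - 1)^2 + 10*(y + 2)^2.
grad = lambda v: np.array([2.0 * (v[0] - 1.0), 20.0 * (v[1] + 2.0)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 20.0]])
print(newton_minimize(grad, hess, x0=[0.0, 0.0]))  # converges to ~ [1, -2]
```

For a strictly convex objective such as the one Φ-PCA constructs, the Hessian is positive definite, so the linear solve is well posed and the iteration converges rapidly near the optimum.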