Principal component analysis (PCA) explained

Principal component analysis (PCA) is a popular method for reducing the dimensionality of high-dimensional data.

Let \(X_{i}=(x_{1i},x_{2i},\ldots,x_{Ki})^{'}\) be a \(K\)-dimensional vector for each \(i=1,2,\ldots,N\).

  1. Calculate the \(K\times K\) covariance matrix \(Q\) of the vectors \(y_k=(x_{k1},x_{k2},\ldots,x_{kN})^{'}\), \(k=1,2,\ldots,K\).
  2. Calculate the eigenvalues of \(Q\), \(\lambda_1\geq \lambda_2\geq \ldots\geq \lambda_K\). Let the corresponding eigenvectors be \(\xi_1,\ldots,\xi_K\) (unit length). The eigenvectors are the new coordinate axes.
  3. Project \(X_i\) onto the subspace spanned by \(\xi_1,\ldots,\xi_L\) for some \(L\leq K\):

                                 \(\prod_{\xi_1,...,\xi_L}X_i=z_{1i}\xi_1+z_{2i}\xi_2+\ldots+z_{Li}\xi_{L}\)

           where \(z_{li}=\langle X_i,\xi_l\rangle=X_i^{'}\xi_l\).
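The three steps above can be sketched in Python with NumPy; the random data matrix and the choice \(L=2\) are illustrative assumptions, not part of the method itself:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))  # assumed example data: N=100 samples, K=5 dimensions

# Step 1: the K x K covariance matrix Q (computed from centered data)
Xc = X - X.mean(axis=0)
Q = np.cov(Xc, rowvar=False)

# Step 2: eigenvalues and unit-length eigenvectors of Q,
# reordered so that lambda_1 >= lambda_2 >= ... >= lambda_K
eigvals, eigvecs = np.linalg.eigh(Q)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Step 3: project each X_i onto the first L eigenvectors;
# row i of Z holds the coordinates z_{li} = <X_i, xi_l>
L = 2
Z = Xc @ eigvecs[:, :L]
print(Z.shape)  # (100, 2)
```

Note that `np.linalg.eigh` (for symmetric matrices) returns eigenvalues in ascending order, so the explicit reordering is needed to match the descending convention in step 2.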

