Each data sample is a 2-dimensional point with coordinates x, y. The eigenvectors of the covariance matrix of these data samples are the vectors u and v: u, the longer arrow, is the first eigenvector and v, the shorter arrow, is the second. (The eigenvalues, which are the variances along those directions, set the lengths of the arrows.) As you can see, the first eigenvector points (from the mean of the data) in the direction in which the data varies the most in Euclidean space, and the second eigenvector is orthogonal (perpendicular) to the first.

It's a little trickier to visualize in 3 dimensions, but here's an attempt [2]:

[Image: ellipsoid whose principal axes v1, v2, v3 are the eigenvectors of the covariance matrix]

In this case, imagine that all of the data points lie within the ellipsoid. v1, the direction in which the data varies the most, is the first eigenvector (lambda1 is the corresponding eigenvalue). v2 is the direction in which the data varies the most among those directions that are orthogonal to v1. And v3 is the direction of greatest variance among those directions that are orthogonal to v1 and v2 (though there is only one such orthogonal direction). 

[1] Image taken from Duncan Gillies's lecture on Principal Component Analysis
[2] Image taken from Fiber Crossing in Human Brain Depicted with Diffusion Tensor MR Imaging
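As a quick numerical illustration of the 2-D case above, here is a minimal NumPy sketch (the data and variable names are illustrative, not taken from the figures): it computes the covariance matrix of an elongated point cloud, finds its eigenvectors, and checks that the first one carries the largest variance and is orthogonal to the second.

```python
# Minimal sketch (NumPy assumed): eigenvectors and eigenvalues of the
# sample covariance matrix of 2-D data. Data and names are illustrative.
import numpy as np

rng = np.random.default_rng(0)
# Anisotropic 2-D cloud: much more spread along x than along y.
data = rng.normal(size=(500, 2)) * np.array([3.0, 1.0])

cov = np.cov(data, rowvar=False)          # 2x2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)    # symmetric matrix, ascending eigenvalues

# Reorder so u (largest variance) comes first, v second.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
u, v = eigvecs[:, 0], eigvecs[:, 1]

print("first eigenvector u:", u, "variance along u:", eigvals[0])
print("second eigenvector v:", v, "variance along v:", eigvals[1])
print("u and v orthogonal:", np.isclose(u @ v, 0.0))
```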
  
 
Given data with n features, you can find the eigenvectors of the covariance matrix of the features. This allows you to represent the data with uncorrelated features. Moreover, the eigenvalues tell you the amount of variance in each new feature, allowing you to choose a subset of the features that retains the most information about your data.
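A short sketch of that idea, assuming NumPy and purely illustrative data and names: compute the eigenvectors of the covariance matrix of n-feature data, keep the top-k directions by variance, and verify that the new features are uncorrelated.

```python
# Hedged sketch: decorrelate n features and keep the top-k directions
# by variance. NumPy only; data and names are illustrative.
import numpy as np

def pca_reduce(X, k):
    """Project X (samples x n features) onto the top-k eigenvectors
    of its covariance matrix."""
    X_centered = X - X.mean(axis=0)
    cov = np.cov(X_centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending order
    order = np.argsort(eigvals)[::-1]             # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals[:k] / eigvals.sum()       # fraction of variance kept
    return X_centered @ eigvecs[:, :k], explained

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated features
Z, explained = pca_reduce(X, k=2)

covZ = np.cov(Z, rowvar=False)
print("retained variance fractions:", explained)
print("new features uncorrelated:", np.allclose(covZ, np.diag(np.diag(covZ))))
```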
  
 
It becomes a little more complicated if the covariance matrix is not diagonal, i.e. if the covariances are not zero. In this case, the principal components (the directions of largest variance) do not coincide with the coordinate axes, and the data appears rotated. The eigenvalues still correspond to the spread of the data along the directions of largest variance, whereas the variance terms of the covariance matrix describe the spread of the data along the coordinate axes:

[Image: 2-D data with a non-diagonal covariance matrix; the eigenvectors are rotated away from the axes]

An in-depth discussion of how the covariance matrix can be interpreted from a geometric point of view (and the source of the above images) can be found in: A geometric interpretation of the covariance matrix.
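A small sketch of this rotated case (NumPy only; the covariance values are made up for illustration): the diagonal entries of the covariance matrix give the spread along the x and y axes, while the eigenvalues give the larger spread along the rotated principal directions.

```python
# Sketch of the rotated case: a non-diagonal covariance matrix whose
# eigenvectors do not coincide with the x/y axes. NumPy only.
import numpy as np

cov_true = np.array([[3.0, 1.5],
                     [1.5, 2.0]])          # non-zero covariance term
rng = np.random.default_rng(2)
data = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov_true, size=2000)

cov_est = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov_est)

# Axis-aligned spread (diagonal entries) vs. spread along the
# principal directions (eigenvalues).
print("variance along x and y axes:", np.diag(cov_est))
print("variance along principal directions:", eigvals[::-1])
print("first principal direction:", eigvecs[:, -1])   # not parallel to an axis
```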
  
 
Finding the directions of maximum and minimum variance is the same as finding the orthogonal least-squares best-fit line and plane of the data. The sums of squared orthogonal distances for that line and plane can be written in terms of the covariance matrix, and working out the connection yields the eigenvectors of that covariance matrix.
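A sketch of that connection (NumPy only; the brute-force angle search is just for illustration): the line through the mean that minimizes the sum of squared orthogonal distances points along the first eigenvector of the covariance matrix.

```python
# Sketch checking the claim above: the orthogonal least-squares best-fit
# line through the mean points along the first eigenvector. NumPy only.
import numpy as np

rng = np.random.default_rng(3)
data = rng.multivariate_normal([0, 0], [[4.0, 1.2], [1.2, 1.0]], size=1000)
X = data - data.mean(axis=0)

def orthogonal_sse(direction):
    """Sum of squared orthogonal distances from the points to the line
    through the origin with the given direction."""
    d = direction / np.linalg.norm(direction)
    residuals = X - np.outer(X @ d, d)     # components perpendicular to d
    return np.sum(residuals ** 2)

# Brute-force search over line angles...
angles = np.linspace(0, np.pi, 1800)
best = min(angles, key=lambda a: orthogonal_sse(np.array([np.cos(a), np.sin(a)])))
best_dir = np.array([np.cos(best), np.sin(best)])

# ...agrees (up to sign) with the leading eigenvector of the covariance matrix.
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
top = eigvecs[:, -1]
print("best-fit direction:", best_dir)
print("first eigenvector: ", top)
print("aligned:", np.isclose(abs(best_dir @ top), 1.0, atol=1e-3))
```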
  
 
Finding the eigenvectors of a covariance matrix is exactly the technique of Principal Component Analysis (PCA).

Projecting the data onto the eigenvectors yields new variables (the principal components) that are linearly uncorrelated.
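As a hedged check of that claim (assuming NumPy and scikit-learn are available; the data is synthetic), the principal axes returned by scikit-learn's PCA match the eigenvectors of the covariance matrix up to sign:

```python
# Sketch comparing a direct eigendecomposition of the covariance matrix
# with scikit-learn's PCA; the principal axes agree up to sign.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 3)) @ rng.normal(size=(3, 3))   # correlated 3-D data

# Eigenvectors of the covariance matrix, largest variance first (rows).
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
eig_axes = eigvecs[:, ::-1].T

# Principal axes found by PCA (rows of components_).
pca_axes = PCA(n_components=3).fit(X).components_

# Each pair of corresponding axes is parallel or anti-parallel (dot product ~1 in magnitude).
print(np.abs(np.sum(eig_axes * pca_axes, axis=1)))        # ~[1. 1. 1.]
```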
  
 