Q: Show that eigenvectors corresponding to distinct eigenvalues of a matrix A are linearly independent?

Best Answer

Let λ_1, ..., λ_r be distinct eigenvalues of an n×n matrix A, and let k_1, ..., k_r be corresponding eigenvectors. The proof is by induction on r.

The case r = 1 is immediate, since the eigenvector k_1 is by definition not the zero vector.

Suppose the result holds for r = m-1, i.e. {k_1, ..., k_{m-1}} is linearly independent. This means that

c_1 k_1 + c_2 k_2 + ... + c_{m-1} k_{m-1} = 0 (1)

holds only if c_i = 0 for all i. Now suppose that

a_1 k_1 + a_2 k_2 + ... + a_{m-1} k_{m-1} + a_m k_m = 0. (2)

If a_m were not equal to 0, then

k_m = b_1 k_1 + b_2 k_2 + ... + b_{m-1} k_{m-1}, (3)

where b_i = -a_i / a_m. Applying the matrix A - λ_m I (where I is the n×n identity matrix) to both sides of (3) and using the fact that A k_i = λ_i k_i, we get

0 = (λ_1 - λ_m) b_1 k_1 + (λ_2 - λ_m) b_2 k_2 + ... + (λ_{m-1} - λ_m) b_{m-1} k_{m-1}.

By the induction hypothesis, (λ_i - λ_m) b_i = 0 for each i. Since the eigenvalues are distinct, λ_i - λ_m ≠ 0, so b_i = 0 for each i. But then (3) says that k_m is the zero vector, a contradiction, since k_m is an eigenvector. Hence a_m = 0. With a_m = 0, equation (2) reduces to the form (1), so the induction hypothesis gives a_i = 0 for all i. Thus {k_1, ..., k_m} is linearly independent, and by induction the result holds for all 1 ≤ r ≤ n.
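
As a quick numerical sanity check of this result (not a substitute for the proof), one can take a matrix with distinct eigenvalues and verify that the matrix whose columns are its eigenvectors has full rank. A Python sketch, assuming numpy is available; the example matrix is arbitrary:

    import numpy as np

    # An arbitrary 3x3 example matrix with distinct eigenvalues (2, 3 and 5).
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 1.0],
                  [0.0, 0.0, 5.0]])

    eigenvalues, eigenvectors = np.linalg.eig(A)   # columns of `eigenvectors` are the k_i
    print("eigenvalues:", eigenvalues)             # 2, 3 and 5 -- all distinct

    # If the eigenvectors are linearly independent, the matrix formed by
    # stacking them as columns has rank n (equivalently, nonzero determinant).
    print("rank:", np.linalg.matrix_rank(eigenvectors))   # expect 3
    print("det:", np.linalg.det(eigenvectors))            # expect nonzero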

Wiki User · 13y ago

Continue Learning about Other Math

If a set of vectors spans R3 then the set is linearly independent?

No, it is not. It is possible to have a set of vectors that are linearly dependent but still span R^3, and the reverse holds as well: linear independence does not guarantee that the set spans R^3. If both conditions are met, then the set of vectors is called a basis for R^3. So, for a set of vectors S to be a basis it must be (1) linearly independent and (2) span R^3. The two conditions are independent of each other.
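
To illustrate that the two conditions really are independent of each other, here is a small sketch in Python (assuming numpy is available; the example vectors are arbitrary):

    import numpy as np

    # Four vectors that span R^3 but are linearly dependent (4 vectors, rank only 3).
    spanning_but_dependent = np.array([[1, 0, 0],
                                       [0, 1, 0],
                                       [0, 0, 1],
                                       [1, 1, 1]])
    print(np.linalg.matrix_rank(spanning_but_dependent))  # 3: they span R^3, yet 4 > 3 so they are dependent

    # Two linearly independent vectors that cannot span R^3 (they only span a plane).
    independent_but_not_spanning = np.array([[1, 0, 0],
                                             [0, 1, 0]])
    print(np.linalg.matrix_rank(independent_but_not_spanning))  # 2: independent, but rank < 3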


1 1 Check whether the following set of vectors is LD or LI?

(i) They are linearly dependent since the 2nd vector is twice the 1st vector. All 3 vectors lie in the x-z plane, so they don't span 3D space. (ii) They are linearly independent. Note that the cross-product of the first two is (-1,1,1). If the third vector is not perpendicular to the above cross-product, then the third vector does not lie in the plane defined by the first two vectors. (-1,1,1) "dot" (1,1,-1) = -1+1-1 = -1, not zero, so 3rd vector is not perpendicular to the cross product of the other two.
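
The cross-then-dot test used above is the scalar triple product: three 3-vectors are linearly independent exactly when dotting the third with the cross product of the first two gives a nonzero value. A Python sketch (assuming numpy; u and v are hypothetical stand-ins for the first two vectors of part (ii), chosen so that their cross product is (-1, 1, 1) as quoted above):

    import numpy as np

    # Hypothetical stand-ins for the first two vectors; their cross product is (-1, 1, 1).
    u = np.array([1, 0, 1])
    v = np.array([1, 1, 0])
    w = np.array([1, 1, -1])   # the third vector

    normal = np.cross(u, v)    # (-1, 1, 1): normal to the plane spanned by u and v
    triple = np.dot(normal, w) # scalar triple product
    print(normal, triple)      # triple = -1, nonzero, so w lies outside that plane

    # Nonzero scalar triple product <=> the three vectors are linearly independent.
    print("independent:", not np.isclose(triple, 0.0))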


When are vectors said to be perpendicular?

Perpendicular means that the angle between the two vectors is 90 degrees - a right angle. If you have the vectors as components, just take the dot product - if the dot product is zero, that means either that the vectors are perpendicular, or that one of the vectors has a magnitude of zero.
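
A minimal worked check of the dot-product test in Python (assuming numpy; the two vectors are arbitrary examples):

    import numpy as np

    a = np.array([3, 4])
    b = np.array([-4, 3])

    # Dot product zero => perpendicular (provided neither vector is the zero vector).
    print(np.dot(a, b))                          # 3*(-4) + 4*3 = 0
    print(np.linalg.norm(a), np.linalg.norm(b))  # both nonzero, so a and b are perpendicular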


When are two vectors identical?

Two vectors are identical when all their components are identical. An alternative definition, for vectors used in physics, is that they are identical when both the magnitude and the direction are identical.


Can the directions of the sum of two two vectors be equal to the directions of difference of two vectors?

Yes. For example, if A = (2, 0) and B = (1, 0), then A + B = (3, 0) and A - B = (1, 0) point in the same direction. This happens whenever B is the zero vector, or A and B are parallel with |A| > |B|.

Related questions

Show that only N orthogonal vectors can be formed from N linearly independent vectors?

From N linearly independent vectors v1, ..., vN you can build exactly N orthogonal (nonzero) vectors spanning the same space, for example by the Gram-Schmidt process: from each vk, subtract its projections onto the previously constructed vectors. You cannot get more than N, because nonzero mutually orthogonal vectors are automatically linearly independent, and a space spanned by N vectors cannot contain more than N linearly independent vectors.
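
A minimal Gram-Schmidt sketch in Python (assuming numpy is available; the three input vectors are an arbitrary linearly independent set):

    import numpy as np

    def gram_schmidt(vectors):
        """Orthogonalize a list of linearly independent vectors."""
        orthogonal = []
        for v in vectors:
            w = v.astype(float)
            # Subtract the projection of the current vector onto each previously built one.
            for u in orthogonal:
                w = w - (np.dot(w, u) / np.dot(u, u)) * u
            orthogonal.append(w)
        return orthogonal

    # Three independent vectors in R^3 (arbitrary example).
    vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
    qs = gram_schmidt(vs)
    for i in range(len(qs)):
        for j in range(i + 1, len(qs)):
            print(i, j, np.dot(qs[i], qs[j]))   # pairwise dot products are (numerically) zero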


How do you find the dimension of the subspace of R4 consisting of the vectors a plus 2b plus c b-2c 2a plus 2b plus c 3a plus 5b plus c?

The dimension of a subspace is the number of vectors in a basis for it. Every vector of the given form can be written as a(1,0,2,3) + b(2,1,2,5) + c(1,-2,1,1), so the subspace is spanned by the three coefficient vectors (1,0,2,3), (2,1,2,5) and (1,-2,1,1). Row-reducing the matrix with these as rows gives three pivots, so the spanning vectors are linearly independent and the dimension of the subspace is 3. (In general, if some of the spanning vectors were dependent, the dimension would be the number of pivots, i.e. the rank.)
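
A quick check of that rank computation in Python (assuming numpy is available):

    import numpy as np

    # Rows are the vectors multiplying a, b and c in (a+2b+c, b-2c, 2a+2b+c, 3a+5b+c).
    spanning = np.array([[1, 0, 2, 3],
                         [2, 1, 2, 5],
                         [1, -2, 1, 1]])

    # The dimension of the subspace they span is the rank of this matrix.
    print(np.linalg.matrix_rank(spanning))   # 3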


How can you tell if a matrix is invertible?

An easy exclusion criterion is a matrix that is not n×n: only square matrices are invertible (have an inverse). For a square matrix to be invertible, its column vectors must be linearly independent. In other words, for an n×n matrix given by {v1 v2 v3 ... vn}, with n vectors of n components each, you have to check that there are no constants (a, b, c, etc.), not all zero, such that av1 + bv2 + cv3 + ... + kvn = 0 (meaning only the trivial solution a = b = c = ... = k = 0 works). So all you are doing is making sure that the column vectors of the matrix are linearly independent: the matrix is invertible if and only if its columns are linearly independent. Checking that the only solution is the trivial one can be quite involved, and you don't want to do this for large matrices. An alternative is to check that the determinant is not 0: the columns of a matrix A are linearly independent if and only if det A ≠ 0, and by the same token A is invertible if and only if det A ≠ 0.
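
A short sketch of the determinant test in Python (assuming numpy; the two matrices are arbitrary examples):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])   # columns are independent
    B = np.array([[1.0, 2.0],
                  [2.0, 4.0]])   # second column is twice the first

    print(np.linalg.det(A))   # 1.0, nonzero => A is invertible
    print(np.linalg.det(B))   # 0.0 => B is singular (not invertible)

    print(np.linalg.inv(A))   # works
    # np.linalg.inv(B) would raise numpy.linalg.LinAlgError: Singular matrix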


What is rational dimension?

Rational dimension refers to the dimension of a vector space over the field of rational numbers: the minimum number of linearly independent vectors needed to span the space when only rational scalars are allowed. It can differ from the dimension of the same set over a larger field; for example, the real numbers form a 1-dimensional vector space over R but an infinite-dimensional vector space over Q.


What is coplanar vector?

In geometry, vectors make many equations easier to set up and work with; velocity and force are examples of vector quantities. Vectors are coplanar when they all lie in the same plane. Any two vectors are coplanar; a set of three or more vectors is coplanar exactly when its span has dimension at most 2, i.e. when the matrix with the vectors as rows has rank at most 2. In particular, three vectors in space are coplanar if and only if they are linearly dependent.


What is the condition for being 3 vectors in a plane?

The general idea is that 3 vectors lie in a plane if and only if they are not linearly independent. This can be checked in several ways: by guessing a way to represent one of them as a linear combination of the other two - if it can be done, they are coplanar; if they are three-dimensional, by calculating the determinant of the matrix whose columns are the vectors - if it is zero they are coplanar, otherwise they are not; or, in any dimension, by calculating the determinant of their Gramian matrix, that is, the matrix whose ij-th entry is the dot product of the i-th and j-th of the three vectors (e.g. its 1,2 entry is the dot product of the first and second of them) - they are coplanar if and only if this determinant is zero.
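
A Python sketch of the second and third tests (assuming numpy; the three example vectors are arbitrary, with the third deliberately chosen as the sum of the first two so that they are coplanar):

    import numpy as np

    u = np.array([1.0, 0.0, 2.0])
    v = np.array([0.0, 1.0, 1.0])
    w = u + v                     # a linear combination, so the three vectors are coplanar

    # Test 2: determinant of the matrix whose columns are the vectors (3D only).
    M = np.column_stack([u, v, w])
    print(np.linalg.det(M))       # 0 (up to rounding) => coplanar

    # Test 3: determinant of the Gramian matrix (works in any dimension).
    G = np.array([[np.dot(a, b) for b in (u, v, w)] for a in (u, v, w)])
    print(np.linalg.det(G))       # 0 (up to rounding) => coplanar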


What is the difference between a scalar quantity and a vector?

Scalars are quantities that have magnitude only; they are independent of direction. Vectors have both magnitude and direction. Vectors are usually written with bold letters (or with an arrow over the symbol) to distinguish them from scalars.


What is an independent system of linear equations?

A system of linear equations is independent if none of its equations can be written as a linear combination of the others; equivalently, the rows of its coefficient matrix are linearly independent. For a square system this means the coefficient matrix is invertible, so the matrix equation Ax = b has exactly one solution and the homogeneous equation Ax = 0 has only the trivial solution x = 0.
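
For instance, a 2-by-2 independent system and its unique solution, sketched in Python (assuming numpy; the numbers are arbitrary):

    import numpy as np

    # x + y = 3
    # x - y = 1   (neither equation is a multiple of the other, so the system is independent)
    A = np.array([[1.0, 1.0],
                  [1.0, -1.0]])
    b = np.array([3.0, 1.0])

    print(np.linalg.matrix_rank(A))   # 2: the rows are linearly independent
    print(np.linalg.solve(A, b))      # [2. 1.] -- the unique solution x = 2, y = 1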


How do you do dot product of the vectors at any dimension?

You take the dot product of two vectors (of the same dimension) by multiplying their corresponding coordinates and adding the products. For instance: <1,2,3> · <-3,4,-1> = 1(-3) + 2(4) + 3(-1) = -3 + 8 - 3 = 2
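
The same recipe in Python for vectors of any (equal) dimension (assuming numpy; the vectors are the ones from the example above):

    import numpy as np

    a = np.array([1, 2, 3])
    b = np.array([-3, 4, -1])

    # Multiply corresponding coordinates and add the products.
    print(sum(x * y for x, y in zip(a, b)))   # 2
    print(np.dot(a, b))                       # 2, the same thing via numpy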


Can a vector space have exactly two distinct vectors in it?

Yes, but only over certain fields of scalars. Over the two-element field Z2, the field Z2 itself is a vector space (over Z2) with exactly two vectors, 0 and 1; the inverse-element axiom is satisfied because each vector is its own additive inverse (v + v = 0). Over an infinite field such as the real numbers this is impossible: if v is any nonzero vector, then the scalar multiples cv are all distinct, so a real vector space is either the zero space {0} (one vector) or infinite.