Informally, an eigenvector is a vector whose direction is unchanged by a particular linear transformation: applying the transformation only stretches or shrinks it by some scalar factor (or flips it, if that factor is negative). So, in a sense, eigenvectors are the vectors that just keep pointing the same way no matter how many times you apply that transformation.
Given some matrix A, an eigenvector of A is a nonzero vector that, when acted on by A, results in a scalar multiple of itself, i.e. Ax = λx, where λ is a scalar called an eigenvalue and x is the eigenvector described.
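To make that concrete, here is a quick numeric sanity check (a sketch in Python using numpy; the matrix A and vector x are just made-up examples, not anything from the question):

import numpy as np

# A made-up matrix A and a candidate eigenvector x.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
x = np.array([1.0, 0.0])

# Acting on x with A just scales it: A @ x = (2, 0) = 2 * x,
# so x is an eigenvector of A with eigenvalue lambda = 2.
print(A @ x)   # [2. 0.]
print(2 * x)   # [2. 0.] -- the same vector, confirming Ax = 2x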
To find x you will normally have to find λ first, which means solving the "characteristic equation": det(A − λI) = 0, where I is the identity matrix.
The derivation of the "characteristic equation" is as follows:
Rearrange the equation Ax = λx to Ax − λx = 0, i.e. (A − λI)x = 0, and then use the fact from linear algebra that if (A − λI) has an inverse, then x = 0 is the only solution. Since that solution is trivial, we must instead require that (A − λI) not be invertible. Because the inverse of a matrix equals its adjugate divided by its determinant, and because you can't divide by 0, a determinant of 0 means the inverse can't exist. This is why we solve det(A − λI) = 0 for λ.
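As an illustration of solving det(A − λI) = 0 symbolically, here is a sketch using sympy with a made-up 2x2 matrix (for A = [[2, 1], [1, 2]], the determinant works out to (2 − λ)² − 1, giving λ = 1 and λ = 3):

import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [1, 2]])
I = sp.eye(2)

# Characteristic polynomial: det(A - lambda*I) = (2 - lambda)**2 - 1
char_poly = (A - lam * I).det()

# Solve the characteristic equation for lambda.
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
print(eigenvalues)   # [1, 3]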
Once we have found λ, we can substitute it back into (A − λI)x = 0, and it's then just a simple matter of solving the resulting system of linear equations for x.
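Continuing the same made-up example: with λ = 3, (A − 3I)x = 0 becomes [[−1, 1], [1, −1]]x = 0, so x is any multiple of (1, 1). In practice a library call does both steps at once; here is a sketch using numpy (the eigenvector columns come back normalized, and the ordering of the results is not guaranteed):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (normalized) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # e.g. [3. 1.]
print(eigenvectors)   # columns proportional to (1, 1) and (1, -1)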
The eigenvalues of a matrix are the scalars λ such that Ax = λx, where A is a matrix and x is a nonzero vector. The vector x is known as the eigenvector.