Why do you calculate eigenvalues and eigenvectors?

Updated: 12/15/2022
Pioneer0111 · 9y ago

Best Answer

There is a great series of videos on YouTube about quantum mechanics (which is one place where such concepts are used a lot). For the "why", the author says: "Because it works". In other words, it has been found that doing the calculations a certain way provides results that make sense, and that are consistent with observations.

Of course - as the same author points out - it took a genius to figure this out.

Wiki User · 9y ago

Related questions

Do similar matrices have the same eigenvectors?

No, in general they do not. They have the same eigenvalues but not the same eigenvectors.
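A quick numerical illustration of this (the matrices here are made up for the example, using numpy):

```python
import numpy as np

# Hypothetical 2x2 example: B = P^-1 A P is similar to A.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.linalg.inv(P) @ A @ P

eig_A, vec_A = np.linalg.eig(A)
eig_B, vec_B = np.linalg.eig(B)

print(np.sort(eig_A))   # same eigenvalues...
print(np.sort(eig_B))
print(vec_A)            # ...but different eigenvectors
print(vec_B)
```

If v is an eigenvector of A, the corresponding eigenvector of B = P⁻¹AP is P⁻¹v, which in general points in a different direction.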


Can a Hermitian Matrix possess Complex Eigenvectors?

Yes. Simple example: the Hermitian matrix a = [[1, i], [-i, 1]] has eigenvalues 0 and 2, with corresponding eigenvectors (i, -1) and (i, 1). A Hermitian matrix always has real eigenvalues, but it can have complex eigenvectors.
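The example above can be checked numerically (np.linalg.eigh is numpy's routine for Hermitian matrices):

```python
import numpy as np

# The 2x2 Hermitian matrix from the answer: real eigenvalues, complex eigenvectors.
a = np.array([[1, 1j],
              [-1j, 1]])
assert np.allclose(a, a.conj().T)      # a equals its conjugate transpose: Hermitian

eigenvalues, eigenvectors = np.linalg.eigh(a)
print(eigenvalues)      # real, approximately [0., 2.]
print(eigenvectors)     # columns have complex entries
```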


What is Eigen analysis?

Eigenvalues and eigenvectors are properties of a mathematical matrix. See the related Wikipedia link for more details on what they are and some examples of how to use them for analysis.


What are eigenvalues and eigenvectors?

An eigenvector is a vector which, when transformed by a given matrix, is merely multiplied by a scalar constant; its direction isn't changed. An eigenvalue, in this context, is the factor by which the eigenvector is multiplied when transformed.


What is the eigen value?

This is the definition of eigenvectors and eigenvalues according to Wikipedia: specifically, a non-zero column vector v is a (right) eigenvector of a matrix A if (and only if) there exists a number λ such that Av = λv. The number λ is called the eigenvalue corresponding to that vector. The set of all eigenvectors of a matrix, each paired with its corresponding eigenvalue, is called the eigensystem of that matrix.
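The defining relation Av = λv is easy to verify numerically (the sample matrix is made up for the example):

```python
import numpy as np

# Verify Av = λv for each eigenpair of a sample matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):   # eigenvectors are the columns
    assert np.allclose(A @ v, lam * v)            # Av = λv holds for each pair
print(np.sort(eigenvalues))                       # [2., 5.]
```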


Do similar matrices have the same eigenvalues?

Yes, similar matrices have the same eigenvalues: if B = P⁻¹AP, then det(B − λI) = det(P⁻¹(A − λI)P) = det(A − λI), so A and B have the same characteristic polynomial and hence the same eigenvalues.


What are the applications of Eigenvalue and Eigen vector in computer science?

Eigenvectors and eigenvalues are important for understanding the properties of expander graphs, which I understand to have several applications in computer science (such as derandomizing random algorithms). They also give rise to a graph partitioning algorithm. Perhaps the most famous application, however, is to Google's PageRank algorithm.


What do you mean by eigen value and eigen function?

In linear algebra, a matrix represents a linear transformation, and associated with that transformation are special scalars and vectors called eigenvalues and eigenvectors. They are too complicated to explain in this forum assuming that you haven't studied them yet, but their usefulness is everywhere in science and math, specifically quantum mechanics. By finding the eigenvalues of certain equations, one can come up with the energy levels of hydrogen, or the possible spins of an electron. You really need to be familiar with matrices, algebra, and calculus, though, before you start dabbling in linear algebra.


What has the author Jan R Magnus written?

Jan R. Magnus has written: 'Linear structures' -- subject(s): Matrices 'The bias of forecasts from a first-order autoregression' 'The exact multiperiod mean-square forecast error for the first-order autoregressive model with an intercept' 'On differentiating Eigenvalues and Eigenvectors' 'The exact moments of a ratio of quadratic forms in normal variables' 'Symmetry, 0-1 matrices, and Jacobians'


Do commutative matrices have the same eigenvectors?

It is true that diagonalizable matrices A and B commute if and only if they are simultaneously diagonalizable. This result can be found in standard texts (e.g. Horn and Johnson, Matrix Analysis, 1999, Theorem 1.3.12). One direction of the if-and-only-if proof is straightforward, but the other direction is more technical.

If A and B are diagonalizable matrices of the same order and have the same eigenvectors then, without loss of generality, we can write their diagonalizations as A = VDV⁻¹ and B = VLV⁻¹, where V is the matrix whose columns are the shared basis eigenvectors of A and B, and D and L are diagonal matrices with the corresponding eigenvalues of A and B as their diagonal elements. Since diagonal matrices commute, DL = LD. So AB = VDV⁻¹VLV⁻¹ = VDLV⁻¹ = VLDV⁻¹ = VLV⁻¹VDV⁻¹ = BA.

The reverse direction is harder to prove; the proof in Horn and Johnson is clear and concise.

Consider the particular case that B is the identity, I. If A = VDV⁻¹ is a diagonalization of A, then I = VIV⁻¹ is a diagonalization of I; i.e., A and I have the same eigenvectors.
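The easy direction can be demonstrated numerically: build A = VDV⁻¹ and B = VLV⁻¹ from one shared eigenvector matrix V (the values below are made up) and check that the products agree.

```python
import numpy as np

# Shared eigenvectors imply commuting, for diagonalizable matrices.
V = np.array([[1.0, 1.0],
              [0.0, 1.0]])         # columns are the shared eigenvectors
D = np.diag([2.0, 5.0])            # eigenvalues of A
L = np.diag([-1.0, 3.0])           # eigenvalues of B
Vinv = np.linalg.inv(V)

A = V @ D @ Vinv
B = V @ L @ Vinv
print(np.allclose(A @ B, B @ A))   # True: DL = LD, so AB = BA
```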


How do you find eigenvalues of a 3 by 3 matrix?

Call your matrix A. The eigenvalues are defined as the numbers e for which a nonzero vector v exists such that Av = ev. This is equivalent to requiring (A - eI)v = 0 to have a nonzero solution v, where I is the identity matrix of the same dimensions as A. A matrix A - eI with this property is called singular and has a zero determinant. The determinant of A - eI is a polynomial in e, which has the eigenvalues of A as its roots. Setting this polynomial to zero and solving for e is often the easiest way to compute the eigenvalues of A.
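As a sketch of this procedure (the 3x3 matrix below is made up; np.poly returns the characteristic-polynomial coefficients of a square matrix):

```python
import numpy as np

# Eigenvalues of a 3x3 matrix as the roots of det(A - eI) = 0.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])

coeffs = np.poly(A)        # coefficients of the characteristic polynomial
roots = np.roots(coeffs)
print(np.sort(roots))      # A is triangular, so these are the diagonal entries

# Same answer from the direct eigenvalue routine:
print(np.sort(np.linalg.eig(A)[0]))
```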


How do you determine transfer function of a synchronous generator?

This is just one of the ways. Choose the input/output variable pair in question, defining the SISO form of the system. Write out the state-space matrix, commonly denoted A, of the synchronous machine. Calculate the eigenvalues of that matrix. Then calculate the residues of the matrix with respect to the selected SISO system (the chosen variables define the input matrix B and output matrix C). The eigenvalues are the poles of the transfer function, while the residues are the constants in the partial-fraction form of the transfer function. The matrices mentioned define the linearised system in the form dx/dt = Ax + Bu, y = Cx. For a more thorough explanation see Power System Stability and Control by Prabha Kundur.
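A minimal numerical sketch of the eigenvalue/pole connection, using a toy linearised SISO system dx/dt = Ax + Bu, y = Cx (the numbers are made up, not a real synchronous-machine model): the transfer function is C(sI - A)⁻¹B, whose denominator is det(sI - A), so its poles are the eigenvalues of A.

```python
import numpy as np

# Toy state matrix of a made-up 2nd-order linearised system.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

den = np.poly(A)                   # coefficients of det(sI - A)
poles = np.roots(den)
print(np.sort(poles))              # transfer-function poles...
print(np.sort(np.linalg.eig(A)[0]))  # ...equal the eigenvalues of A
```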