It is true that diagonalizable matrices A and B commute if and only if they are simultaneously diagonalizable. This result can be found in standard texts (e.g., Horn and Johnson, Matrix Analysis, 1999, Theorem 1.3.12).
One direction of the equivalence is straightforward; the other is more technical:
If A and B are diagonalizable matrices of the same order that share a common basis of eigenvectors, then we can write their diagonalizations as A = VDV⁻¹ and B = VLV⁻¹, where V is the matrix whose columns are the shared basis eigenvectors, and D and L are diagonal matrices whose diagonal entries are the corresponding eigenvalues of A and B. Since diagonal matrices commute, DL = LD, and therefore AB = VDV⁻¹VLV⁻¹ = VDLV⁻¹ = VLDV⁻¹ = VLV⁻¹VDV⁻¹ = BA.
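As a quick numerical sanity check of this direction, here is a minimal NumPy sketch; the basis V and the eigenvalues placed in D and L are illustrative choices, not part of the original answer.

```python
import numpy as np

# Two matrices built from the SAME eigenvector basis V (columns of V),
# with different eigenvalues on the diagonals of D and L.
V = np.array([[1.0, 1.0],
              [1.0, -1.0]])        # shared eigenvector basis (illustrative)
D = np.diag([2.0, 3.0])            # eigenvalues of A (illustrative)
L = np.diag([5.0, 7.0])            # eigenvalues of B (illustrative)

V_inv = np.linalg.inv(V)
A = V @ D @ V_inv
B = V @ L @ V_inv

# Diagonal matrices commute (DL = LD), so AB = BA must hold.
print(np.allclose(A @ B, B @ A))   # True
```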
The converse is harder to prove, but the proof in Horn and Johnson is clear and concise.
Consider the particular case where B is the identity matrix I. If A = VDV⁻¹ is a diagonalization of A, then I = VIV⁻¹ is a diagonalization of I; that is, A and I have the same eigenvectors (and, consistently with the theorem, AI = IA).
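A short sketch of this special case, where the particular invertible V below is an assumed illustrative choice:

```python
import numpy as np

# The identity is diagonalized by ANY invertible V: V @ I @ inv(V) == I,
# so I shares an eigenvector basis with every diagonalizable A.
V = np.array([[2.0, 1.0],
              [1.0, 1.0]])         # arbitrary invertible matrix (illustrative)
I = np.eye(2)
print(np.allclose(V @ I @ np.linalg.inv(V), I))   # True
```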
No. Matrix multiplication is, in general, non-commutative: by the way the product is defined, AB and BA need not be equal (and BA may not even be defined when AB is).
The commutative property holds for addition and multiplication, e.g. 2 + 4 = 4 + 2 and 3 × 4 = 4 × 3. It does not hold for subtraction or division: 5 − 3 ≠ 3 − 5 and 6 ÷ 2 ≠ 2 ÷ 6, so those operations are the exceptions.
It is so too equal! Multiplication of numbers is commutative. But if A and B are matrices, matrix multiplication is NOT commutative. Whether or not A×B = B×A depends on how the binary operation × (multiply) is defined on the domain in question.
The answer depends on the context. For example, multiplication of numbers is commutative (A*B = B*A) but multiplication of matrices is not.
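For a concrete illustration of the matrix case, here is a small NumPy sketch with two hand-picked (illustrative) 2×2 matrices for which AB ≠ BA:

```python
import numpy as np

# Two hand-picked 2x2 matrices (illustrative) showing AB != BA.
A = np.array([[1, 2],
              [0, 1]])
B = np.array([[1, 0],
              [3, 1]])

print(A @ B)                         # [[7 2]
                                     #  [3 1]]
print(B @ A)                         # [[1 2]
                                     #  [3 7]]
print(np.array_equal(A @ B, B @ A))  # False
```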