It is true that diagonalizable matrices A and B commute if and only if they are simultaneously diagonalizable. This result can be found in standard texts (e.g., Horn and Johnson, Matrix Analysis, 1999, Theorem 1.3.12).
One direction of the if and only if proof is straightforward, but the other direction is more technical:
If A and B are diagonalizable matrices of the same order and have a common basis of eigenvectors, then we can write their diagonalizations as A = VDV⁻¹ and B = VLV⁻¹, where V is the matrix whose columns are those common basis eigenvectors, and D and L are diagonal matrices whose diagonal entries are the corresponding eigenvalues of A and B. Since diagonal matrices commute, DL = LD. So AB = VDV⁻¹VLV⁻¹ = VDLV⁻¹ = VLDV⁻¹ = VLV⁻¹VDV⁻¹ = BA.
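A minimal numerical sketch of that computation (an illustrative NumPy example; the particular V, D, and L below are arbitrary choices, not taken from any text): build A and B from a shared eigenvector basis and confirm that they commute.

```python
import numpy as np

# Shared (invertible) eigenvector basis V -- any invertible matrix works here.
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
V_inv = np.linalg.inv(V)

D = np.diag([1.0, 2.0, 3.0])   # eigenvalues of A
L = np.diag([4.0, 5.0, 6.0])   # eigenvalues of B

A = V @ D @ V_inv              # A = V D V^-1
B = V @ L @ V_inv              # B = V L V^-1

# Simultaneously diagonalizable => AB = BA (up to floating-point round-off).
print(np.allclose(A @ B, B @ A))   # True
```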
The reverse direction is harder to prove; the proof in Horn and Johnson is clear and concise.
Consider the particular case that B is the identity, I. If A = VDV⁻¹ is a diagonalization of A, then I = VIV⁻¹ is a diagonalization of I; i.e., A and I have the same eigenvectors.
No. Multiplication of matrices is, in general, non-commutative, because of the way matrix multiplication is defined: AB and BA are usually different matrices.
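A quick counterexample (the two matrices here are arbitrary illustrative choices): right-multiplying by a permutation matrix swaps columns, while left-multiplying swaps rows, so the two products differ.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],     # permutation matrix that swaps the two coordinates
              [1, 0]])

print(A @ B)   # [[2 1]
               #  [4 3]]  -- columns of A swapped
print(B @ A)   # [[3 4]
               #  [1 2]]  -- rows of A swapped
print(np.array_equal(A @ B, B @ A))   # False
```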
Sometimes.
The commutative property works for addition and multiplication, e.g. 2 + 4 = 4 + 2 and 3 × 4 = 4 × 3. But it does not work for subtraction and division: 5 − 3 ≠ 3 − 5 and 6 ÷ 2 ≠ 2 ÷ 6, so subtraction and division are the exceptions.
Multiplication of ordinary numbers is commutative. But if A and B are matrices, matrix multiplication is NOT commutative. Whether or not A×B = B×A depends on how the binary operator × (multiply) is defined on the domain in question.
The answer depends on the context. For example, multiplication of numbers is commutative (A*B = B*A) but multiplication of matrices is not.
No, in general they do not. They have the same eigenvalues but not the same eigenvectors.
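A small numerical illustration (the matrices A and P below are made-up examples): a similar matrix B = PAP⁻¹ has the same eigenvalues as A, but in general its eigenvectors differ.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = P @ A @ np.linalg.inv(P)   # B is similar to A

eigvals_A, eigvecs_A = np.linalg.eig(A)
eigvals_B, eigvecs_B = np.linalg.eig(B)

print(np.sort(eigvals_A))   # [2. 3.]
print(np.sort(eigvals_B))   # [2. 3.]  -- same eigenvalues
print(eigvecs_A)            # eigenvectors of A ...
print(eigvecs_B)            # ... differ from the eigenvectors of B
```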
Commuting matrices: If A and B are two square matrices such that AB = BA, then A and B are said to commute, and they are called commuting matrices.
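That definition translates directly into a one-line check; the helper name `commute` below is just an illustrative choice, not a standard function.

```python
import numpy as np

def commute(A, B, tol=1e-10):
    """Return True if the square matrices A and B commute, i.e. AB = BA."""
    return np.allclose(A @ B, B @ A, atol=tol)

# Diagonal matrices always commute with each other ...
print(commute(np.diag([1.0, 2.0]), np.diag([3.0, 4.0])))              # True

# ... but two square matrices picked arbitrarily usually do not.
print(commute(np.array([[1.0, 2.0], [3.0, 4.0]]),
              np.array([[0.0, 1.0], [1.0, 0.0]])))                    # False
```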
Matrix addition is commutative if addition of the elements of the matrices is itself commutative. Matrix multiplication is not commutative.
Commuting in algebra is most often used for matrices. Say you have two matrices, A and B. These two matrices commute if A * B = B * A. The same idea applies to the ordinary binary operations of addition and multiplication: for two numbers X and Y, X + Y = Y + X, and likewise X * Y = Y * X. Operations like subtraction and division are not commutative; these are referred to as noncommutative operations. Hope this helps!
Yes, if they are elements of a set whose operation is commutative (Abelian), but not otherwise. So it would not work with matrices under multiplication, for example.
Assuming you mean definition, commutative is a property of an operation such that the order of the operands does not affect the result. Thus for addition, A + B = B + A. Multiplication of numbers is also commutative but multiplication of matrices is not. Subtraction and division are not commutative.
Subtraction, division, the cross product of vectors, multiplication of matrices, etc.
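A few quick checks of the operations just listed (the particular numbers and vectors are arbitrary examples): each one gives a different result when the operands are swapped.

```python
import numpy as np

a, b = 5, 3
print(a - b, b - a)          # 2 -2                 : subtraction
print(6 / 2, 2 / 6)          # 3.0 0.333...         : division

u = np.array([1, 0, 0])
v = np.array([0, 1, 0])
print(np.cross(u, v))        # [0 0 1]
print(np.cross(v, u))        # [0 0 -1]             : cross product anti-commutes

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
print(np.array_equal(A @ B, B @ A))   # False       : matrix multiplication
```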
Yes, similar matrices have the same eigenvalues.
Yes. Multiplication of numbers is commutative, just like addition.
The matrices must have the same dimensions.