In mathematics, a binary operation is commutative if changing the order of the operands does not change the result. It is a fundamental property of many binary operations, and many mathematical proofs depend on it. Perhaps most familiar as a property of arithmetic, e.g. "3 + 4 = 4 + 3" or "2 × 5 = 5 × 2", the property can also be used in more advanced settings. The name is needed because there are operations, such as division and subtraction, that do not have it (for example, "3 − 5 ≠ 5 − 3"); such operations are not commutative, and so are referred to as noncommutative operations. The idea that simple operations, such as the multiplication and addition of numbers, are commutative was for many years implicitly assumed. Thus, this property was not named until the 19th century, when mathematics started to become formalized.[1][2] A similar property exists for binary relations; a binary relation is said to be symmetric if the relation applies regardless of the order of its operands; for example, equality is symmetric as two equal mathematical objects are equal regardless of their order.[3]
A binary operation ∗ on a set S is called commutative if

    x ∗ y = y ∗ x for all x, y ∈ S.[4][5]

In other words, an operation is commutative if every two elements commute. An operation that does not satisfy the above property is called noncommutative.
One says that x commutes with y, or that x and y commute under ∗, if

    x ∗ y = y ∗ x.

That is, a specific pair of elements may commute even if the operation is (strictly) noncommutative.
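These definitions can be checked mechanically on a finite set. The following is a minimal Python sketch (the helper names is_commutative and commutes are hypothetical, not from any standard library): it tests whether an operation is commutative by comparing every pair of elements, and whether one particular pair commutes.

    from itertools import product

    def is_commutative(op, elements):
        # An operation is commutative if x op y == y op x for every pair.
        return all(op(x, y) == op(y, x) for x, y in product(elements, repeat=2))

    def commutes(op, x, y):
        # A single pair may commute even under a noncommutative operation.
        return op(x, y) == op(y, x)

    sub = lambda x, y: x - y
    print(is_commutative(sub, range(-3, 4)))  # False: subtraction is noncommutative
    print(commutes(sub, 2, 2))                # True: the pair (2, 2) still commutes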
Division is noncommutative, since 1 ÷ 2 ≠ 2 ÷ 1.
Subtraction is noncommutative, since 0 − 1 ≠ 1 − 0. However, it is classified more precisely as anti-commutative, since x − y = −(y − x).
Exponentiation is noncommutative, since 2³ = 8 ≠ 9 = 3². This property leads to two different "inverse" operations of exponentiation (namely, the nth-root operation and the logarithm operation), whereas multiplication only has one inverse operation.[6]
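These three arithmetic examples can be verified directly; a quick sketch in the same spirit as the helpers above:

    print(1 / 2 == 2 / 1)      # False: division is noncommutative
    print(0 - 1 == 1 - 0)      # False: subtraction is noncommutative
    print(2 ** 3 == 3 ** 2)    # False: 8 != 9, exponentiation is noncommutative
    x, y = 5, 3
    print(x - y == -(y - x))   # True: subtraction is anti-commutative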
Some truth functions are noncommutative, since the truth tables for the functions are different when one changes the order of the operands. For example, the truth tables for (A ⇒ B) = (¬A ∨ B) and (B ⇒ A) = (A ∨ ¬B) are

    A  B  A ⇒ B  B ⇒ A
    T  T    T      T
    T  F    F      T
    F  T    T      F
    F  F    T      T
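The two truth tables can also be generated programmatically; a small sketch:

    implies = lambda a, b: (not a) or b   # A ⇒ B, defined as ¬A ∨ B
    for a in (True, False):
        for b in (True, False):
            print(a, b, implies(a, b), implies(b, a))
    # The rows (True, False) and (False, True) differ,
    # so implication is noncommutative.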
Function composition of linear functions from the real numbers to the real numbers is almost always noncommutative. For example, let f(x) = 2x + 1 and g(x) = 3x + 7. Then

    (f ∘ g)(x) = f(g(x)) = 2(3x + 7) + 1 = 6x + 15

and

    (g ∘ f)(x) = g(f(x)) = 3(2x + 1) + 7 = 6x + 10.

This also applies more generally for linear and affine transformations from a vector space to itself (see below for the Matrix representation).
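A direct check of this pair of compositions in Python:

    f = lambda x: 2 * x + 1
    g = lambda x: 3 * x + 7
    # (f ∘ g)(x) = 6x + 15 while (g ∘ f)(x) = 6x + 10, so the compositions differ.
    print(f(g(1)), g(f(1)))   # 21 16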
Matrix multiplication of square matrices is almost always noncommutative, for example:

    ⎡0 2⎤ ⎡1 1⎤ = ⎡0 2⎤ ≠ ⎡0 3⎤ = ⎡1 1⎤ ⎡0 2⎤
    ⎣0 1⎦ ⎣0 1⎦   ⎣0 1⎦   ⎣0 1⎦   ⎣0 1⎦ ⎣0 1⎦
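The same product pair can be checked with NumPy (assuming numpy is available):

    import numpy as np
    A = np.array([[0, 2], [0, 1]])
    B = np.array([[1, 1], [0, 1]])
    print(A @ B)                          # [[0 2] [0 1]]
    print(B @ A)                          # [[0 3] [0 1]]
    print(np.array_equal(A @ B, B @ A))   # False: A and B do not commute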
The vector product (or cross product) of two vectors in three dimensions is anti-commutative; i.e., b × a = −(a × b).
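Anti-commutativity of the cross product is likewise easy to observe numerically; a short sketch with NumPy:

    import numpy as np
    a = np.array([1, 0, 0])
    b = np.array([0, 1, 0])
    print(np.cross(a, b))   # [0 0 1]
    print(np.cross(b, a))   # [0 0 -1], i.e. b × a = −(a × b)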
Records of the implicit use of the commutative property go back to ancient times. The Egyptians used the commutative property of multiplication to simplify computing products.[7][8] Euclid is known to have assumed the commutative property of multiplication in his book Elements.[9] Formal uses of the commutative property arose in the late 18th and early 19th centuries, when mathematicians began to work on a theory of functions. Today the commutative property is a well-known and basic property used in most branches of mathematics.
The first recorded use of the term commutative was in a memoir by François Servois in 1814,[1][10] which used the word commutatives when describing functions that have what is now called the commutative property. Commutative is the feminine form of the French adjective commutatif, which is derived from the French noun commutation and the French verb commuter, meaning "to exchange" or "to switch", a cognate of to commute. The term then appeared in English in 1838[2] in Duncan Gregory's article entitled "On the real nature of symbolical algebra", published in 1840 in the Transactions of the Royal Society of Edinburgh.[11]
In truth-functional propositional logic, commutation,[12][13] or commutativity,[14] refers to two valid rules of replacement. The rules allow one to transpose propositional variables within logical expressions in logical proofs. The rules are:

    (P ∨ Q) ⇔ (Q ∨ P)

and

    (P ∧ Q) ⇔ (Q ∧ P),

where "⇔" is a metalogical symbol representing "can be replaced in a proof with".
Commutativity is a property of some logical connectives of truth-functional propositional logic. The following logical equivalences demonstrate that commutativity is a property of particular connectives; each is a truth-functional tautology:

    (P ∧ Q) ↔ (Q ∧ P)   (commutativity of conjunction)
    (P ∨ Q) ↔ (Q ∨ P)   (commutativity of disjunction)
    (P → (Q → R)) ↔ (Q → (P → R))   (commutativity of implication, also called the law of permutation)
    (P ↔ Q) ↔ (Q ↔ P)   (commutativity of equivalence)
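Each of these tautologies can be confirmed by brute force over all truth assignments; a small sketch:

    from itertools import product
    tt = [True, False]
    # Conjunction and disjunction commute under every assignment.
    print(all((p and q) == (q and p) for p, q in product(tt, repeat=2)))  # True
    print(all((p or q) == (q or p) for p, q in product(tt, repeat=2)))    # True
    # Law of permutation: (P → (Q → R)) ↔ (Q → (P → R)).
    imp = lambda a, b: (not a) or b
    print(all(imp(p, imp(q, r)) == imp(q, imp(p, r))
              for p, q, r in product(tt, repeat=3)))                      # True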
In group and set theory, many algebraic structures are called commutative when certain operands satisfy the commutative property. In higher branches of mathematics, such as analysis and linear algebra, the commutativity of well-known operations (such as addition and multiplication on real and complex numbers) is often used (or implicitly assumed) in proofs.[15][16][17]
The associative property is closely related to the commutative property. The associative property of an expression containing two or more occurrences of the same operator states that the order in which the operations are performed does not affect the final result, as long as the order of terms does not change. In contrast, the commutative property states that the order of the terms does not affect the final result.
Most commutative operations encountered in practice are also associative. However, commutativity does not imply associativity. A counterexample is the function

    f(x, y) = (x + y) / 2,

which is clearly commutative (interchanging x and y does not affect the result), but it is not associative (since, for example, f(−4, f(0, 4)) = −1 but f(f(−4, 0), 4) = 1). More such examples may be found in commutative non-associative magmas. Furthermore, associativity does not imply commutativity either – for example, multiplication of quaternions or of matrices is always associative but not always commutative.
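This counterexample is easy to verify numerically; a short sketch:

    avg = lambda x, y: (x + y) / 2
    print(avg(1, 2) == avg(2, 1))                  # True: commutative
    print(avg(-4, avg(0, 4)), avg(avg(-4, 0), 4))  # -1.0 1.0: not associative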
Some forms of symmetry can be directly linked to commutativity. When a commutative operation is written as a binary function z = f(x, y), then this function is called a symmetric function, and its graph in three-dimensional space is symmetric across the plane y = x. For example, if the function f is defined as f(x, y) = x + y, then f is a symmetric function.
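Symmetry of such a function can also be checked symbolically; a sketch with SymPy (assuming sympy is available):

    import sympy as sp
    x, y = sp.symbols('x y')
    f = x + y
    # Swapping x and y simultaneously leaves a symmetric function unchanged.
    swapped = f.subs({x: y, y: x}, simultaneous=True)
    print(sp.simplify(f - swapped))   # 0, so f(x, y) = f(y, x)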
For relations, a symmetric relation is analogous to a commutative operation, in that if a relation R is symmetric, then a R b ⇔ b R a.
In quantum mechanics as formulated by Schrödinger, physical variables are represented by linear operators such as x (meaning multiply by x), and d/dx. These two operators do not commute, as may be seen by considering the effect of their compositions x · d/dx and d/dx · x (also called products of operators) on a one-dimensional wave function ψ(x):

    x · (d/dx) ψ = x ψ′ ≠ ψ + x ψ′ = (d/dx)(x ψ)
According to the uncertainty principle of Heisenberg, if the two operators representing a pair of variables do not commute, then that pair of variables are mutually complementary, which means they cannot be simultaneously measured or known precisely. For example, the position and the linear momentum in the x-direction of a particle are represented by the operators x and −iℏ ∂/∂x, respectively (where ℏ is the reduced Planck constant). This is the same example except for the constant −iℏ, so again the operators do not commute and the physical meaning is that the position and linear momentum in a given direction are complementary.
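The failure of these operators to commute can be reproduced symbolically; a sketch with SymPy acting on an arbitrary wave function ψ:

    import sympy as sp
    x = sp.symbols('x')
    psi = sp.Function('psi')(x)
    left = x * sp.diff(psi, x)         # x · dψ/dx
    right = sp.diff(x * psi, x)        # d/dx (x ψ) = ψ + x ψ′
    print(sp.simplify(right - left))   # psi(x): the two compositions differ by ψ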