Matrices:
The rotation matrices are only one example of quantities with two indices which are called matrices by mathematicians.
It is possible to define general calculation rules for matrices, i.e. for schemes of numbers arranged in lines and columns, and to examine their structures. We want to restrict our considerations to square $n \times n$ matrices, and even more specifically to $3 \times 3$ matrices with real elements.
We denote the matrices by underlined capital letters, e.g. $\underline{A}$. Their elements $A_{zs}$ carry two indices: the left one, $z$, denotes the (horizontal) line and the right one, $s$, the (vertical) column of the matrix:

$$\underline{A} = \begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{pmatrix}$$
Some kinds of matrices have special names because of their importance:
In particular, diagonal matrices are of special importance, having only the three elements $A_{11}$, $A_{22}$ and $A_{33}$ along the so-called main diagonal (running from the top left downward to the bottom right) different from $0$.
The secondary diagonal (running from the top right downward to the bottom left) is, in comparison, much less important.
The matrices for rotations by a multiple of $\pi$ around a coordinate axis are examples of diagonal matrices, e.g. $\underline{A}^{(3)}(\pi)$, $\underline{A}^{(1)}(\pi)$ and $\underline{A}^{(3)}(2\pi) = \underline{1}$.
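Setting $\varphi = \pi$ or $\varphi = 2\pi$ in the rotation matrix about the 3-axis, for instance, the off-diagonal sine terms vanish (whatever sign convention is used for them), and one obtains

$$\underline{A}^{(3)}(\pi) = \begin{pmatrix} -1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad \underline{A}^{(3)}(2\pi) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} = \underline{1}.$$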
Halfway to the diagonal structure, the triangular form is worth mentioning, which has only zeros either above or below the main diagonal:
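An upper triangular matrix, for instance, has all elements below the main diagonal equal to zero:

$$\begin{pmatrix} A_{11} & A_{12} & A_{13} \\ 0 & A_{22} & A_{23} \\ 0 & 0 & A_{33} \end{pmatrix}$$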
Matrices in box form are also especially convenient for many purposes. In these matrices non-zero elements sit only in "boxes" along the main diagonal. Our rotation matrices $\underline{A}^{(3)}(\varphi)$ and $\underline{A}^{(1)}(\varphi)$ are of this kind.
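For the rotation about the 3-axis, for example, the non-zero elements form a $2 \times 2$ box and a $1 \times 1$ box along the main diagonal (the signs of the sine terms depend on the convention chosen for $\underline{A}^{(3)}(\varphi)$ earlier in the chapter):

$$\underline{A}^{(3)}(\varphi) = \begin{pmatrix} \cos\varphi & \sin\varphi & 0 \\ -\sin\varphi & \cos\varphi & 0 \\ 0 & 0 & 1 \end{pmatrix}$$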
A simple operation that can be carried out with every matrix is transposition: this means the reflection of all matrix elements through the main diagonal, or in other words, the exchange of lines and columns:

$$\left(\underline{A}^{T}\right)_{zs} = A_{sz}$$
There exist matrices for which transposition does not change anything: they are called symmetric, $A_{zs} = A_{sz}$.
These symmetric matrices occur very often in physics and have the advantage that they can be brought to diagonal form by certain simple transformations.
As you see immediately, a symmetric $3 \times 3$ matrix has only six independent elements.
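Written out, a symmetric $3 \times 3$ matrix looks as follows; its six independent elements are the three on the main diagonal and the three above it:

$$\begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{12} & A_{22} & A_{23} \\ A_{13} & A_{23} & A_{33} \end{pmatrix}$$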
If the reflection through the main diagonal leads to a minus sign, $A_{zs} = -A_{sz}$, the matrix is called antisymmetric. Of course the diagonal elements have to vanish in this case. Evidently an antisymmetric $3 \times 3$ matrix has only three independent elements. That is the deeper reason for the existence of a vector product in three dimensions, as we will soon see in more detail.
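Written out, an antisymmetric $3 \times 3$ matrix contains only three independent numbers, which can later be reinterpreted as the components of a vector:

$$\begin{pmatrix} 0 & A_{12} & A_{13} \\ -A_{12} & 0 & A_{23} \\ -A_{13} & -A_{23} & 0 \end{pmatrix}$$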
Finally we mention a special quantity of every matrix: the sum of the elements along the main diagonal is called the trace (in German: "Spur") of the matrix:

$$\operatorname{tr}\underline{A} := \sum_{z=1}^{3} A_{zz} = A_{11} + A_{22} + A_{33}$$
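For the rotation matrix about the 3-axis sketched above, for example, one reads off $\operatorname{tr}\underline{A}^{(3)}(\varphi) = 2\cos\varphi + 1$, whatever sign convention is used for the sine terms.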
Much more important for physics, however, is the multiplication of two $3 \times 3$ matrices, which in the case of transformation matrices corresponds to two transformations of the coordinate system carried out one after the other. The following multiplication rule holds:

matrix multiplication: $\quad \underline{C} = \underline{B}\,\underline{A} \;\Longleftrightarrow\; C_{zs} = \sum_{k=1}^{3} B_{zk} A_{ks} \equiv B_{zk} A_{ks}$

In the last part above the summation symbol is omitted according to the Einstein summation convention (see Section 9.2.3.2), since the two identical indices signal the summation clearly enough.
To calculate the product matrix element $C_{zs}$ in the $z$-th line and the $s$-th column, you may imagine the $s$-th (vertical) column of the factor matrix $\underline{A}$ on the right side put horizontally on top of the $z$-th line of the left factor matrix $\underline{B}$, the elements lying on top of each other multiplied and the three products added: e.g. $C_{32} = B_{31}A_{12} + B_{32}A_{22} + B_{33}A_{32}$, thus altogether $C_{zs} = B_{z1}A_{1s} + B_{z2}A_{2s} + B_{z3}A_{3s}$.
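As a quick numerical cross-check of this row-times-column rule, here is a minimal sketch; the two matrices and the numpy usage are purely illustrative, not taken from the text:

```python
import numpy as np

# Two arbitrary real 3x3 matrices, B as the left and A as the right factor
B = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])

C = B @ A  # matrix product C = B A

# Element C_32 (1-based counting as in the text): third line of B times second column of A
c32 = B[2, 0] * A[0, 1] + B[2, 1] * A[1, 1] + B[2, 2] * A[2, 1]
print(np.isclose(C[2, 1], c32))  # True
```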
Exercise 9.10: Matrix multiplication
Multiply the following transformation matrices:
The most important discovery to be made by working through Exercise 9.10 is the fact that in general no commutative law holds for rotations, and consequently not for the matrices representing them. You can easily check this visually with any match box, as illustrated in the following figure:
Figure 9.10: Match box, first rotated by 90° around the 3-axis and then by 90° around the 1-axis, compared with a box which is first rotated around the 1-axis and afterwards around the 3-axis.
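The same check can be done numerically. The sketch below uses 90° rotations about the 3-axis and the 1-axis; the particular sign convention chosen for the matrices does not affect the conclusion:

```python
import numpy as np

def rot3(phi):
    """Rotation matrix about the 3-axis."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def rot1(phi):
    """Rotation matrix about the 1-axis."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

A3, A1 = rot3(np.pi / 2), rot1(np.pi / 2)
print(np.allclose(A1 @ A3, A3 @ A1))  # False: the order of the two rotations matters
```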
The examples from Exercise 9.10 have already shown you that in some exceptional cases the commutative law nevertheless holds: all rotations around one and the same axis, for instance, are commutable. Also all diagonal matrices are commutable with each other. This is the reason for their popularity. If $\underline{A}\,\underline{B} \neq \underline{B}\,\underline{A}$, the so-called commutator $[\underline{A}, \underline{B}] := \underline{A}\,\underline{B} - \underline{B}\,\underline{A}$ promises to be an interesting quantity. This will acquire great significance in quantum mechanics later on.
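Continuing the numerical sketch above (reusing the rot3 and rot1 helpers defined there), the commutator can be written as a small function:

```python
def commutator(A, B):
    """Commutator [A, B] = A B - B A of two matrices."""
    return A @ B - B @ A

print(np.allclose(commutator(rot3(0.3), rot3(1.1)), 0))  # True: same axis, the rotations commute
print(np.allclose(commutator(rot3(0.3), rot1(1.1)), 0))  # False: different axes, non-zero commutator
```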
Apart from commutability, matrix multiplication behaves as expected: an Associative Law holds:

$$\underline{C}\,(\underline{B}\,\underline{A}) = (\underline{C}\,\underline{B})\,\underline{A} = \underline{C}\,\underline{B}\,\underline{A}$$
Exercise 9.11: Associative Law for matrix multiplication
Verify the Associative Law for the Euler rotation:
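Since the specific Euler angles of the exercise are not reproduced here, the following sketch uses arbitrary angles as stand-ins (again reusing rot3 and rot1 from above); it only illustrates that the grouping of the factors does not matter:

```python
A, B, C = rot3(0.4), rot1(1.2), rot3(-0.7)  # stand-ins for the three Euler rotations
print(np.allclose(C @ (B @ A), (C @ B) @ A))  # True: the associative law holds
```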
Only with the inverse matrix must we be a little careful: it does not exist for every matrix. For our transformation matrices, however, this constraint is unimportant. For these matrices the inverse is simply the transposed matrix, $\underline{A}^{-1} = \underline{A}^{T}$, which exists in every case, as we have seen. Mathematicians call these matrices orthogonal (see Section 9.8.2), and we will inspect them carefully later on:

$$\underline{A}^{T}\underline{A} = \underline{A}\,\underline{A}^{T} = \underline{1}$$
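A last numerical sketch (same rot3 helper as before): for a rotation matrix the transpose indeed acts as the inverse:

```python
A = rot3(0.9)
print(np.allclose(A.T @ A, np.eye(3)))  # True: A^T A = 1, so A^{-1} = A^T
```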