Matrices as collections of vectors
Trace, scalar product
Some special matrices
Matrices can be viewed simply as a collection of vectors of the same size, that is, as a collection of points in a high-dimensional space.
Matrices can be described in column-wise fashion: given $n$ vectors $a_1, \ldots, a_n$ in $\mathbb{R}^m$, we can define the $m \times n$ matrix $A$ with the $a_i$'s as columns:
$$A = [a_1, \ldots, a_n].$$
Geometrically, $A$ represents $n$ points in an $m$-dimensional space.
The notation $A_{ij}$ denotes the element of $A$ sitting in row $i$ and column $j$. The transpose of an $m \times n$ matrix $A$, denoted by $A^T$, is the $n \times m$ matrix with $(i,j)$ element $A_{ji}$, $1 \le i \le n$, $1 \le j \le m$.
Similarly, we can describe a matrix in row-wise fashion: given $m$ vectors $a_1, \ldots, a_m$ in $\mathbb{R}^n$, we can define the $m \times n$ matrix $A$ with the transposed vectors $a_i^T$ as rows:
$$A = \begin{bmatrix} a_1^T \\ \vdots \\ a_m^T \end{bmatrix}.$$
Geometrically, $A$ represents $m$ points in an $n$-dimensional space.
The notation $\mathbb{R}^{m \times n}$ denotes the set of $m \times n$ matrices with real entries.
We define the matrix-vector product between an $m \times n$ matrix $A$ and an $n$-vector $x$, and denote by $Ax$, the $m$-vector with $i$-th component
$$(Ax)_i = \sum_{j=1}^n A_{ij} x_j, \quad i = 1, \ldots, m.$$
If the columns of $A$ are given by the vectors $a_j$, $j = 1, \ldots, n$, so that $A = [a_1, \ldots, a_n]$, then $Ax$ can be interpreted as a linear combination of these columns, with weights given by the vector $x$:
$$Ax = \sum_{j=1}^n x_j a_j.$$
Alternatively, if the rows of $A$ are the row vectors $a_i^T$, $i = 1, \ldots, m$:
$$A = \begin{bmatrix} a_1^T \\ \vdots \\ a_m^T \end{bmatrix},$$
then $Ax$ is the vector with elements $a_i^T x$, $i = 1, \ldots, m$:
$$Ax = \begin{bmatrix} a_1^T x \\ \vdots \\ a_m^T x \end{bmatrix}.$$
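The three equivalent views of the matrix-vector product can be checked numerically. A minimal NumPy sketch, using an arbitrary illustrative $3 \times 2$ matrix and $2$-vector:

```python
import numpy as np

# Illustrative values (not from the text): a 3x2 matrix A and a 2-vector x.
A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])
x = np.array([10., 20.])

# 1) Componentwise definition: (Ax)_i = sum_j A_ij x_j.
y_def = np.array([sum(A[i, j] * x[j] for j in range(2)) for i in range(3)])

# 2) Column interpretation: Ax is a linear combination of the columns of A.
y_cols = x[0] * A[:, 0] + x[1] * A[:, 1]

# 3) Row interpretation: the i-th element of Ax is the scalar product a_i^T x.
y_rows = np.array([A[i, :] @ x for i in range(3)])

print(y_def, y_cols, y_rows)  # all three agree with A @ x
```

All three computations return the same vector as NumPy's built-in product `A @ x`.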
Return to the network example, involving an $m \times n$ incidence matrix $A$. We note that, by construction, the columns of $A$ sum to zero, which can be compactly written as $\mathbf{1}^T A = 0$, or $A^T \mathbf{1} = 0$.
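As a sanity check, here is a hypothetical incidence matrix for a small directed graph (3 nodes, 3 arcs; the sign convention of $+1$ at an arc's start node and $-1$ at its end is an assumption for illustration):

```python
import numpy as np

# Hypothetical incidence matrix: column j has +1 at the start node of arc j
# and -1 at its end node, so every column sums to zero by construction.
A = np.array([[ 1.,  0., -1.],
              [-1.,  1.,  0.],
              [ 0., -1.,  1.]])

ones = np.ones(3)
print(ones @ A)   # 1^T A = 0: the vector of column sums is zero
print(A.T @ ones) # equivalently, A^T 1 = 0
```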
We can extend the matrix-vector product to the matrix-matrix product, as follows. If $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, the notation $AB$ denotes the $m \times p$ matrix with $(i,j)$ element given by
$$(AB)_{ij} = \sum_{k=1}^n A_{ik} B_{kj}.$$
It can be shown that transposing a product changes the order, so that $(AB)^T = B^T A^T$.
If the columns of $B$ are given by the vectors $b_j$, $j = 1, \ldots, p$, so that $B = [b_1, \ldots, b_p]$, then $AB$ can be written as
$$AB = [Ab_1, \ldots, Ab_p].$$
In other words, $AB$ results from transforming each column $b_j$ of $B$ into $Ab_j$.
The matrix-matrix product can also be interpreted as an operation on the rows of $A$. Indeed, if $A$ is given by its rows $a_i^T$, $i = 1, \ldots, m$, then $AB$ is the matrix obtained by transforming each one of these rows via $B$, into $a_i^T B$, $i = 1, \ldots, m$:
$$AB = \begin{bmatrix} a_1^T B \\ \vdots \\ a_m^T B \end{bmatrix}.$$
(Note that the $a_i^T B$'s are indeed row vectors, according to our matrix-vector rules.)
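Both interpretations of the matrix-matrix product, along with the transpose rule, can be verified on small illustrative matrices (the values below are arbitrary):

```python
import numpy as np

A = np.array([[1., 0.],
              [2., 1.]])          # 2x2
B = np.array([[1., 2., 3.],
              [4., 5., 6.]])      # 2x3

# Column interpretation: column j of AB is A applied to column j of B.
AB_cols = np.column_stack([A @ B[:, j] for j in range(3)])

# Row interpretation: row i of AB is row i of A multiplied by B.
AB_rows = np.vstack([A[i, :] @ B for i in range(2)])

print(AB_cols)  # both agree with A @ B
print(np.allclose((A @ B).T, B.T @ A.T))  # transposing reverses the order
```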
Matrix algebra generalizes to blocks, provided block sizes are consistent. To illustrate this, consider the matrix-vector product between an $m \times n$ matrix $A$ and an $n$-vector $x$, where $A, x$ are partitioned in blocks, as follows:
$$A = [A_1 \;\; A_2], \quad x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix},$$
where $A_i$ is $m \times n_i$, $x_i \in \mathbb{R}^{n_i}$, $i = 1, 2$, $n_1 + n_2 = n$. Then
$$Ax = A_1 x_1 + A_2 x_2.$$
Likewise, if an $n \times p$ matrix $B$ is partitioned into two blocks $B_i$, each of size $n_i \times p$, $i = 1, 2$, with $n_1 + n_2 = n$, then
$$AB = [A_1 \;\; A_2] \begin{bmatrix} B_1 \\ B_2 \end{bmatrix} = A_1 B_1 + A_2 B_2.$$
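A short numerical check of both block identities, with hypothetical sizes $m = 2$, $n_1 = 3$, $n_2 = 1$, $p = 3$:

```python
import numpy as np

# A is 2x4, split column-wise into A1 (2x3) and A2 (2x1); x is split conformably.
A = np.arange(8.).reshape(2, 4)
A1, A2 = A[:, :3], A[:, 3:]
x = np.array([1., 2., 3., 4.])
x1, x2 = x[:3], x[3:]
print(np.allclose(A @ x, A1 @ x1 + A2 @ x2))  # Ax = A1 x1 + A2 x2

# B is 4x3, split row-wise into B1 (3x3) and B2 (1x3).
B = np.arange(12.).reshape(4, 3)
B1, B2 = B[:3, :], B[3:, :]
print(np.allclose(A @ B, A1 @ B1 + A2 @ B2))  # AB = A1 B1 + A2 B2
```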
Example: Gram matrix.
The trace of an $n \times n$ square matrix $A$, denoted by $\mathbf{tr}\, A$, is the sum of its diagonal elements: $\mathbf{tr}\, A = \sum_{i=1}^n A_{ii}$.
We can define the scalar product between two $m \times n$ matrices $A, B$ via
$$\langle A, B \rangle = \mathbf{tr}(A^T B) = \sum_{i=1}^m \sum_{j=1}^n A_{ij} B_{ij}.$$
We can interpret the above scalar product as the (vector) scalar product between two long vectors of length $mn$ each, obtained by stacking all the columns of $A$ (respectively, $B$) on top of each other.
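The three expressions for the matrix scalar product — the trace form, the double sum, and the dot product of the stacked-columns vectors — can be checked on arbitrary illustrative matrices:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.], [5., 6.]])
B = np.array([[1., 0.], [0., 1.], [1., 1.]])

# Trace form: <A, B> = tr(A^T B).
sp_trace = np.trace(A.T @ B)

# Double-sum form: sum of elementwise products.
sp_elem = np.sum(A * B)

# Stacked-columns form: dot product of the column-major flattenings.
sp_vec = A.flatten(order='F') @ B.flatten(order='F')

print(sp_trace, sp_elem, sp_vec)  # all three coincide
```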
Important classes of matrices include the following.
The identity matrix (often denoted $I_n$, or simply $I$, if context allows), has ones on its diagonal and zeros elsewhere. It is diagonal, symmetric, and orthogonal, and satisfies $AI_n = A$ for every matrix $A$ with $n$ columns.
Square matrices are matrices that have the same number of rows as columns.
Diagonal matrices are square matrices with $A_{ij} = 0$ when $i \ne j$.
Symmetric matrices are square matrices that satisfy $A_{ij} = A_{ji}$ for every pair $(i, j)$. An entire topic is devoted to symmetric matrices.
Orthogonal matrices are square matrices $U = [u_1, \ldots, u_n]$, such that the columns $u_1, \ldots, u_n$ form an orthonormal basis. If $U$ is an orthogonal matrix, then
$$u_i^T u_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{otherwise.} \end{cases}$$
Thus, $U^T U = I_n$. Similarly, $U U^T = I_n$.
Orthogonal matrices correspond to bases that are a rotation of the standard basis. Their effect on a vector is to rotate it, leaving its length (Euclidean norm) invariant: for every vector $x$,
$$\|Ux\|_2^2 = (Ux)^T (Ux) = x^T U^T U x = x^T x = \|x\|_2^2.$$
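A quick check of the two orthogonality identities and of norm invariance, using a $2 \times 2$ rotation matrix (the angle $0.3$ is an arbitrary illustrative choice):

```python
import numpy as np

# A 2x2 rotation by angle t, which is orthogonal.
t = 0.3
U = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# Orthonormal columns: U^T U = U U^T = I.
print(np.allclose(U.T @ U, np.eye(2)), np.allclose(U @ U.T, np.eye(2)))

# Norm invariance: ||U x||_2 = ||x||_2 for any x.
x = np.array([3., 4.])
print(np.linalg.norm(U @ x), np.linalg.norm(x))
```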
Example: an orthogonal matrix.