Systems of linear equations and matrices: Matrices
Multiplication of matrices
Matrix multiplication is somewhat more complicated than matrix addition because it does not work component-wise, and because both the sizes of the matrices and the order in which they are multiplied matter.
The matrix product
We define the product \(A\,B\) of two matrices \(A\) and \(B\) only if every row of \(A\) is as long as each column of \(B\). So if \(A\) is an \((m\times n)\)-matrix, then \(B\) needs to be an \((n\times p)\)-matrix for some \(p\). If this is the case, then the matrix product \(A\,B\) is an \((m\times p)\)-matrix. The element \(c_{ij}\) in the \(i\)-th row and the \(j\)-th column of the matrix product \( C=A\,B\) is defined as follows:
\[
c_{ij}=a_{i1}b_{1j}+a_{i2}b_{2j}+\cdots +a_{in}b_{nj} \quad\text{for }\quad i=1,\ldots, m; \, j=1,\ldots, p
\]
We can also write the right-hand side of the definition of \(c_{ij}\) as the dot product of the #i#-th row vector of #A# and the #j#-th column vector of #B# (both interpreted as column vectors), or as the matrix product of the #i#-th row vector of #A# and the #j#-th column vector of #B# (interpreted as matrices): \[c_{ij} = \cv{a_{i1}\\ \vdots\\ a_{in}}\boldsymbol{\cdot}\cv{b_{1j}\\ \vdots\\ b_{nj}}= \matrix{a_{i1}& \cdots & a_{in}}\matrix{b_{1j}\\ \vdots\\ b_{nj}}\]
The product of the matrices #A# and #B# can be visualized as follows: \[\text{If}\quad A=\matrix{a_{11} & \cdots & a_{1n}\\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn}}\quad\text{and}\quad B=\matrix{b_{11} & \cdots & b_{1p}\\ \vdots & \ddots & \vdots \\ b_{n1} & \cdots & b_{np}}\] then \(C=AB\) is the \((m\times p)\)-matrix whose element \(c_{ij}\) is the dot product of the magenta-colored row and column of the matrices #A# and #B#: \[\matrix{a_{11} & a_{12} &\cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots\\ \color{magenta}{a_{i1}} & \color{magenta}{a_{i2}} & \color{magenta}{\cdots} & \color{magenta}{a_{in}}\\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn}} \matrix{b_{11} & b_{12} & \cdots & \color{magenta}{b_{1j}} & \cdots & b_{1p} \\ b_{21} & b_{22} & \cdots & \color{magenta}{b_{2j}} & \cdots & b_{2p}\\ \vdots & \vdots & \ddots & \color{magenta}{\vdots} & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & \color{magenta}{b_{nj}} & \cdots & b_{np}} = \matrix{ c_{11} & \cdots & c_{1p} \\ \vdots & \color{magenta}{c_{ij}} & \vdots \\ c_{m1} & \cdots & c_{mp}} \]
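To make the definition concrete, the following minimal Python sketch computes \(C=A\,B\) exactly as in the formula for \(c_{ij}\) above; the function name matmul and the list-of-lists representation of matrices are illustrative assumptions, not part of the text.

```python
# Minimal sketch of the definition: c_ij = a_i1*b_1j + ... + a_in*b_nj.
# Matrices are represented as lists of rows; "matmul" is just an illustrative name.

def matmul(A, B):
    m, n = len(A), len(A[0])      # A is an (m x n)-matrix
    n2, p = len(B), len(B[0])     # B must be an (n x p)-matrix
    if n != n2:
        raise ValueError("every row of A must be as long as each column of B")
    # c_ij is the dot product of the i-th row of A and the j-th column of B
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

# A (2 x 3)-matrix times a (3 x 2)-matrix gives a (2 x 2)-matrix:
print(matmul([[1, 2, 3], [4, 5, 6]],
             [[7, 8], [9, 10], [11, 12]]))   # [[58, 64], [139, 154]]
```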
Use as many examples as you need to become familiar with the product of matrices.
\[\begin{aligned}
\left[\matrix{\color{magenta}{5}&\color{magenta}{-3}\\4&3}\cdot\matrix{-2&\color{magenta}{1}\\0&\color{magenta}{5}}\right]_{12} &=\matrix{\color{magenta}{5}\\\color{magenta}{-3}}\boldsymbol{\cdot}\matrix{\color{magenta}{1}\\\color{magenta}{5}}\\ \\
&= 5 \cdot 1 -3 \cdot 5\\\\ &=-10\tiny{.}
\end{aligned}\] The other elements of the matrix product can be calculated in the same way.
\[\begin{aligned}
\matrix{5&-3\\4&3}\cdot \matrix{-2&1\\0&5}&=\matrix{5 \cdot (-2) -3\cdot 0 & 5 \cdot 1 -3 \cdot 5\\4\cdot (-2)+3\cdot 0&4\cdot 1 + 3 \cdot 5}\\ \\
&=\matrix{-10&-10\\-8&19}\tiny{.}
\end{aligned}\]
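As an independent check of the calculation above, the same product can be computed with NumPy; NumPy and its @ operator are not part of the text and are used here only for verification.

```python
# Verifying the worked example with NumPy.
import numpy as np

A = np.array([[5, -3],
              [4,  3]])
B = np.array([[-2, 1],
              [ 0, 5]])

print(A @ B)
# [[-10 -10]
#  [ -8  19]]
```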
For matrices \(A\), \(B\), and \(C\) and a scalar \(\lambda\), the following rules of calculation hold, provided the sizes of the matrices are such that the sums and products involved are defined:
\[
\begin{array}{rcl}
A\,(B+C)&=& A\,B+A\,C \\
(\lambda A)\,B&=&\lambda (A\,B) \\
(A\,B)\,C&=&A\,(B\,C) \\
(A\,B)^{\top}&=&B^{\top}A^{\top} \end{array} \]
Thanks to the second rule we can write \(\lambda\, A\,B\) without parentheses; it does not matter whether we calculate this expression as \((\lambda\, A)\,B\) or as \(\lambda\,(A\,B)\). Similarly, the third rule enables us to write \(A\,B\,C\); the order in which the products are calculated does not matter.
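These rules can also be spot-checked numerically. The sketch below does so with NumPy for one arbitrary choice of sizes and entries; the sizes \(2\times 3\), \(3\times 4\), \(4\times 2\) and the random integer entries are assumptions made only for this illustration.

```python
# Numerical spot-check of the four rules above.
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(2, 3))
B = rng.integers(-5, 6, size=(3, 4))
C = rng.integers(-5, 6, size=(3, 4))   # same size as B, so B + C is defined
D = rng.integers(-5, 6, size=(4, 2))   # plays the role of C in (AB)C = A(BC)
lam = 3

print(np.array_equal(A @ (B + C), A @ B + A @ C))     # A(B+C) = AB + AC
print(np.array_equal((lam * A) @ B, lam * (A @ B)))   # (lambda A)B = lambda (AB)
print(np.array_equal((A @ B) @ D, A @ (B @ D)))       # (AB)D = A(BD): associativity
print(np.array_equal((A @ B).T, B.T @ A.T))           # (AB)^T = B^T A^T
# All four checks print True.
```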