Subsection C.1.3 Matrix-vector operations
The following "matrix-vector" operations play an important role.
-
Matrix-vector multiplication: \(y:= \alpha A x + \beta y \text{,}\) where \(A \in \R^{m \times n}\text{.}\)
Let's ignore \(\beta \) because it almost always equals \(1 \text{.}\) If it does not, we can start by scaling \(y \) by \(\beta \text{.}\) Next, partitioning \(A \) by rows and \(y \) by its elements, we observe that
\begin{equation*} \alpha A x + y = \alpha \left( \begin{array}{c} \widetilde a_0^T \\ \hline \vdots \\ \hline \widetilde a_{m-1}^T \end{array} \right) x + \left( \begin{array}{c} \psi_0 \\ \hline \vdots \\ \hline \psi_{m-1} \end{array} \right) = \left( \begin{array}{c} \alpha (\widetilde a_0^T x) + \psi_0\\ \hline \vdots \\ \hline \alpha (\widetilde a_{m-1}^T x) + \psi_{m-1} \end{array} \right). \end{equation*}We notice that the cost equals, roughly, \(m \) dot products with vectors of size \(n \text{,}\) for a total of (approximately) \(2 m n \) flops. (A code sketch of this row-oriented algorithm is given after this list.)
-
Rank-1 update: \(A:= \alpha x y^T + A \text{,}\) where \(A \in \R^{m \times n}\text{.}\)
Notice that
\begin{equation*} \begin{array}{rcl} \alpha x y^T + A & = & \alpha x \left( \begin{array}{c | c | c} \psi_0 & \cdots & \psi_{n-1} \end{array} \right) + \left( \begin{array}{c | c | c} a_0 & \cdots & a_{n-1} \end{array} \right) \\ & = & \left( \begin{array}{c | c | c} ( \alpha \psi_0 ) x + a_0 & \cdots & ( \alpha \psi_{n-1} ) x + a_{n-1} \end{array} \right). \end{array} \end{equation*}We notice that the cost equals, roughly, \(n \) axpy operations with vectors of size \(m \text{,}\) for a total of (approximately) \(2 m n \) flops. (A code sketch of this column-oriented algorithm is given at the end of this subsection.)
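To make the row-oriented algorithm for \(y := \alpha A x + \beta y \) concrete, here is a minimal Python/NumPy sketch. It is not part of the original text; the function name gemv_by_rows and the in-place update of \(y \) are our own choices for illustration.
\begin{verbatim}
import numpy as np

def gemv_by_rows(alpha, A, x, beta, y):
    # Computes y := alpha * A x + beta * y, viewing A by rows:
    # each row contributes one dot product of length n (about 2n flops),
    # for roughly 2 m n flops in total.  (Name is illustrative only.)
    m, n = A.shape
    y *= beta                                  # scale y first, as in the text
    for i in range(m):
        y[i] += alpha * np.dot(A[i, :], x)     # alpha * (a~_i^T x) + psi_i
    return y

# Quick (illustrative) check against NumPy's built-in product:
A = np.random.rand(3, 4); x = np.random.rand(4); y = np.random.rand(3)
assert np.allclose(gemv_by_rows(2.0, A, x, 1.0, y.copy()), 2.0 * A @ x + y)
\end{verbatim}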
Matrix-vector operations with an \(m \times n \) matrix require \(O( mn ) \) flops.
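Similarly, here is a minimal Python/NumPy sketch of the rank-1 update \(A := \alpha x y^T + A \text{,}\) organized column by column as \(n \) axpy operations. Again, the function name ger_by_columns is ours and the code is an illustrative sketch, not taken from the text.
\begin{verbatim}
import numpy as np

def ger_by_columns(alpha, x, y, A):
    # Computes A := alpha * x y^T + A, viewing A by columns:
    # column j is updated by an axpy (alpha * psi_j) x + a_j of length m
    # (about 2m flops), for roughly 2 m n flops in total.
    m, n = A.shape
    for j in range(n):
        A[:, j] += (alpha * y[j]) * x          # (alpha psi_j) x + a_j
    return A

# Quick (illustrative) check against NumPy's outer product:
x = np.random.rand(3); y = np.random.rand(4); A = np.random.rand(3, 4)
assert np.allclose(ger_by_columns(2.0, x, y, A.copy()), 2.0 * np.outer(x, y) + A)
\end{verbatim}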