Eigenvalues and eigenvectors of a rank-1 matrix
In this post I discuss a nice linear algebra problem that I came across recently.
Consider a matrix $M$ constructed as the outer product of a column vector $v$ with itself:
\[\begin{equation} v = [v_1, v_2, \dots, v_n]^T, \quad\quad M = v\, v^T. \end{equation}\]The resulting $M$ is a symmetric $n \times n$ matrix:
\[\begin{equation}\label{M} M = \begin{bmatrix} v_{1}^2 & v_1 v_2 & \dots & v_1 v_n \\ v_2 v_1 & v_2^2 & \dots & v_2 v_n \\ \vdots & & \ddots & \vdots\\ v_n v_1 & \dots & \dots & v_{n}^2 \end{bmatrix}. \end{equation}\]Notice that the $i$-th column of $M$ can be written as
\[v_i \begin{bmatrix}v_1 \\ \vdots \\ v_n\end{bmatrix}\]where $v_i$ is the $i$-th component of $v$ for $i = 1,\dots, n$. In other words, every column is a scalar multiple of $v$: whenever $v_i \neq 0$, scaling the $i$-th column by $v_j / v_i$ yields the $j$-th. Therefore the column space is one-dimensional and $M$ is a rank-1 matrix. On the other hand, since $M$ is a symmetric $n\times n$ matrix, it is orthogonally diagonalizable, with its eigenvalues on the diagonal:
\[\begin{equation} Q^{T} M Q = M_D = \begin{bmatrix} \lambda_{1} & 0 & \dots & 0 \\ 0 & \lambda_2 & \dots & 0\\ \vdots & & \ddots & \vdots\\ 0 & \dots & \dots & \lambda_{n} \end{bmatrix} \end{equation}\]where $Q$ is an orthogonal matrix ($Q^T = Q^{-1}$) whose columns are the corresponding unit-norm eigenvectors of $M$. Taking the determinant and the trace of this relation, we have
\[\begin{equation}\label{det} \textrm{det}(M) = \textrm{det}(M_D) = \prod_{i = 1}^n \lambda_i = 0 , \quad\quad \textrm{tr}(M) = \textrm{tr}(M_D) = \sum_{i = 1}^{n} \lambda_i = v^T v = |v|^2. \end{equation}\]Since $M$ has rank 1, its null space is $(n-1)$-dimensional, so $\lambda = 0$ is an eigenvalue with multiplicity $n-1$; the trace relation in \eqref{det} then forces the single remaining eigenvalue to be $\lambda = v^T v = |v|^2$. Finally, notice that
\[\begin{equation} M v = v\, (v^T v) = |v|^2 v, \end{equation}\]so that the vector $v$ itself is an eigenvector corresponding to $\lambda = v^T v$. On the other hand, for the zero eigenvalue (with multiplicity $n-1$), any vector $x$ that is orthogonal to $v$ is an eigenvector:
\[\begin{equation} M x = v\, (v^T x) = 0 \quad\Longleftrightarrow\quad v^T x = 0 \quad\Longleftrightarrow\quad x \perp v. \end{equation}\]For a non-zero vector $v \in \mathbb{R}^n$, the subspace orthogonal to $v$ has dimension $n-1$, so there are $n-1$ linearly independent such eigenvectors.
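To make this concrete, here is a minimal numerical check using `numpy` (the vectors `v` and `x` below are just arbitrary examples of my choosing): it confirms that $M = v\, v^T$ has rank 1, that its only non-zero eigenvalue is $|v|^2$, that $v$ itself is a corresponding eigenvector, and that vectors orthogonal to $v$ are sent to zero.

```python
import numpy as np

# Arbitrary example vector; any non-zero v in R^n works.
v = np.array([1.0, 2.0, 3.0, 4.0])

# Rank-1 symmetric matrix built as the outer product of v with itself.
M = np.outer(v, v)

print(np.linalg.matrix_rank(M))          # 1

# For a symmetric matrix, eigvalsh returns real eigenvalues in ascending order.
eigvals = np.linalg.eigvalsh(M)
print(eigvals)                           # three ~0 entries and |v|^2 = 30
print(np.isclose(eigvals[-1], v @ v))    # True: the non-zero eigenvalue is v^T v

# v itself is an eigenvector for lambda = |v|^2 ...
print(np.allclose(M @ v, (v @ v) * v))   # True

# ... and any x orthogonal to v is mapped to zero (eigenvalue 0).
x = np.array([2.0, -1.0, 0.0, 0.0])      # v^T x = 0
print(np.allclose(M @ x, 0.0))           # True
```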
The properties we discussed for the special rank-1 matrix $M$ in \eqref{M} can in fact be generalized to any rank-1 matrix. As in the special case above, any rank-1 matrix has columns that are scalar multiples of each other. A matrix with this property can be written generically as the outer product of two non-zero column vectors $v, w \in \mathbb{R}^n$:
\[\begin{equation}\label{Mg} M = v \, w^T = \begin{bmatrix}v_1 \\ \vdots \\ v_n\end{bmatrix} \begin{bmatrix} w_1 \dots w_n \end{bmatrix} = \begin{bmatrix} v_{1}\,w_1 & v_1 w_2 & \dots & v_1 w_n \\ v_2 w_1 & v_2 w_2 & \dots & v_2 w_n \\ \vdots & & \ddots & \vdots\\ v_n w_1 & \dots & \dots & v_{n}\, w_n \end{bmatrix}. \end{equation}\]A rank-1 $n \times n$ matrix is singular, $\det M = 0$, and its null space has dimension $n-1$; therefore it has an eigenvalue $\lambda = 0$ with multiplicity $n-1$. The corresponding eigenvectors satisfy $M x = v\, (w^T x) = 0$ and therefore belong to the $(n-1)$-dimensional subspace orthogonal to $w \in \mathbb{R}^n$. For the remaining non-zero eigenvalue, we write $M x = v\, (w^T x) = \lambda x$, which implies that $x$ is a scalar multiple of $v$:
\[x = \frac{(w^T x)}{\lambda}\, v,\]and therefore $v$ itself is an eigenvector: $M v = v\, (w^T v) = \lambda v$, with corresponding eigenvalue $\lambda = w^T v$. This is a scalar quantity and in fact equals the trace of the matrix $M$ in \eqref{Mg}:
\[\lambda = w^T v = \sum_{i = 1}^n v_i \, w_i = \textrm{tr}(M).\]These facts let us conclude that a rank-1 $n \times n$ matrix $M$ has two eigenvalues: $\lambda = 0$ with multiplicity $n-1$, and $\lambda = \textrm{tr}(M)$. (If $\textrm{tr}(M) = w^T v$ happens to be zero, the two coincide: $\lambda = 0$ then has algebraic multiplicity $n$ and $M$ is not diagonalizable.)
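The general case is just as easy to verify numerically. Below is a minimal sketch in `numpy`, again with arbitrary example vectors `v`, `w`, and `x` of my choosing: the eigenvalues of $M = v\, w^T$ come out as $\textrm{tr}(M) = w^T v$ plus $n-1$ zeros, and $v$ is an eigenvector for the non-zero eigenvalue.

```python
import numpy as np

# Arbitrary example vectors; any non-zero v, w in R^n work.
v = np.array([1.0, -2.0, 0.5, 3.0])
w = np.array([2.0, 1.0, 4.0, -1.0])

# General rank-1 matrix M = v w^T.
M = np.outer(v, w)

# M is not symmetric, so use the general eigenvalue routine.
eigvals = np.sort(np.linalg.eigvals(M).real)
print(eigvals)                           # [-1, ~0, ~0, ~0]: w^T v = -1 plus three zeros
print(np.isclose(np.trace(M), w @ v))    # True: tr(M) = w^T v

# v is an eigenvector for the non-zero eigenvalue lambda = w^T v.
print(np.allclose(M @ v, (w @ v) * v))   # True

# Any x orthogonal to w is an eigenvector for lambda = 0.
x = np.array([1.0, -2.0, 0.0, 0.0])      # w^T x = 0
print(np.allclose(M @ x, 0.0))           # True
```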