
I was reading about how the implicit function theorem can be used to express the eigenvalues of a real symmetric matrix as functions of the matrix in a neighborhood. I got stuck at equation (8), page 183, here: https://www.janmagnus.nl/papers/JRM011.pdf

Suppose $X_0$ is a real symmetric matrix with $X_0u_0 = \lambda_0u_0$ and $u_0'u_0 = 1$.

The equation claims that $$\det\begin{bmatrix} \lambda_0 I - X_0 & u_0\\ 2u_0' & 0 \end{bmatrix}\neq 0$$ if and only if $\lambda_0$ is a simple eigenvalue of $X_0$, and it also says that the determinant equals $-2$ times the product of all the other eigenvalues of $X_0$ besides $\lambda_0$.

I am not sure how the author is able to make all these claims. How does he even compute this determinant? The most natural idea is to use the block-matrix determinant formula, but $\lambda_0 I - X_0$ is not even invertible!

More importantly, I am not sure how this determinant being non-zero is connected to $\lambda_0$ being a simple eigenvalue of $X_0$.
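For what it's worth, a quick numerical experiment (just a sketch with numpy; the matrix size and seed are arbitrary choices of mine) does seem to bear the claim out: the bordered determinant comes out nonzero even though the top-left block is singular.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random real symmetric matrix; its eigenvalues are (generically) all simple.
A = rng.standard_normal((n, n))
X0 = (A + A.T) / 2

eigvals, eigvecs = np.linalg.eigh(X0)
lam0, u0 = eigvals[0], eigvecs[:, 0]          # an eigenpair with u0'u0 = 1

# The bordered matrix from equation (8).
M = np.block([[lam0 * np.eye(n) - X0, u0[:, None]],
              [2 * u0[None, :],       np.zeros((1, 1))]])

print(np.linalg.det(lam0 * np.eye(n) - X0))   # ~ 0: the top-left block is singular
print(np.linalg.det(M))                       # clearly nonzero nonetheless
```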


1 Answer


Notice that if $\lambda_0$ is not simple, then its eigenspace has dimension at least two, so we can pick another eigenvector for $\lambda_0$ that is orthogonal to $u_0$; call it $v$. Then $$ \begin{pmatrix}\lambda_0I-X_0 & u_0\\ 2u_0^T&0\end{pmatrix} \begin{pmatrix}v\\0\end{pmatrix} = \begin{pmatrix}\lambda_0v - X_0v + 0 \\ 2u_0^Tv + 0\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}, $$ so the block matrix is singular. I'm not going to prove the full "if and only if" statement, but this shows why simplicity of $\lambda_0$ matters.
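To see this concretely, here is a small numerical illustration (only a sketch, assuming numpy; the matrix with the repeated eigenvalue is hand-built by me):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Symmetric matrix with a deliberately repeated eigenvalue lam0 = 2.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal matrix
X0 = Q @ np.diag([2.0, 2.0, 5.0, 7.0]) @ Q.T
lam0 = 2.0
u0 = Q[:, 0]                                       # unit eigenvector for lam0
v = Q[:, 1]                                        # second eigenvector for lam0, orthogonal to u0

M = np.block([[lam0 * np.eye(n) - X0, u0[:, None]],
              [2 * u0[None, :],       np.zeros((1, 1))]])

print(np.linalg.det(M))                            # ~ 0: the block matrix is singular
print(np.linalg.norm(M @ np.append(v, 0.0)))       # ~ 0: (v, 0) lies in its null space
```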

To verify the claim about the value of the determinant, complete $u_0$ to an orthonormal eigenbasis $\{u_0, v_1,\dots,v_{n-1}\}$ of $X_0$, corresponding to eigenvalues $\lambda_0,\lambda_1,\dots,\lambda_{n-1}$; since $X_0$ is symmetric, such a basis exists and $u_0\perp v_i$ for each $i$. We then have $$ \begin{pmatrix}\lambda_0I-X_0 & u_0\\ 2u_0^T&0\end{pmatrix} \begin{pmatrix}v_i\\0\end{pmatrix} = \begin{pmatrix}\lambda_0v_i - X_0v_i + 0 \\ 2u_0^Tv_i + 0\end{pmatrix} = (\lambda_0 - \lambda_i)\begin{pmatrix}v_i\\0\end{pmatrix}, $$ so $\lambda_0-\lambda_1,\dots,\lambda_0-\lambda_{n-1}$ are eigenvalues of the block matrix; when $\lambda_0$ is simple, these are precisely the nonzero eigenvalues of $\lambda_0I-X_0$. (Note that the claim in the linked proof is stated in terms of this matrix, not $X_0$.) This gives us $n-1$ of the $n+1$ eigenvalues of the block matrix.
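Here is a quick numerical check of this part of the spectrum (again just a sketch assuming numpy; the random symmetric matrix and its size are my own choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

A = rng.standard_normal((n, n))
X0 = (A + A.T) / 2                               # generic symmetric matrix: simple eigenvalues
eigvals, eigvecs = np.linalg.eigh(X0)
lam0, u0 = eigvals[0], eigvecs[:, 0]             # the eigenpair playing the role of (lambda_0, u_0)

# Bordered matrix from the question.
M = np.block([[lam0 * np.eye(n) - X0, u0[:, None]],
              [2 * u0[None, :],       np.zeros((1, 1))]])

# Each (v_i, 0) is an eigenvector of M with eigenvalue lam0 - lam_i.
for lam_i, v_i in zip(eigvals[1:], eigvecs[:, 1:].T):
    w = np.append(v_i, 0.0)
    print(np.linalg.norm(M @ w - (lam0 - lam_i) * w))   # ~ 0
```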

To find the last two, notice that since $u_0^Tu_0=1$, the vectors $(u_0, \sqrt{2})^T$ and $(u_0,-\sqrt{2})^T$ are eigenvectors of the block matrix with eigenvalues $\sqrt{2}$ and $-\sqrt{2}$, respectively: applying the block matrix to $(u_0, c)^T$ gives $(c\,u_0,\, 2u_0^Tu_0)^T = (c\,u_0,\, 2)^T$, which equals $c\,(u_0, c)^T$ exactly when $c^2 = 2$. Since the determinant is the product of the eigenvalues, the determinant of the block matrix is $\left(\prod_{i=1}^{n-1} (\lambda_0-\lambda_i)\right)(\sqrt{2})(-\sqrt{2})$, which is $-2$ times the product of the nonzero eigenvalues of $\lambda_0 I - X_0$, as desired.
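And a check of the last two eigenvalues and of the determinant formula itself (same kind of sketch as above, with my own random matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

A = rng.standard_normal((n, n))
X0 = (A + A.T) / 2
eigvals, eigvecs = np.linalg.eigh(X0)
lam0, u0 = eigvals[0], eigvecs[:, 0]             # u0 is already normalized, so u0'u0 = 1

M = np.block([[lam0 * np.eye(n) - X0, u0[:, None]],
              [2 * u0[None, :],       np.zeros((1, 1))]])

# (u0, +sqrt(2)) and (u0, -sqrt(2)) are eigenvectors with eigenvalues +sqrt(2) and -sqrt(2).
for c in (np.sqrt(2), -np.sqrt(2)):
    w = np.append(u0, c)
    print(np.linalg.norm(M @ w - c * w))         # ~ 0

# det(M) = -2 * prod_i (lam0 - lam_i), the product running over the other eigenvalues.
print(np.linalg.det(M), -2 * np.prod(lam0 - eigvals[1:]))
```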

