
Let $A\in\mathbb{R}^{n\times n}$ be a symmetric matrix. Consider the bordered matrix $$ B(x, v) \;=\; \begin{bmatrix} A & v\\[2pt] v^{\top} & x \end{bmatrix}\in\mathbb{R}^{(n+1)\times(n+1)}. $$

If I need to calculate the eigenvalues of $B(x, v)$ for many different values of $(x, v)$, can I reuse any shared work to save time and computing resources, for example the eigendecomposition of $A$?
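To make the setting concrete, this is roughly what I am doing now (a minimal NumPy sketch; the sizes and the number of $(x, v)$ pairs are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                       # symmetric test matrix

def eig_bordered_naive(A, x, v):
    """Eigenvalues of B(x, v) = [[A, v], [v^T, x]], computed from scratch."""
    n = A.shape[0]
    B = np.empty((n + 1, n + 1))
    B[:n, :n] = A
    B[:n, n] = v
    B[n, :n] = v
    B[n, n] = x
    return np.linalg.eigvalsh(B)

# every (x, v) pair currently pays for a full O((n+1)^3) eigensolve of B
for _ in range(100):
    x = rng.standard_normal()
    v = rng.standard_normal(n)
    eigvals = eig_bordered_naive(A, x, v)
```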

I saw this post, which is closely related to my question, but it does not provide an answer. Someone there also linked a paper about multiple-rank updates, but I don't see how it relates to this problem, since the updated matrix is still $n \times n$ rather than $(n+1) \times (n + 1)$.

Any suggestions are appreciated!

Thank you.

  • You can reduce to the case where $A = D$ is diagonal via the orthogonal change of basis $O \oplus 1$, where $O^T AO = D$; computing the characteristic polynomial is then not hard. Commented Sep 1 at 10:28
  • @julio_es_sui_glace Won't $v$ interfere? Commented Sep 1 at 11:16
  • @user619894 Yes, it plays a non-negligible role in the computation of the eigenvalues. Commented Sep 1 at 11:38

1 Answer


Partial answer

As said in the comments, let $U \in O_n(\mathbb R)$ be such that $UAU^{\top} = D = \operatorname{diag}(\lambda_1,\dots,\lambda_n)$, and set $$R= \begin{pmatrix} U & 0\\ 0 & 1 \end{pmatrix}.$$ We then have $$RBR^{\top}= \begin{pmatrix} D & Uv\\ (Uv)^{\top} & x \end{pmatrix}.$$ Since the characteristic polynomial does not depend on the basis, we set $w= Uv = (w_1,\dots,w_n)^{\top}$. Expanding the determinant along the last column, and then each resulting minor along its $i$-th column, gives $$\begin{align*} \chi_{B} (T) &=\begin{vmatrix} T-\lambda_1 & 0 & \cdots & 0 & -w_1\\ 0 & \ddots & \ddots & \vdots & \vdots\\ \vdots & \ddots & \ddots & 0 & \vdots\\ 0& \cdots & 0 & T-\lambda_n & -w_n \\ -w_1 & \cdots & \cdots & -w_n & T-x \end{vmatrix} \\ &= (T-x) \chi_A (T) - \sum_{i=1}^n (-1)^{n+i} w_{i} \begin{vmatrix} T-\lambda_1 & 0 & \cdots & 0 & 0 & \cdots & 0 \\ 0 & \ddots & \ddots & \vdots & \vdots && \vdots\\ \vdots & \ddots & T-\lambda_{i-1}& 0 & 0 &\cdots& 0 \\ 0& \cdots & 0 & 0 & T-\lambda_{i+1} & \ddots & \vdots \\ \vdots &&\vdots&\vdots&\ddots&\ddots & 0\\ 0 &\cdots&0&0&\cdots& 0& T - \lambda_n\\ w_1 & \cdots & w_{i-1} & w_i & w_{i+1} & \cdots & w_{n} \end{vmatrix} \\ & = (T-x) \chi_A (T) - \sum_{i=1}^n (-1)^{2(n+i)} w_{i}^2 \begin{vmatrix} T-\lambda_1 & 0 & \cdots & 0 & \cdots & 0 \\ 0 & \ddots & \ddots & \vdots && \vdots\\ \vdots & \ddots & T-\lambda_{i-1}& 0 &\cdots& 0\\ 0& \cdots & 0 & T-\lambda_{i+1} & \ddots & \vdots \\ \vdots &&\vdots&\ddots&\ddots & 0\\ 0 &\cdots&0&\cdots& 0& T - \lambda_n \end{vmatrix} \\ &= (T-x) \chi_A (T) -\sum_{i=1}^n w_{i}^2 \frac{\chi_A(T)}{T-\lambda_i}\\ &= \chi_A(T) \left( T-x - \sum_{i=1}^n \frac{w_i^2}{T-\lambda_i} \right) \end{align*}$$
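(Not part of the argument above, just a numerical sanity check of the last identity; a NumPy sketch. Note that `np.linalg.eigh` returns $U$ with $A = U\,\operatorname{diag}(\lambda)\,U^{\top}$, so `U.T @ v` plays the role of $w = Uv$ in the notation above.)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # symmetric test matrix
v = rng.standard_normal(n)
x = 0.7

lam, U = np.linalg.eigh(A)             # A = U diag(lam) U^T
w = U.T @ v                            # = Uv in the notation of the answer

B = np.block([[A, v[:, None]],
              [v[None, :], np.array([[x]])]])

# check  chi_B(T) = chi_A(T) * ( T - x - sum_i w_i^2 / (T - lam_i) )
# at a test point T away from the eigenvalues of A
T = 3.21
chi_A = np.prod(T - lam)
lhs = np.linalg.det(T * np.eye(n + 1) - B)
rhs = chi_A * (T - x - np.sum(w**2 / (T - lam)))
print(np.isclose(lhs, rhs))            # True
```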

This means that for each eigenvalue $\lambda$ of $A$ of multiplicity $> 1$, you get $\chi_B(\lambda) = 0$. If $m_A(\lambda_i) = 1$, then $\chi_{B}(\lambda_i) = -w_i^2 \prod_{j\neq i}(\lambda_i-\lambda_j)$, which vanishes exactly when $w_i = 0$. We also have $\chi_B(x) = -\chi_A(x) \sum_{i=1}^n \frac{w_i^2}{x-\lambda_i}$ (for $x$ not equal to any $\lambda_i$).

So at the very least, every non-simple eigenvalue of $A$ is still an eigenvalue of $B$, with its multiplicity reduced by at most $1$. And if $w_i = 0$, then $\lambda_i$ is also an eigenvalue of $B$.
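If you want to exploit this numerically (this goes beyond the partial answer above and is only a sketch): after a one-time eigendecomposition of $A$, the eigenvalues of $B(x,v)$ are the roots of the secular function $f(t) = t - x - \sum_i w_i^2/(t-\lambda_i)$. Assuming the $\lambda_i$ are distinct and every $w_i \neq 0$, these roots strictly interlace the $\lambda_i$, so each one can be bracketed and found by one-dimensional root finding. A NumPy/SciPy sketch under those assumptions (the helper names are mine, not from any library):

```python
import numpy as np
from scipy.optimize import brentq

def precompute(A):
    """One-time O(n^3) work shared by every (x, v): eigendecomposition of A."""
    lam, U = np.linalg.eigh(A)          # A = U diag(lam) U^T
    return lam, U

def eig_bordered(lam, U, x, v, tol=1e-12):
    """Eigenvalues of [[A, v], [v^T, x]], reusing (lam, U) = eigh(A).

    Sketch only: assumes the lam_i are distinct (and reasonably separated)
    and every w_i is nonzero, so the n+1 eigenvalues of B strictly
    interlace the lam_i and each lies in its own bracket below.
    """
    w = U.T @ v                         # coordinates of v in the eigenbasis of A
    order = np.argsort(lam)
    lam, w = lam[order], w[order]

    def f(t):                           # secular function; its roots are eig(B)
        return t - x - np.sum(w**2 / (t - lam))

    # Weyl's inequality: every eigenvalue of B is within ||w|| of an
    # eigenvalue of diag(lam_1, ..., lam_n, x).
    r = np.linalg.norm(w) + 1.0
    grid = np.concatenate(([min(lam[0], x) - r], lam, [max(lam[-1], x) + r]))

    roots = []
    for a, b in zip(grid[:-1], grid[1:]):
        eps = tol * max(1.0, abs(a), abs(b))   # stay away from the poles at lam_i
        roots.append(brentq(f, a + eps, b - eps))
    return np.array(roots)

# usage: lam, U = precompute(A); then for each (x, v): eig_bordered(lam, U, x, v)
```

After the one-time $O(n^3)$ factorization, each $(x, v)$ then costs roughly $O(n^2)$ (the product $U^{\top}v$ plus $n+1$ scalar root solves) instead of $O\big((n+1)^3\big)$ for a fresh eigendecomposition. This is the same kind of secular-equation computation that divide-and-conquer symmetric eigensolvers perform internally, and the repeated-$\lambda_i$ and $w_i = 0$ cases discussed above are exactly the cases a robust implementation would deflate first.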

  • The "secular equation" $\sum_{i=1}^n \frac{w_i^2}{x-\lambda_i}=0$ can also be obtained by maximisation under constraints: see slide #11 in the very nice presentation here. Commented Sep 1 at 12:03
  • @JeanMarie Seems really interesting, I will take a deeper look at it later! Commented Sep 1 at 12:08
  • Just to make sure: $\chi_B(x) = -\chi_A(x) \cdot \sum^n_{i=1} \frac{w_i^2}{x - \lambda_i}$ only applies to $x$ that are not equal to any $\lambda_i$, right? This is helpful. But I think it does not really save time, since the program I'm using (PyTorch) does not compute eigenvalues one by one. Do you have any suggestions? Thank you. Commented Sep 1 at 13:53
  • @WinnieXi If you use a computer, you might as well recompute it. In practice, it is very unlikely for an eigenvalue to have algebraic multiplicity $>1$. Combining this with Jean Marie's comment can be useful, but computation-wise I am not sure it really helps. I don't know if this answers your question, but the probability of a repeated eigenvalue under an atomless distribution is $0$. Commented Sep 1 at 15:37
  • Yeah, that's what I got from your comments too. Thanks a lot, though. Commented Sep 2 at 9:02
