A note on a conjecture concerning rank one perturbations of singular M-matrices

A conjecture from a paper by J. Bierkens and A.C.M. Ran concerning the location of eigenvalues of rank one perturbations of singular M-matrices is shown to be false in dimension four and higher, but true for dimension two, as well as for dimension three with an additional condition on the perturbation.


Introduction
Let H be a nonnegative n × n matrix, and let A = ρ(H)I − H, where ρ(H) denotes the spectral radius of H. Note that A has all its eigenvalues in the open right half plane, with the exception of zero. We assume that ρ(H) is a simple eigenvalue of H, i.e., both its geometric and its algebraic multiplicity are one. Then zero is a simple eigenvalue of A. Further, let v and w be nonnegative vectors. For t > 0 consider the matrix B(t) = A + tvw^T. It was shown in [3], Lemma 2.11, that there is a t_0 > 0 such that for 0 < t < t_0 the matrix B(t) has all its eigenvalues in the open right half plane under a certain extra condition, the so-called NZP condition, which will be introduced shortly. It was also shown in [3], by means of a counterexample (see Example 2.15 there), that this does not hold for all t > 0. The counterexample in [3] is of size 6 × 6.
In order to state one of the conjectures in [3], which is the focus of this note, we need to introduce a condition called NZP (non-zero projection) in [3]. For this, consider a right unit eigenvector z_r of H corresponding to ρ(H) and a left unit eigenvector z_l of H corresponding to ρ(H). Note that z_l and z_r are unique up to multiplication by −1. The vectors v and w are said to satisfy NZP if the following hold: z_l^T v ≠ 0 and w^T z_r ≠ 0.
If H is irreducible, then z r and z l have positive entries (see, e.g., [2], Chapter 2, Theorem 2.10, part c). Hence, if v and w are nonnegative vectors and H is irreducible, then the condition NZP is automatically satisfied.
Conjecture. Assume that 0 is a simple eigenvalue of A, and let v and w be nonnegative vectors satisfying NZP. Then there is a positive t_1 such that for t > t_1 the eigenvalues of B(t) = A + tvw^T are again all in the open right half plane.
In [3], Theorem 2.7, the conjecture was already shown to be true in the two-dimensional case, assuming that either the zero eigenvalue of A is simple or w^T v ≠ 0. In fact, in these cases the eigenvalues of B(t) are both in the open right half plane for all t > 0. The purpose of this short note is to show that the conjecture stated above is false in general, even under the extra assumption that H is irreducible, but true in the three-dimensional case under the assumption that H is irreducible and w^T v ≠ 0. It will also be shown that the latter condition is necessary.
To put the conjecture in context, the problem of studying the behaviour of eigenvalues of parametrized rank one perturbations B(t) = A + tvw^T of a matrix A has been studied for a long time, see e.g. [1,6]. Mostly, however, only the behaviour for small values of t has been studied ([8,16], see also [5,10] for more detailed analysis). The problem of considering the behaviour of the eigenvalues for large values of t was considered in [12,13]. Restrictions on A, v and w, allowing only certain structured matrices, were considered in e.g. [9], but only for generic vectors v and w. For the non-structured situation, without restrictions on A, v and w, and generic vectors v and w, see [4,11,14,15]. In [3] the matrix A and the vectors v and w are restricted in a different manner: A is a singular M-matrix, and v and w are nonnegative vectors. In [3], Lemma 2.11 it was shown that for small values of the parameter t > 0 the eigenvalues of B(t) are all in the open right half plane when the condition NZP is satisfied (in particular when H is irreducible). The conjecture above asks explicitly for the behaviour of the eigenvalues of B(t) for large values of t > 0.
An important role in our arguments is played by the following formula for the characteristic polynomial of B(t):

p_{B(t)}(λ) = p_A(λ) (1 − t w^T (λI − A)^{−1} v) = p_A(λ) − t w^T adj(λI − A) v,

where p_A(λ) denotes the characteristic polynomial of A and adj the adjugate matrix. Note that formally the first equality only holds for λ not one of the eigenvalues of A, but since the left-hand side and the right-hand side are polynomials, they then coincide everywhere.
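As a sanity check, the determinant identity above can be verified numerically. In the sketch below, the 2 × 2 matrix H, the vectors v and w, and the value of t are illustrative choices, not taken from the paper.

```python
import numpy as np

# Numerically verify  det(lam*I - B(t)) = p_A(lam) - t * w^T adj(lam*I - A) v
# on an illustrative 2x2 example (H, v, w and t are NOT from the paper).
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])                  # nonnegative and irreducible
rho = max(abs(np.linalg.eigvals(H)))        # spectral radius, here 1
A = rho * np.eye(2) - H                     # singular M-matrix
v = np.array([1.0, 2.0])
w = np.array([3.0, 1.0])
t = 0.7
B = A + t * np.outer(v, w)                  # B(t) = A + t v w^T

def adj2(M):
    # adjugate of a 2x2 matrix, written out explicitly
    return np.array([[M[1, 1], -M[0, 1]],
                     [-M[1, 0], M[0, 0]]])

for lam in [-1.3, -0.2, 0.4, 1.1, 2.5]:
    lhs = np.linalg.det(lam * np.eye(2) - B)
    rhs = np.linalg.det(lam * np.eye(2) - A) - t * w @ adj2(lam * np.eye(2) - A) @ v
    assert abs(lhs - rhs) < 1e-9
```

The identity is exact, so the check holds up to rounding at every sample point λ.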

Counterexample and general remarks
We start by providing a counterexample in dimension four. Let

H = [ 0.1      1    0    0
      0        0.1  1    0
      0        0    0.1  1
      10^{-4}  0    0    0.1 ].

Then H is nonnegative and irreducible, and the eigenvalues of H are 0, 0.1 ± 0.1i and 0.2, so ρ(H) = 0.2. Consider A = ρ(H)I_4 − H and B(t) = A + tvw^T for positive vectors v and w. Note that H, v and w then satisfy all the conditions of the conjecture. Following [13], Theorem 17 (see also [12], Theorem 4.1), for t → ∞ the eigenvalues of B(t) behave as follows: one is positive and approximately equal to tw^T v + O(1), and the other three converge to the roots of the cubic polynomial

p_vw(λ) = w^T adj(λI_4 − A) v.

For a suitable choice of positive vectors v and w this cubic has a root with negative real part, so that for large t one of the eigenvalues of B(t) lies in the open left half plane, and the conjecture fails. It should be noted that the statements in [12], Theorem 4.1 are made for generic vectors v and w only. However, the fact that v and w have positive entries allows us to apply the results of that theorem in this particular case, which can be seen by a close inspection of the proofs of [12], Lemma 2.1 and Theorem 4.1. The crucial condition is that w^T v ≠ 0. Alternatively, this can be seen by applying [13], Theorem 17, which does not depend on genericity of the vectors v and w. "Generic" here is taken in the algebraic-geometric sense, that is, the set of vectors (v, w) for which the stated property fails is contained in the zero set of a finite number of polynomials in the 2n variables which are the coordinates of v and w.
Obviously, once a counterexample is found in dimension four, counterexamples in any higher dimension can be constructed easily. In fact, if we denote the Jordan block with eigenvalue zero of size n by J_n, we see that apart from the 10^{-4} in the (4,1) entry, H is equal to 0.1I_4 + J_4. In dimension n we can construct H as follows: take 0.1I_n + J_n and insert in the (n,1) entry the number 10^{-n}. Then again ρ(H) = 0.2. An appropriate choice of positive vectors v and w will lead to a counterexample in any dimension. We will not go into further detail here.
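The claim ρ(H) = 0.2 for this family can be checked numerically: the eigenvalues of 0.1I_n + J_n with 10^{-n} in the (n,1) entry are 0.1 + 0.1 e^{2πik/n}, since the nilpotent part together with the corner entry has as eigenvalues the n-th roots of 10^{-n}. A short sketch:

```python
import numpy as np

# Build H_n = 0.1*I_n + J_n with 10^{-n} in the (n,1) entry and check that
# its spectral radius is 0.2, as claimed in the text.
def H_matrix(n):
    H = 0.1 * np.eye(n) + np.diag(np.ones(n - 1), 1)  # 0.1 I_n + J_n
    H[n - 1, 0] = 10.0 ** (-n)                        # the (n,1) entry
    return H

for n in range(4, 9):
    eigs = np.linalg.eigvals(H_matrix(n))
    # eigenvalues are 0.1 + 0.1*exp(2*pi*i*k/n), so rho(H_n) = 0.2
    assert abs(max(abs(eigs)) - 0.2) < 1e-8
```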
The counterexample to the conjecture leaves open the question what the conditions on A, v and w are for the eigenvalues of the matrix B(t) to be in the open right half plane for large values of t. We assume that n > 3, as the cases n = 2 and n = 3 will be discussed in the next section. We make use of [13], Theorem 17 and Lemma 16. In fact, from Theorem 17 (ii) and (iii) in [13] we see that w^T v ≠ 0 is a necessary condition, as otherwise there is at least one eigenvalue of B(t) going to infinity in a direction which is not in the open right half plane. When w^T v ≠ 0, the bounded eigenvalues converge for t → ∞ to the roots of the polynomial p_vw(λ) = w^T adj(λI − A)v, which has degree n − 1. One can then apply the Routh-Hurwitz criterion to this polynomial to see whether its roots are in the open right half plane (see, e.g., [7], Section 13.4).
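For n = 4 the polynomial p_vw is a cubic, and the Routh-Hurwitz test is easy to state explicitly. The sketch below (an illustration, not code from the paper) decides whether a cubic has all roots in the open right half plane by applying the classical cubic Routh-Hurwitz conditions to −p(−λ), and validates the test against numerically computed roots.

```python
import numpy as np

def hurwitz_stable_cubic(b3, b2, b1, b0):
    # Routh-Hurwitz for b3*x^3 + b2*x^2 + b1*x + b0 with b3 > 0:
    # all roots in the open LEFT half plane iff b2 > 0, b0 > 0, b2*b1 > b3*b0.
    return b2 > 0 and b0 > 0 and b2 * b1 > b3 * b0

def roots_in_open_rhp_cubic(a3, a2, a1, a0):
    # p has all roots in the open RIGHT half plane iff -p(-x) is Hurwitz
    # stable; replacing x by -x flips the signs of the even coefficients.
    return hurwitz_stable_cubic(a3, -a2, a1, -a0)

# hand-checked cases: roots {1,2,3}, {-1,2,3}, {1, 1±2i}, {1, -1±2i}
assert roots_in_open_rhp_cubic(1.0, -6.0, 11.0, -6.0) is True
assert roots_in_open_rhp_cubic(1.0, -4.0, 1.0, 6.0) is False
assert roots_in_open_rhp_cubic(1.0, -3.0, 7.0, -5.0) is True
assert roots_in_open_rhp_cubic(1.0, 1.0, 3.0, -5.0) is False

# cross-check against numerically computed roots of random cubics
rng = np.random.default_rng(1)
for _ in range(200):
    a2, a1, a0 = rng.uniform(-3, 3, size=3)
    expected = all(r.real > 0 for r in np.roots([1.0, a2, a1, a0]))
    assert roots_in_open_rhp_cubic(1.0, a2, a1, a0) == expected
```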

Small dimensions
We begin this section with a small variation on Lemma 2.10 in [3], specified to the situation at hand.

Lemma 3.1. Assume that zero is an algebraically simple eigenvalue of A, and let v and w be nonnegative vectors satisfying NZP. Then for every t > 0 any real eigenvalue of B(t) is positive; in particular, B(t) is invertible for all t > 0.

Proof. From Lemma 2.10 in [3], which does not require the condition NZP, we have that any real eigenvalue µ of B(t) is nonnegative. So it remains to show that the extra condition NZP implies that µ > 0. To see this, assume that B(t) has eigenvalue zero for some t > 0, and let x ≠ 0 be such that B(t)x = 0. Then 0 = z_l^T B(t)x = z_l^T (A + tvw^T)x = t z_l^T v w^T x, using z_l^T A = 0. By NZP z_l^T v ≠ 0, and since also t > 0, we must have w^T x = 0. But then 0 = B(t)x = Ax + tvw^T x = Ax. Since the zero eigenvalue of A is algebraically simple, it follows that x is a nonzero multiple of z_r. But then w^T x = 0 implies that w^T z_r = 0, which contradicts NZP. So B(t) is invertible for all positive t.
Observe that if H is irreducible then, as noted before, the condition NZP is automatically satisfied, and the zero eigenvalue of A is algebraically simple. For n = 2 it was already shown in [3], Theorem 2.7, part (i), that both eigenvalues lie in the open right half plane for all t > 0. However, for completeness we present a proof here as well, which also provides more detail about the behaviour of the eigenvalues for large values of t.
It will be shown in a later example that the condition w^T v ≠ 0 is necessary when n = 3.
For n = 2 it will be shown that the eigenvalues of B(t) are in the open right half plane for all t > 0. Indeed, as H is irreducible, A has two different eigenvalues, 0 and a positive number µ. Hence det A = 0 and trace A = µ. Then from [13], Proposition 2 (see also [12], Proposition 2.2) we have that the eigenvalues of B(t) which are not eigenvalues of A are the solutions of w^T (λI_2 − A)^{−1} v = 1/t. Multiplying left and right with the characteristic polynomial p_A(λ) of A, one sees that this is equivalent to λ being a solution of

p_A(λ) − t w^T adj(λI_2 − A) v = 0,    (1)

where adj A is the adjugate matrix of A, and adj(λI_2 − A) = λI_2 − adj A for 2 × 2 matrices. In turn, this is equivalent to λ being a solution of

λ^2 − (µ + t w^T v) λ + t w^T adj(A) v = 0.    (2)

If the solutions of this equation are both real, then they have to be positive, by Lemma 3.1. If they are non-real, then the real part of the two solutions λ_{1,2} is equal to Re(λ_{1,2}) = (1/2)(µ + t w^T v) > 0, and hence the eigenvalues lie in the open right half plane. Note that this does not require the condition w^T v ≠ 0.
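The equivalence of the eigenvalue problem for B(t) with the quadratic (2) can be illustrated numerically; the matrix H and the vectors v, w below are sample choices, not from the paper.

```python
import numpy as np

# For a sample 2x2 irreducible nonnegative H, check that the eigenvalues of
# B(t) = A + t v w^T are the roots of
#   lam^2 - (mu + t w^T v) lam + t w^T adj(A) v,   mu = trace(A),
# and that they lie in the open right half plane for every t > 0 tried.
H = np.array([[0.0, 2.0],
              [0.5, 0.0]])
rho = max(abs(np.linalg.eigvals(H)))        # = 1
A = rho * np.eye(2) - H                     # singular M-matrix, trace = mu
v = np.array([1.0, 1.0])
w = np.array([2.0, 1.0])
adjA = np.array([[A[1, 1], -A[0, 1]],
                 [-A[1, 0], A[0, 0]]])      # adjugate of A
mu = np.trace(A)

key = lambda z: (np.real(z), np.imag(z))    # sort complex numbers consistently
for t in [0.3, 1.0, 10.0]:
    B = A + t * np.outer(v, w)
    eigs = sorted(np.linalg.eigvals(B), key=key)
    quad = sorted(np.roots([1.0, -(mu + t * (w @ v)), t * (w @ adjA @ v)]), key=key)
    assert np.allclose(eigs, quad)
    assert all(np.real(z) > 0 for z in eigs)   # open right half plane
```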
In fact, when w^T v ≠ 0, we can be more precise about the behaviour of the eigenvalues for t → ∞. Either solving for the eigenvalues explicitly from (1) and (2), or by using [13], Theorem 17 (ii) and (iii) for large t (see also Remark 18 there), we see that one of the eigenvalues of B(t) goes to infinity along the real line and is equal to tw^T v + O(1). The other eigenvalue is approximately equal to ζ = w^T adj(A) v / (w^T v). Because w^T v > 0, and adj(A) has positive entries (as A is a 2 × 2 irreducible singular M-matrix), we have ζ > 0. By [13], Theorem 17 (v), for large values of t this second eigenvalue is equal to ζ + O(1/t). In the case that w^T v = 0 we can also be more precise about the behaviour of the eigenvalues for t → ∞. Solving (2) explicitly, we see that the eigenvalues go to infinity along the line Re λ_{1,2} = µ/2 as t → ∞.
The case n = 3.
In this case we shall show that for large values of t the eigenvalues of B(t) are eventually in the open right half plane under the condition w^T v ≠ 0. There are two cases to consider: the first is that for large values of t all eigenvalues are real. This is the easy case, as by Lemma 3.1 for large values of t the eigenvalues of B(t) have to be positive. In the second case, the matrix B(t) has one real eigenvalue, again positive by Lemma 3.1, and a pair of complex eigenvalues. In fact, the real eigenvalue must go to infinity along the positive real axis as tw^T v + O(1) according to [13], Theorem 17 (ii) and (iii). The complex eigenvalues then have to approximate the two roots of the polynomial p_vw(λ). So it remains to prove that the roots of p_vw(λ) are in the open right half plane.
Since A is a singular M-matrix and zero is a simple eigenvalue of A by the irreducibility of the nonnegative matrix H, the characteristic polynomial of A is of the form p_A(λ) = λ^3 + p_2 λ^2 + p_1 λ, with p_2 = −trace A < 0 and p_1 ≠ 0. Then, by direct computation, or from [13], Lemma 16,

p_vw(λ) = w^T v λ^2 + (p_2 w^T v + w^T A v) λ + (p_1 w^T v + p_2 w^T A v + w^T A^2 v).

The roots of p_vw are given by

λ_{1,2} = ( −(p_2 w^T v + w^T A v) ± √D ) / (2 w^T v),

where D is the discriminant. Obviously, this depends on the sign of D. The case where the two roots are real has already been solved. So, we may assume D < 0. Then the real part of the two roots λ_{1,2} is given by

Re λ_{1,2} = −(p_2 w^T v + w^T A v) / (2 w^T v).

By assumption w^T v > 0, so the sign of the real part of λ_{1,2} is equal to the sign of −(p_2 w^T v + w^T A v) = −w^T (p_2 I + A) v. Now p_2 = −trace A = −(a_11 + a_22 + a_33) and so

p_2 I + A = [ −(a_22 + a_33)   a_12              a_13
              a_21             −(a_11 + a_33)    a_23
              a_31             a_32              −(a_11 + a_22) ].

As A is an irreducible M-matrix, its off-diagonal entries are nonpositive and its diagonal entries are positive, by [2], Chapter 6, Theorem 4.16, part 4. Hence all entries of p_2 I + A are nonpositive, and its diagonal entries are negative. Since w and v are nonnegative vectors with w^T v ≠ 0, the product w^T (p_2 I + A) v is negative, as for at least one i = 1, 2, 3 we have v_i > 0 and w_i > 0. It follows that the real part of the roots λ_{1,2} of p_vw is positive, and hence λ_{1,2} lie in the open right half plane.

The next example shows that the condition w^T v ≠ 0 cannot be missed when n = 3. In the example, A is a 3 × 3 irreducible singular M-matrix and v and w are nonnegative vectors with w^T v = 0, chosen in such a way that the characteristic polynomial of B(t) is equal to p_{B(t)}(λ) = λ^3 − 3λ^2 + 3λ + t(λ − 7).
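The sign argument for n = 3 can be illustrated numerically on a sample irreducible nonnegative H (again an illustrative choice, not the example from the paper): the matrix p_2 I + A is entrywise nonpositive, the roots of p_vw computed from the formula above have positive real part, and for large t they agree with the two bounded eigenvalues of B(t).

```python
import numpy as np

# Sample 3x3 irreducible nonnegative H and nonnegative v, w with w^T v > 0.
H = np.array([[0.0, 1.0, 0.5],
              [0.3, 0.0, 1.0],
              [1.0, 0.2, 0.0]])
rho = max(abs(np.linalg.eigvals(H)))
A = rho * np.eye(3) - H                     # irreducible singular M-matrix
v = np.array([1.0, 0.0, 2.0])
w = np.array([0.5, 1.0, 1.0])               # w^T v = 2.5 > 0

# p_A(lam) = lam^3 + p2 lam^2 + p1 lam; the constant term vanishes (A singular)
coeffs = np.poly(A)                         # [1, p2, p1, p0]
p2, p1 = coeffs[1], coeffs[2]
assert p2 < 0 and abs(coeffs[3]) < 1e-10

# p2*I + A is entrywise nonpositive, as used in the sign argument
assert np.all(p2 * np.eye(3) + A <= 1e-12)

# both roots of p_vw have positive real part
c2 = w @ v
c1 = p2 * (w @ v) + w @ A @ v
c0 = p1 * (w @ v) + p2 * (w @ A @ v) + w @ A @ A @ v
roots = np.roots([c2, c1, c0])
assert all(np.real(r) > 0 for r in roots)

# for large t the two bounded eigenvalues of B(t) approach these roots
t = 1e7
eigs = np.linalg.eigvals(A + t * np.outer(v, w))
bounded = sorted(eigs, key=abs)[:2]
key = lambda z: (np.real(z), np.imag(z))
assert np.allclose(sorted(bounded, key=key), sorted(roots, key=key), atol=1e-3)
```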
Obviously, as t → ∞ one of the roots goes to 7. By the proof of [13], Theorem 17 (iii), the other two eigenvalues of B(t) go to infinity, with a Puiseux series expansion of the form λ_{1,2}(t) = ±i√t − 2 + O(t^{−1/2}). In particular, their real parts converge to −2, so for large t two of the eigenvalues of B(t) lie in the open left half plane. A similar example can be given with nonnegative vectors v and w for which w^T v = 0, w^T A v = 0 and w^T A^2 v = 1; then, conforming to [13], Theorem 17 (iii), there are three eigenvalues going to infinity as t → ∞. In fact, since the characteristic polynomial of B(t) is then given by (1 − λ)^3 + (t − 1), for t > 1 these eigenvalues are given by λ_j = 1 + (t − 1)^{1/3} e^{2πij/3} for j = 0, 1, 2. Note that again two of them are in the open left half plane.
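The behaviour in this example can be confirmed numerically from the characteristic polynomial λ^3 − 3λ^2 + 3λ + t(λ − 7) alone: one root tends to 7, while the other two behave like ±i√t − 2, so their real parts approach −2.

```python
import numpy as np

# Roots of p_B(t)(lam) = lam^3 - 3 lam^2 + 3 lam + t*(lam - 7) for large t:
# one root tends to 7; the other two behave like +/- i*sqrt(t) - 2, so their
# real parts tend to -2 and they end up in the open left half plane.
for t in [1e5, 1e7, 1e9]:
    roots = np.roots([1.0, -3.0, 3.0 + t, -7.0 * t])
    real_root = min(roots, key=lambda z: abs(np.imag(z)))
    pair = [z for z in roots if abs(np.imag(z)) > 1.0]
    assert abs(real_root - 7.0) < 1e-2
    assert len(pair) == 2
    for z in pair:
        assert np.real(z) < 0                      # open left half plane
        assert abs(np.real(z) + 2.0) < 1e-1        # real part near -2
        assert abs(abs(np.imag(z)) - np.sqrt(t)) < 20.0
```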