Hoi Nguyen, Van Vu, and I have just uploaded to the arXiv our paper "Random matrices: tail bounds for gaps between eigenvalues". This is a followup paper to my recent paper with Van in which we showed that random matrices of Wigner type (such as the adjacency matrix of an Erdös-Renyi graph) asymptotically almost surely had simple spectrum. In the current paper, we push the method further to show that the eigenvalues are not only distinct, but are (with high probability) separated from each other by some negative power $n^{-A}$ of $n$. This follows the now standard technique of replacing any appearance of discrete Littlewood-Offord theory (a key ingredient in our previous paper) with its continuous analogue (inverse theorems for small ball probability). For general Wigner-type matrices $M_n$ (in which the matrix entries are not normalised to have mean zero), we can use the inverse Littlewood-Offord theorem of Nguyen and Vu to obtain (under mild conditions on $M_n$) a result of the form

$\displaystyle {\bf P}( \lambda_{i+1}(M_n) - \lambda_i(M_n) \leq n^{-A} ) \ll n^{-B} \ \ \ \ \ (1)$

for any $B$ and $i$, if $A$ is sufficiently large depending on $B$ (in a linear fashion), and $n$ is sufficiently large depending on $B$. The point here is that $B$ can be made arbitrarily large, and also that no continuity or smoothness hypothesis is made on the distribution of the entries. (In the continuous case, one can use the machinery of Wegner estimates to obtain results of this type, as was done in a paper of Erdös, Schlein, and Yau.)
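As an informal numerical illustration of a bound like (1) (this sketch is not from the paper; the matrix size and sample count are arbitrary), one can draw a few symmetric Bernoulli $\pm 1$ matrices, a Wigner-type ensemble with purely discrete entries, and check that the sampled spectra are simple:

```python
import numpy as np

def min_eigenvalue_gap(n, rng):
    """Sample an n x n symmetric random sign matrix (a Wigner-type matrix
    with discrete entries) and return its smallest consecutive eigenvalue gap."""
    signs = rng.choice([-1.0, 1.0], size=(n, n))
    m = np.triu(signs) + np.triu(signs, 1).T  # symmetrise the upper triangle
    eigs = np.linalg.eigvalsh(m)              # eigenvalues in ascending order
    return np.min(np.diff(eigs))

rng = np.random.default_rng(0)
gaps = [min_eigenvalue_gap(200, rng) for _ in range(20)]
# Despite the discrete entry distribution, no sampled matrix has a
# repeated eigenvalue: every observed minimal gap is strictly positive.
print(f"smallest observed gap: {min(gaps):.3e}")
```

Of course a simulation at a single fixed $n$ proves nothing about the tail bound; it merely exhibits the phenomenon (simple spectrum, with a minimal gap that is not too small) that (1) quantifies.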

In the mean zero case, it becomes more efficient to use an inverse Littlewood-Offord theorem of Rudelson and Vershynin to obtain (with the normalisation that the entries of $M_n$ have unit variance, so that the eigenvalues of $\frac{1}{\sqrt{n}} M_n$ are $O(1)$ with high probability), giving the bound

$\displaystyle {\bf P}( \lambda_{i+1}(M_n) - \lambda_i(M_n) \leq \delta / \sqrt{n} ) \ll n^{o(1)} \delta \ \ \ \ \ (2)$

for $\delta \geq n^{-O(1)}$ (one also has good results of this type for smaller values of $\delta$). This is only optimal in the regime $\delta \sim 1$; we expect to establish some eigenvalue repulsion, improving the RHS to $n^{o(1)} \delta^2$ for real matrices and $n^{o(1)} \delta^3$ for complex matrices, but this appears to be a more difficult task (possibly requiring some *quadratic* inverse Littlewood-Offord theory, rather than just *linear* inverse Littlewood-Offord theory). However, we can get some repulsion if one works with larger gaps, getting a result roughly of the form

$\displaystyle {\bf P}( \lambda_{i+k}(M_n) - \lambda_i(M_n) \leq \delta / \sqrt{n} ) \ll n^{o(1)} \delta^{ck^2} \ \ \ \ \ (3)$

for any fixed $k \geq 1$ and some absolute constant $c > 0$ (which we can asymptotically make to be $1/3$ for large $k$, though it ought to be as large as $2/3$), by using a higher-dimensional version of the Rudelson-Vershynin inverse Littlewood-Offord theorem.
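For a rough feel for the normalisation in (2) (an illustrative sketch, not from the paper; the choice of a Gaussian Wigner matrix and of $n$ is arbitrary), one can look at the normalised consecutive gaps $\delta_i = \sqrt{n}(\lambda_{i+1}(M_n) - \lambda_i(M_n))$, which should typically be of size $O(1)$:

```python
import numpy as np

# Mean-zero Wigner matrix with unit-variance off-diagonal entries: the
# eigenvalues of M_n spread over an interval of length O(sqrt(n)), so a
# typical consecutive gap is of size O(1/sqrt(n)).
rng = np.random.default_rng(1)
n = 500
g = rng.standard_normal((n, n))
m = (g + g.T) / np.sqrt(2)           # symmetric; off-diagonal entries have variance 1
eigs = np.linalg.eigvalsh(m)         # ascending order
deltas = np.sqrt(n) * np.diff(eigs)  # normalised gaps; typically O(1)
print(f"median normalised gap: {np.median(deltas):.2f}")
print(f"smallest normalised gap: {deltas.min():.2e}")
```

The median normalised gap is a constant of order one, while the smallest of the $n-1$ gaps is much smaller, which is the regime that bounds such as (2) and (3) address.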

In the case of Erdös-Renyi graphs, the entries do not have mean zero and the Rudelson-Vershynin Littlewood-Offord theorem isn't quite applicable, but by working carefully through the approach based on the Nguyen-Vu theorem we can almost recover (1), except for a loss of $n^{o(1)}$ on the RHS.

As a sample application of the eigenvalue separation results, we can now obtain some information about *eigenvectors*; for instance, we can show that the components of the eigenvectors all have magnitude at least $n^{-A}$ for some fixed $A$ with high probability. (Eigenvectors become much more stable, and able to be studied in isolation, once their associated eigenvalue is well separated from the other eigenvalues; see this previous blog post for more discussion.)
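To illustrate the eigenvector statement numerically (again only a sketch, not the paper's argument; the graph size and edge density are arbitrary), one can check that no eigenvector of a sampled Erdös-Renyi adjacency matrix has an extremely small component:

```python
import numpy as np

# Adjacency matrix of an Erdos-Renyi graph G(n, 1/2).
rng = np.random.default_rng(2)
n = 300
upper = rng.random((n, n)) < 0.5
adj = np.triu(upper, 1)
adj = (adj + adj.T).astype(float)   # symmetric 0/1 matrix, zero diagonal
_, vecs = np.linalg.eigh(adj)       # columns are orthonormal eigenvectors
smallest = np.abs(vecs).min()       # smallest |component| over all eigenvectors
print(f"smallest eigenvector component magnitude: {smallest:.2e}")
```

The smallest component is tiny compared with the typical component size $n^{-1/2}$, but it is still far larger than, say, $n^{-10}$, consistent with a polynomial lower bound of the shape $n^{-A}$.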

## 17 comments


3 April, 2015 at 11:29 am

Maurice de Gosson
Is it interesting?

3 April, 2015 at 12:56 pm

Anonymous
Dear Terry,

page 6 section 3.2

you write: Let u be an eigenvalue of A_n(p).

If I am not mistaken, it should be: let u be an eigenvector.

[Thanks, this will be corrected in the next version of the ms. -T.]

4 April, 2015 at 2:22 am

Anonymous
Is it possible to extend the results to some class of random (infinite dimensional) hermitian operators (e.g. to show that the probability of a spectrum gap less than $\epsilon$ is bounded by $\delta$ whenever $\epsilon$ is bounded by a fixed positive power of a sufficiently small $\delta$)?

4 April, 2015 at 6:56 am

Terence Tao
I think one could make some artificial infinite-dimensional random operator model for which the currently known techniques would extend, but for the models that people care the most about (e.g. discrete or continuous random Schrodinger operators), these sorts of methods aren't yet applicable (they rely too much on having all of the entries having non-zero variance, and so don't cope well with the band-limited or otherwise localised nature of operators such as Schrodinger operators). But there is certainly an effort to push a lot of the Wigner theory in the direction of more physically relevant matrices and operators, such as discrete random Schrodinger operators (e.g. there are now some results on band-limited matrices which are sort of an interpolant between the random Schrodinger models and the Wigner models).

6 April, 2015 at 9:54 pm

Anonymous
Is it possible to (effectively) compute an upper bound for the ratio $A/B$ (for all sufficiently large $n$)?

[Yes; see the remark after Theorem 2.6 of the paper. -T.]

4 April, 2015 at 5:54 am

Romulo, O R³. (@romulor3)
On the fourth line of your post do you mean "adjacency matrix" instead of "adjacency graph"?

“(such as the adjacency graph of an Erd\H{o}s-Renyi graph)”

[Corrected, thanks – T.]

4 April, 2015 at 7:36 am

Святослав Шмидт
Professor, could you explain how to do involution for the hypercomplex numbers?

4 April, 2015 at 5:08 pm

Zoltan B.
Dear Terry,

I wonder if you could comment about analogous statements for non-Hermitian matrices? For example, suppose you generate a random graph as in G(n,p) but now throwing in directed edges. Would the results you describe here (simple spectrum, no eigenvector with zero entry) still hold and how difficult would they be to prove?

4 April, 2015 at 8:34 pm

Terence Tao
The techniques in this paper break down (among other things, they rely on the Cauchy interlacing law, which breaks down in non-Hermitian settings), but I have a student looking into this problem who is trying some other methods.


7 April, 2015 at 12:36 pm

Lam
Dear Terry,

You wrote "almost surely have simple spectrum", did you mean "with high probability"?

[Clarified to "asymptotically almost surely" – T.]

30 March, 2016 at 11:17 am

Arik
So, if we look at specific types of matrices then randomly and asymptotically they will likely fall into a small spectrum of distinct eigenvalues, potentially with the sign flipped. If this is symmetric you can look for those eigenvalues and find those types of matrices? Where might that be useful? Or is that not within the scope of study?

30 March, 2016 at 2:26 pm

Arik
Would appreciate an explanation, thanks sir

10 January, 2018 at 8:55 am

Mk
For an adjacency matrix A, is it then possible to give a similar bound on the sum of the gaps? Or in other words the difference between the maximum and minimum eigenvalues of A?

11 January, 2018 at 5:10 am

Terence Tao
There is now a quite precise understanding of the distribution of the extreme eigenvalues of an Erdos-Renyi adjacency matrix, see e.g. https://arxiv.org/abs/1103.3869
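A quick simulation shows this picture (illustrative only; the values of $n$ and $p$ here are arbitrary): the top eigenvalue of a $G(n,p)$ adjacency matrix sits near $np$, while the rest of the spectrum is confined to a window of width about $2\sqrt{np(1-p)}$ around the origin, so the difference between the maximum and minimum eigenvalues is dominated by the Perron eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 1000, 0.3
upper = rng.random((n, n)) < p
adj = np.triu(upper, 1)
adj = (adj + adj.T).astype(float)   # adjacency matrix of G(n, p)
eigs = np.linalg.eigvalsh(adj)      # ascending order
bulk_edge = 2 * np.sqrt(n * p * (1 - p))
print(f"top eigenvalue: {eigs[-1]:.1f} (compare np = {n * p:.1f})")
print(f"second eigenvalue: {eigs[-2]:.1f}, smallest: {eigs[0]:.1f} "
      f"(compare bulk edge {bulk_edge:.1f})")
```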

10 January, 2018 at 8:58 am

Mk
Also, if the entries of a random matrix are iid and Poisson distributed, then can we say something about how the corresponding eigenvalues are distributed?

11 January, 2018 at 5:14 am

Terence Tao
For iid Poisson matrices this is covered by the circular law (for the bulk distribution), see https://arxiv.org/abs/1109.3343 . Local spacing universality is still open for this (the best result so far, due to Van and myself, requires four matching moments with a Gaussian). For symmetric Poisson matrices the semicircular law is classical (due to Pastur?) and the universality of spacing is also known (see e.g. https://arxiv.org/abs/1407.5606 ).
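As a quick illustration of the circular law in this setting (a sketch only; the Poisson parameter and $n$ are arbitrary), one can centre and rescale an iid Poisson matrix and observe that its eigenvalues roughly fill the unit disk:

```python
import numpy as np

rng = np.random.default_rng(3)
n, lam = 400, 4.0
# Standardise the iid Poisson entries to mean 0 and variance 1.
x = (rng.poisson(lam, size=(n, n)) - lam) / np.sqrt(lam)
eigs = np.linalg.eigvals(x / np.sqrt(n))   # complex eigenvalues of the scaled matrix
radii = np.abs(eigs)
print(f"largest |eigenvalue|: {radii.max():.3f}")
print(f"fraction with |eigenvalue| < 1.05: {np.mean(radii < 1.05):.3f}")
```

Without the centring there would be one large outlier eigenvalue coming from the nonzero mean of the entries.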