
Suppose we have a large number of scalar random variables {X_1,\ldots,X_n}, which each have bounded size on average (e.g. their mean and variance could be {O(1)}). What can one then say about their sum {S_n := X_1+\ldots+X_n}? If each individual summand {X_i} varies in an interval of size {O(1)}, then their sum of course varies in an interval of size {O(n)}. However, a remarkable phenomenon, known as concentration of measure, asserts that assuming a sufficient amount of independence between the component variables {X_1,\ldots,X_n}, this sum sharply concentrates in a much narrower range, typically in an interval of size {O(\sqrt{n})}. This phenomenon is quantified by a variety of large deviation inequalities that give upper bounds (often exponential in nature) on the probability that such a combined random variable deviates significantly from its mean. The same phenomenon applies not only to linear expressions such as {S_n = X_1+\ldots+X_n}, but more generally to nonlinear combinations {F(X_1,\ldots,X_n)} of such variables, provided that the nonlinear function {F} is sufficiently regular (in particular, if it is Lipschitz, either separately in each variable, or jointly in all variables).
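
As a quick numerical illustration of this {O(\sqrt{n})} scaling, here is a minimal simulation sketch (the choice of Rademacher signs {X_i = \pm 1}, the sample sizes, and all variable names are illustrative, not part of these notes):

    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 10_000, 100_000

    # S_n = X_1 + ... + X_n with independent signs X_i = +-1 has the
    # same distribution as 2*Binomial(n, 1/2) - n.
    S = 2.0 * rng.binomial(n, 0.5, size=trials) - n

    print("sqrt(n)              :", np.sqrt(n))   # 100.0
    print("empirical std of S_n :", S.std())      # close to 100
    # Although |S_n| could in principle be as large as n = 10000,
    # every trial typically lands within a few multiples of sqrt(n).
    print("max |S_n| observed   :", np.abs(S).max())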

The basic intuition here is that it is difficult for a large number of independent variables {X_1,\ldots,X_n} to “work together” to simultaneously pull a sum {X_1+\ldots+X_n} or a more general combination {F(X_1,\ldots,X_n)} too far away from its mean. Independence here is the key; concentration of measure results typically fail if the {X_i} are too highly correlated with each other.

There are many applications of the concentration of measure phenomenon, but we will focus on a specific application which is useful in the random matrix theory topics we will be studying, namely on controlling the behaviour of random {n}-dimensional vectors with independent components, and in particular on the distance between such random vectors and a given subspace.
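
Here is a small sketch of this last point, under the illustrative assumption that the coordinates are standard gaussians (the dimensions and the names {Q}, proj, dist below are my own choices):

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, trials = 2_000, 500, 2_000

    # A fixed d-dimensional subspace V of R^n, described by an
    # orthonormal basis (the columns of Q).
    Q, _ = np.linalg.qr(rng.standard_normal((n, d)))

    # Random vectors with independent standard gaussian components.
    X = rng.standard_normal((trials, n))

    # dist(X, V) = |X - proj_V(X)|, where proj_V is the orthogonal
    # projection onto V.
    proj = (X @ Q) @ Q.T
    dist = np.linalg.norm(X - proj, axis=1)

    print("sqrt(n - d)         :", np.sqrt(n - d))
    print("mean distance       :", dist.mean())
    print("std of the distance :", dist.std())  # O(1), uniformly in n

For gaussian {X} the distance to a fixed {d}-dimensional subspace is a chi-distributed random variable with {n-d} degrees of freedom, which concentrates within {O(1)} of {\sqrt{n-d}}; the point of the concentration inequalities discussed here is that comparable behaviour persists for much more general distributions on the coordinates.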

Once one has a sufficient amount of independence, the concentration of measure tends to be sub-gaussian in nature; thus the probability that one is at least {\lambda} standard deviations from the mean tends to drop off like {C \exp(-c\lambda^2)} for some {C,c > 0}. In particular, one is {O( \log^{1/2} n )} standard deviations from the mean with high probability, and {O( \log^{1/2+\epsilon} n)} standard deviations from the mean with overwhelming probability. Indeed, concentration of measure is our primary tool for ensuring that various events hold with overwhelming probability (other moment methods can give high probability, but have difficulty ensuring overwhelming probability).
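
To spell out the computation behind these two claims: if one has a tail bound of the form {{\bf P}( |X - {\bf E} X| \geq \lambda \sigma ) \leq C e^{-c\lambda^2}}, where {\sigma} denotes the standard deviation, then taking {\lambda := A \log^{1/2} n} bounds the failure probability by {C n^{-cA^2}}, which is {O(n^{-B})} for any fixed {B > 0} once {A} is chosen large enough (high probability), while taking {\lambda := \log^{1/2+\epsilon} n} bounds it by {C \exp( - c \log^{1+2\epsilon} n )}, which decays faster than any fixed power of {n} (overwhelming probability).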

This is only a brief introduction to the concentration of measure phenomenon. A systematic study of this topic can be found in this book by Ledoux.


In this supplemental set of notes we derive some approximations for {n!}, when {n} is large, and in particular Stirling’s formula. This formula (and related formulae for binomial coefficients {\binom{n}{m}}) will be useful for estimating a number of combinatorial quantities in this course, and also for analysing discrete random walks accurately.
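
To have the formula in front of us: Stirling’s formula asserts that {n! = (1+o(1)) \sqrt{2\pi n} (n/e)^n}. Here is a quick numerical sanity check (a sketch only; the first-order correction to the approximation is the standard {1/(12n)} term):

    import math

    # Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n.
    def stirling(n: int) -> float:
        return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

    for n in (5, 20, 100):
        exact = math.factorial(n)
        # The ratio tends to 1; the relative error is roughly 1/(12n).
        print(f"n = {n:4d}   exact/stirling = {exact / stirling(n):.6f}")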


In preparation for my upcoming course on random matrices, I am briefly reviewing some relevant foundational aspects of probability theory, as well as setting up basic probabilistic notation that we will be using in later posts. This is quite basic material for a graduate course, and somewhat pedantic in nature, but given how heavily we will be relying on probability theory in this course, it seemed appropriate to take some time to go through these issues carefully.

We will certainly not attempt to cover all aspects of probability theory in this review. Aside from the utter foundations, we will be focusing primarily on those probabilistic concepts and operations that are useful for bounding the distribution of random variables, and on ensuring convergence of such variables as one sends a parameter {n} off to infinity.

We will assume familiarity with the foundations of measure theory; see for instance these earlier lecture notes of mine for a quick review of that topic. This is also not intended to be a first introduction to probability theory, but is instead a revisiting of these topics from a graduate-level perspective (and in particular, after one has understood the foundations of measure theory). Indeed, I suspect it will be almost impossible to follow this course without already having a firm grasp of undergraduate probability theory.


Starting in the winter quarter (Monday Jan 4, to be precise), I will be giving a graduate course on random matrices, with lecture notes to be posted on this blog.  The topics I have in mind are somewhat fluid, but my initial plan is to cover a large fraction of the following:

  • Central limit theorem, random walks, concentration of measure
  • The semicircular and Marcenko-Pastur laws for bulk distribution
  • A little bit on the connections with free probability
  • The spectral distribution of GUE and gaussian random matrices; theory of determinantal processes
  • A little bit on the connections with orthogonal polynomials and Riemann-Hilbert problems
  • Singularity probability and the least singular value; connections with the Littlewood-Offord problem
  • The circular law
  • Universality for eigenvalue spacing; Erdos-Schlein-Yau delocalisation of eigenvectors and applications

If time permits, I may also cover

  • The Tracy-Widom law
  • Connections with Dyson Brownian motion and the Ornstein-Uhlenbeck process; the Erdos-Schlein-Yau approach to eigenvalue spacing universality
  • Conjectural connections with zeroes of the Riemann zeta function

Depending on how the course progresses, I may also continue it into the spring quarter (or else have a spring graduate course on a different topic – one potential topic I have in mind is dynamics on nilmanifolds and applications to combinatorics).
