
Consider the sum {S_n := X_1+\ldots+X_n} of iid real random variables {X_1,\ldots,X_n \equiv X} of finite mean {\mu} and variance {\sigma^2} for some {\sigma > 0}. Then the sum {S_n} has mean {n\mu} and variance {n\sigma^2}, and so (by Chebyshev’s inequality) we expect {S_n} to usually have size {n\mu + O(\sqrt{n} \sigma)}. To put it another way, if we consider the normalised sum

\displaystyle  Z_n := \frac{S_n - n \mu}{\sqrt{n} \sigma} \ \ \ \ \ (1)

then {Z_n} has been normalised to have mean zero and variance {1}, and is thus usually of size {O(1)}.
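
Indeed, by linearity of expectation and the additivity of variance for independent random variables, one can verify this normalisation in one line:

\displaystyle  {\bf E} Z_n = \frac{{\bf E} S_n - n\mu}{\sqrt{n} \sigma} = 0, \qquad {\bf Var}(Z_n) = \frac{{\bf Var}(S_n)}{n \sigma^2} = \frac{n \sigma^2}{n \sigma^2} = 1.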

In the previous set of notes, we were able to establish various tail bounds on {Z_n}. For instance, from Chebyshev’s inequality one has

\displaystyle  {\bf P}(|Z_n| > \lambda) \leq \lambda^{-2}, \ \ \ \ \ (2)

and if the original distribution {X} was bounded or subgaussian, we had the much stronger Chernoff bound

\displaystyle  {\bf P}(|Z_n| > \lambda) \leq C \exp( - c \lambda^2 ) \ \ \ \ \ (3)

for some absolute constants {C, c > 0}; in other words, the {Z_n} are uniformly subgaussian.
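
As an illustrative numerical aside, here is a minimal Python sketch of the bound (2) (assuming NumPy is available; taking {X} to be uniform on {[0,1]} is just an arbitrary concrete choice of a bounded distribution):

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative choice: X uniform on [0,1], so mu = 1/2 and sigma^2 = 1/12.
    n, trials = 100, 50000
    mu, sigma = 0.5, np.sqrt(1.0 / 12.0)

    # Each row holds one sample of X_1,...,X_n; each entry of Z is one sample of Z_n.
    X = rng.random((trials, n))
    Z = (X.sum(axis=1) - n * mu) / (np.sqrt(n) * sigma)

    for lam in (1.0, 2.0, 3.0):
        tail = np.mean(np.abs(Z) > lam)
        print(f"lambda = {lam}: empirical tail {tail:.4f} <= Chebyshev bound {lam ** -2:.4f}")

Since the uniform distribution is bounded, the Chernoff regime (3) also applies here, and the empirical tail at {\lambda = 3} should land far below the Chebyshev bound.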

Now we look at the distribution of {Z_n}. The fundamental central limit theorem tells us the asymptotic behaviour of this distribution:

Theorem 1 (Central limit theorem) Let {X_1,\ldots,X_n \equiv X} be iid real random variables of finite mean {\mu} and variance {\sigma^2} for some {\sigma > 0}, and let {Z_n} be the normalised sum (1). Then as {n \rightarrow \infty}, {Z_n} converges in distribution to the standard normal distribution {N(0,1)_{\bf R}}.
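
As a quick empirical illustration of this convergence (again a sketch, assuming NumPy and SciPy are available; the exponential distribution, with {\mu = \sigma = 1}, is picked as a deliberately skewed starting point), one can watch the Kolmogorov-Smirnov distance between {Z_n} and {N(0,1)_{\bf R}} shrink as {n} grows:

    import numpy as np
    from scipy.stats import kstest

    rng = np.random.default_rng(1)

    # Illustrative choice: X exponential with rate 1, so mu = sigma = 1.
    trials = 20000
    for n in (5, 50, 500):
        S = rng.exponential(scale=1.0, size=(trials, n)).sum(axis=1)
        Z = (S - n) / np.sqrt(n)
        d = kstest(Z, "norm").statistic   # sup-distance between the empirical CDF of Z_n and Phi
        print(f"n = {n}: KS distance to N(0,1) ~ {d:.4f}")

Since the limiting CDF {\Phi} is continuous, convergence in distribution is equivalent here to uniform convergence of the CDFs, so these distances should tend to zero (up to sampling error from the finite number of trials).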

Exercise 1 Show that {Z_n} does not converge in probability or in the almost sure sense (in the latter case, we think of {X_1,X_2,\ldots} as an infinite sequence of iid random variables). (Hint: the intuition here is that for two very different values {n_1 \ll n_2} of {n}, the quantities {Z_{n_1}} and {Z_{n_2}} are almost independent of each other, since the bulk of the sum {S_{n_2}} is determined by those {X_i} with {n_1 < i \leq n_2}. Now make this intuition precise.)
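
(As a first step towards making the hint quantitative, note that {S_{n_1}} and {S_{n_2}} share only the first {n_1} summands, so a direct computation gives

\displaystyle  {\bf Cov}(Z_{n_1}, Z_{n_2}) = \frac{{\bf Cov}(S_{n_1}, S_{n_2})}{\sqrt{n_1 n_2}\, \sigma^2} = \frac{n_1 \sigma^2}{\sqrt{n_1 n_2}\, \sigma^2} = \sqrt{n_1/n_2},

which tends to zero when {n_1/n_2 \rightarrow 0}.)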

Exercise 2 Use Stirling’s formula from Notes 0a to verify the central limit theorem in the case when {X} is a Bernoulli distribution, taking the values {0} and {1} only. (This is a variant of Exercise 2 from those notes, or Exercise 2 from Notes 1. It is easy to see that once one does this, one can rescale and handle any other two-valued distribution also.)

Exercise 3 Use Exercise 9 from Notes 1 to verify the central limit theorem in the case when {X} is gaussian.

Note that we are only discussing the case of real iid random variables here. The case of complex random variables (or, more generally, of vector-valued random variables) is a little more complicated, and will be discussed later in this post.

The central limit theorem (and its variants, which we discuss below) is an extremely useful tool in random matrix theory, in particular through the control it gives on random walks (which arise naturally from linear functionals of random matrices). But the central limit theorem can also be viewed as a “commutative” analogue of various spectral results in random matrix theory (in particular, we shall see in later lectures that the Wigner semicircle law can be viewed in some sense as a “noncommutative” or “free” version of the central limit theorem). Because of this, the techniques used to prove the central limit theorem can often be adapted to be useful in random matrix theory, and so we shall use these notes to dwell on several different proofs of the central limit theorem, as this provides a convenient way to showcase some of the basic methods that we will encounter again (in a more sophisticated form) when dealing with random matrices.

