Ex1+…+xn = E(x1)+…+E(xn)

should have the parentheses on the left side of the equation, thus:

E(x1+…+xn) = E(x1)+…+E(xn)

If we assume finite second moments, then only “multiplicativity” is needed. By multiplicativity, I mean $E(X_iX_j) = E(X_i)E(X_j)$ for $i \neq j$. This translates to orthogonality if the expectations are $0$.
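A toy check of this distinction (my own illustration, not from the comment): $X$, $Y$ fair $\pm 1$ signs and $Z = XY$ are pairwise independent, so expectations multiply over any pair, yet the three variables are not mutually independent.

```python
# Hypothetical illustration: X, Y fair +-1 signs, Z = X*Y.
# Any pair is independent (multiplicativity holds pairwise),
# but the triple is not mutually independent.
outcomes = [(x, y, x * y) for x in (-1, 1) for y in (-1, 1)]  # uniform over 4 points

def E(f):
    """Exact expectation over the uniform four-point sample space."""
    return sum(f(*o) for o in outcomes) / len(outcomes)

# Pairwise multiplicativity: E(XZ) = E(X)E(Z) (both sides are 0),
# i.e. orthogonality, since the means are 0.
assert E(lambda x, y, z: x * z) == E(lambda x, y, z: x) * E(lambda x, y, z: z) == 0.0
# But no three-way multiplicativity: E(XYZ) = E(Z^2) = 1, while E(X)E(Y)E(Z) = 0.
assert E(lambda x, y, z: x * y * z) == 1.0
```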

An application would be Weyl’s result: if $(n_k)$ is a sequence of distinct integers, then for almost all $x$, the sequence $(n_k x)$ is uniformly distributed mod $1$. For this, we take $X_k = e^{2\pi i n_k x}$.
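A quick numerical sanity check of this equidistribution (the parameters $n_k = k^2$ and $x = \sqrt{2}$ are my choices, not from the comment): by Weyl’s criterion, the sequence is uniformly distributed mod $1$ iff the exponential averages below tend to $0$ for every nonzero frequency.

```python
import cmath
import math

# Take n_k = k^2 and x = sqrt(2): the fractional parts of n_k * x should
# be uniformly distributed on [0, 1).
x = math.sqrt(2)
N = 100_000

# Weyl's criterion: the averaged exponential sums at nonzero frequencies
# should be small for large N.
for m in (1, 2, 3):
    s = sum(cmath.exp(2j * math.pi * m * (k * k) * x) for k in range(1, N + 1)) / N
    assert abs(s) < 0.05, (m, abs(s))

# The fractional parts should also fill [0, 1) evenly: count hits per decile.
counts = [0] * 10
for k in range(1, N + 1):
    counts[int(((k * k) * x) % 1.0 * 10)] += 1
assert all(abs(c / N - 0.1) < 0.03 for c in counts)
```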

With this multiplicativity assumption, one can easily get results for random variables with varying expectations: consider nonnegative random variables with varying expectations, but then divide by the expectation of the sum of the first $n$ random variables instead of by $n$ to get the right normalization. For example, the expectations can go to $0$ arbitrarily slowly, as long as the sum of the expectations diverges. But the expectations can also go to $\infty$! They cannot increase arbitrarily fast, though: something like for seems to be the limit for a strong law, while for norm (weak) convergence one can get arbitrarily close to , that is, we can have .

I think these are good exercises for using and testing the limits of this "lacunary subsequence" trick.
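For reference, here is a sketch of the trick being referred to, in my own words (the standard lacunary/sandwich argument for nonnegative variables with finite variance):

```latex
% Sketch of the standard argument (my paraphrase), assuming the X_i are
% nonnegative, pairwise independent, identically distributed, with
% finite variance; write S_n = X_1 + \dots + X_n.
By pairwise independence the variances add, so
\[
  \operatorname{Var}\!\left(\tfrac{S_n}{n}\right) = \frac{\operatorname{Var}(X_1)}{n}.
\]
Along the lacunary subsequence $n_k = \lfloor q^k \rfloor$, $q > 1$, Chebyshev gives
\[
  \sum_{k} \mathbb{P}\!\left( \left|\tfrac{S_{n_k}}{n_k} - \mathbb{E}X_1\right| > \varepsilon \right)
  \le \sum_{k} \frac{\operatorname{Var}(X_1)}{\varepsilon^2 n_k} < \infty,
\]
so by Borel--Cantelli, $S_{n_k}/n_k \to \mathbb{E}X_1$ almost surely.
For $n_k \le n \le n_{k+1}$, nonnegativity sandwiches
\[
  \frac{n_k}{n_{k+1}} \cdot \frac{S_{n_k}}{n_k}
  \;\le\; \frac{S_n}{n}
  \;\le\; \frac{n_{k+1}}{n_k} \cdot \frac{S_{n_{k+1}}}{n_{k+1}},
\]
and since $n_{k+1}/n_k \to q$, letting $q \downarrow 1$ gives the strong law
along the full sequence.
```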

(i) Of course, I wanted to say “pairwise independence does not imply stationarity, even if the variables are identically distributed”, rather than just “pairwise independence does not imply stationarity”.

(ii) Tao’s result, with pairwise independence rather than complete independence, can be extended in a standard manner to the case where the variables take values in an arbitrary separable Banach space (say).

*[LaTeX code corrected; the problem was the lack of a space between "latex" and the LaTeX code. Also, the curly braces were unnecessary. -T]*