To express a random variable as the difference of two nonnegative random variables, I think no finite moment assumption is needed; it is only needed when we express the expectation as the difference of two expectations.
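
For concreteness, here is the standard decomposition this presumably refers to (a sketch, with $X^+$ and $X^-$ the usual positive and negative parts):

$$X = X^+ - X^-, \qquad X^+ = \max(X, 0), \quad X^- = \max(-X, 0).$$

This pointwise identity needs no moment assumption. Writing

$$E(X) = E(X^+) - E(X^-),$$

on the other hand, only makes sense when at least one of $E(X^+)$ and $E(X^-)$ is finite, so that the difference is not $\infty - \infty$.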

Thanks.

I have a question: if you have an infinite number of random variables, can you still use the strong law of large numbers? Please explain this in more detail for me.

best regards

E x1 + … + xn = E(x1) + … + E(xn)

should have parentheses on the left side of the equation, thus:

E(x1 + … + xn) = E(x1) + … + E(xn)
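
As a quick sanity check of this identity, here is a minimal Monte Carlo sketch (the three variables are hypothetical choices, deliberately dependent, since linearity of expectation does not require independence):

```python
import random

# Monte Carlo check of linearity of expectation:
# E(x1 + ... + xn) = E(x1) + ... + E(xn), even for dependent variables.
random.seed(0)
N = 100_000
sum_of_total = 0.0
sums = [0.0, 0.0, 0.0]
for _ in range(N):
    u = random.random()
    xs = [u, u * u, random.gauss(0.0, 1.0)]  # deliberately dependent variables
    sum_of_total += sum(xs)
    for i, x in enumerate(xs):
        sums[i] += x
# The two estimates should agree up to Monte Carlo error:
print(sum_of_total / N)          # estimate of E(x1 + x2 + x3)
print(sum(s / N for s in sums))  # estimate of E(x1) + E(x2) + E(x3)
```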

If we assume finite second moments then only “multiplicativity” is needed. By multiplicativity, I mean $E(X_i X_j) = E(X_i)E(X_j)$ for $i \neq j$. This translates to orthogonality if the expectations are $0$.
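
To spell out the reduction (a routine computation, assuming the finite second moments above): for $i \neq j$,

$$E\big((X_i - E X_i)(X_j - E X_j)\big) = E(X_i X_j) - E(X_i)E(X_j) = 0,$$

so the centered variables are orthogonal in $L^2$, and consequently

$$\mathrm{Var}(X_1 + \dots + X_n) = \mathrm{Var}(X_1) + \dots + \mathrm{Var}(X_n),$$

which is exactly the variance bound the second-moment argument needs.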

An application would be Weyl’s result: if $(n_k)$ is a sequence of distinct integers, then for almost all $x$, the sequence $(n_k x)$ is uniformly distributed mod $1$. For this, we take $X_k = e^{2\pi i m n_k x}$ for each fixed nonzero integer $m$.
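
Here is a quick numerical illustration of the statement (a Python sketch; the sequence $n_k = k^2$ and the random draw of $x$ are hypothetical choices, not from the original argument):

```python
import random

# For a "typical" x, the fractional parts of n_k * x with n_k = k^2
# should fill [0, 1) uniformly; we histogram them into ten bins.
random.seed(1)
x = random.random()
K = 200_000
bins = [0] * 10
for k in range(1, K + 1):
    frac = (k * k * x) % 1.0
    bins[int(frac * 10)] += 1
print([round(b / K, 3) for b in bins])  # each entry should be close to 0.1
```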

With this multiplicativity assumption, one can also get results for random variables with varying expectations: consider nonnegative random variables with varying expectations, but divide by the expectation of the sum of the first $n$ random variables instead of by $n$ to get the right normalization. For example, the expectations can go to $0$ arbitrarily slowly, as long as $E(X_1) + \dots + E(X_n) \to \infty$. But the expectations can also go to $\infty$! They cannot increase arbitrarily fast, though: there seems to be a limit on the growth rate for which a strong law holds, while for norm (weak) convergence one can get arbitrarily close to that limit.
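
And a small simulation sketch of the varying-expectation normalization (the exponential distribution and the mean sequence $m_k = \sqrt{k}$ are my own hypothetical choices):

```python
import random

# Nonnegative independent X_k with growing means m_k = sqrt(k).
# Normalize the partial sums by E(X_1) + ... + E(X_n) instead of by n;
# the ratio should approach 1.
random.seed(2)
n = 200_000
s = 0.0   # running sum X_1 + ... + X_k
es = 0.0  # running sum of the expectations m_1 + ... + m_k
for k in range(1, n + 1):
    m = k ** 0.5
    s += random.expovariate(1.0 / m)  # exponential variable with mean m
    es += m
    if k % 50_000 == 0:
        print(k, s / es)  # should drift toward 1
```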

I think these are good exercises for using and testing the limits of this "lacunary subsequence" trick.
