Thanks for the quick reply! I see. Theorem 2 is what I want. But I am a little bit confused by $c_1$ in that theorem. Is $c_1$ a universal constant? If not, what does $c_1$ depend on?

This is done in the original paper of Erdős linked to above. (Dilworth’s theorem is not mentioned by name in that paper, but it is essentially proven there.)

…this statement? I am not familiar with the Dilworth theorem so I don’t quite see how it can be used to calculate the sum of the largest binomial coefficients. Thanks in advance!

Thanks!

Well, certainly *some* discorrelation between the signs is required; if all the signs are constrained to be equal, for instance, then there is an enormous amount of concentration regardless of the presence or absence of arithmetic structure. But the iid hypothesis is probably way too strong, and one should have analogous results in the presence of weak correlation, although most of the methods used so far are not able to handle such correlation. (There is beginning to be a little work in this direction for the Littlewood-Offord problems arising from random regular graphs.)

On the other hand, the current techniques to prove Littlewood-Offord theorems should extend without much difficulty to the case of independent but not identically distributed variables (and there are probably results of this form in the literature already); identical distribution is largely a technical convenience rather than an essential hypothesis (as long as all the distributions of individual variables obey their hypotheses with a suitable level of uniformity).

The random walks here are not normalised by $n$ (or by $\sqrt{n}$), so the law of large numbers (or the central limit theorem) is not directly relevant, although the central limit theorem is certainly consistent with the concentration probability being of order $1/\sqrt{n}$.
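(An illustrative numerical check, not from the comment above: for the unnormalised simple walk, the largest point probability is the central binomial coefficient over $2^n$, and Stirling’s formula puts this at $\sqrt{2/(\pi n)}\,(1+o(1))$, matching the $1/\sqrt{n}$ order of the concentration probability.)

```python
from math import comb, pi, sqrt

# For the simple walk S_n = eps_1 + ... + eps_n with iid signs eps_i = ±1,
# the most likely value of S_n (namely 0, for even n) has probability
#   max_k P(S_n = k) = C(n, n//2) / 2^n,
# and Stirling's formula gives C(n, n//2) / 2^n ~ sqrt(2 / (pi * n)).
for n in (10, 100, 1000, 10000):
    exact = comb(n, n // 2) / 2 ** n
    asymptotic = sqrt(2 / (pi * n))
    print(f"n={n:6d}  exact={exact:.6f}  sqrt(2/(pi n))={asymptotic:.6f}  "
          f"ratio={exact / asymptotic:.4f}")
```

The ratio tends to 1 as $n$ grows, consistent with the $1/\sqrt{n}$ order mentioned above.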

The Littlewood-Offord inequality (Theorem 1) is valid for more general random walks (we remark on this in the paper), but for the problem of computing quantities such as $p_d(n,\Delta)$ precisely, one needs to have the Bernoulli distribution on the nose (otherwise the combinatorial tools such as Sperner’s theorem and Dilworth’s theorem will not apply).

…this interesting problem before. But if the random terms in the partial sum are iid Bernoulli {-1,+1} with probabilities p(1)=q(-1)=1/2, then won’t the strong law of large numbers guarantee that the small ball probability converges almost surely to unity as n tends to infinity, at least for some dimensions? Also, how is the problem changed if the iid Bernoulli random variables in the problem have probability p(1)>q(-1)? Generated by a biased coin, for example. I’m thinking of discrete random walks that are either transient or recurrent, with Brownian motion being the limit of a random walk where the step sizes get smaller and smaller. In three dimensions or higher, Brownian motion is transient and should leave a ball, however large, around any point, never to return. Whereas in one and two dimensions, Brownian motion is recurrent and should return to a neighbourhood, however small, of any point, infinitely often. I’m just curious as to whether these points are relevant.