Yes, indeed there is. They definitely beat us additive combinatorialists to this whole game of making exact algebraic structures out of approximate ones. But I recently discovered that essentially the same idea occurs in a paper of Fournier from 1977 called “Sharpness of Young’s inequality for convolution”.

He is interested in the following question: for which groups G is the Young inequality, which states (say) that $\|f*g\|_r \le \|f\|_p \|g\|_q$ whenever $\frac{1}{p} + \frac{1}{q} = 1 + \frac{1}{r}$, basically sharp? I.e. in which circumstances can you improve it to the statement that $\|f*g\|_r \le c \|f\|_p \|g\|_q$, where c is less than one?

He shows that one can do this provided that G does not have compact open subgroups (note that G does not need to be abelian).
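As a sanity check, here is a small numerical sketch of both phenomena: Young's inequality on the finite group $\mathbb{Z}_n$ (a group which does have compact open subgroups), and the fact that the indicator function of a subgroup turns the inequality into an equality, which is exactly why no constant c < 1 can work in that case. The choice of $\mathbb{Z}_{16}$ and the exponents p = q = 4/3, r = 2 are my own, purely for illustration:

```python
import numpy as np

def cyclic_convolve(f, g):
    # Convolution on Z_n with counting measure: (f*g)(x) = sum_y f(y) g(x - y).
    n = len(f)
    return np.array([sum(f[y] * g[(x - y) % n] for y in range(n)) for x in range(n)])

def lp_norm(f, p):
    return float(np.sum(np.abs(f) ** p) ** (1.0 / p))

n = 16
p = q = 4.0 / 3.0
r = 2.0  # note 1/p + 1/q = 1 + 1/r, as Young's inequality requires

# Young's inequality ||f*g||_r <= ||f||_p ||g||_q, checked on random nonnegative f, g.
rng = np.random.default_rng(0)
f = rng.random(n)
g = rng.random(n)
assert lp_norm(cyclic_convolve(f, g), r) <= lp_norm(f, p) * lp_norm(g, q) + 1e-9

# For the indicator of a subgroup (here H = {0, 4, 8, 12} in Z_16), Young's
# inequality becomes an equality: ||1_H * 1_H||_2 = ||1_H||_{4/3}^2 = |H|^{3/2}.
# This is the obstruction coming from compact open subgroups.
h = np.array([1.0 if x % 4 == 0 else 0.0 for x in range(n)])
lhs = lp_norm(cyclic_convolve(h, h), r)
rhs = lp_norm(h, p) * lp_norm(h, q)
print(lhs, rhs)  # both equal |H|^{3/2} = 8, up to floating point
```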

You can sort of see the connection to what we’re talking about here – if there is some f for which $\|f*f\|_2$ is about as big as $\|f\|_{4/3}^2$, then this means (after expanding out) that the support of f has got to be 99 percent closed under addition. He then goes on to show that the support must be close to a true subgroup.
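To make the “expanding out” step concrete: for f the indicator of a set A in $\mathbb{Z}_n$, the quantity $\|f*f\|_2^2$ is the additive energy of A (the number of quadruples with a+b = c+d), which attains its maximum possible value $|A|^3$ exactly when A is a subgroup. A small sketch of this, where the group $\mathbb{Z}_{400}$ and the particular perturbed set are my own toy choices:

```python
def additive_energy(A, n):
    # ||1_A * 1_A||_2^2 = number of quadruples (a, b, c, d) in A^4 with a + b = c + d (mod n).
    counts = {}
    for a in A:
        for b in A:
            s = (a + b) % n
            counts[s] = counts.get(s, 0) + 1
    return sum(v * v for v in counts.values())

n = 400
H = set(range(0, n, 4))   # a true subgroup of Z_400, |H| = 100
A = (H - {4}) | {1}       # a "99 percent subgroup": one element removed, one garbage element added

assert additive_energy(H, n) == len(H) ** 3   # subgroups attain the maximum energy |H|^3
ratio = additive_energy(A, n) / len(A) ** 3
print(ratio)  # still close to 1: the perturbed set is almost closed under addition
```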

So I sort of think of Fournier as the father of approximate, or “99 percent” theory, unless someone can point me to an earlier paper :-)

I tend to view the Freiman-Ruzsa theory as more of an “approximate group theory” than a purely combinatorial theory, to emphasise the additional algebraic (rather than combinatorial) structure present.

For instance, if H and L are two finite subgroups of an abelian group G, and K is an integer, then it is not hard to show that the following two statements are equivalent: (1) H has a subset of size at least $|H|/K$ which is contained in a coset of L, and (2) H is covered by K translates of L. Indeed, the equivalence is essentially the first homomorphism theorem from undergraduate algebra. Ruzsa’s lemma is thus the robust version of this theorem.
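A toy verification of the exact (subgroup) statement; the ambient group $\mathbb{Z}_{12}$ and the particular subgroups H and L are my own choices:

```python
n = 12
H = set(range(0, n, 2))   # the subgroup {0, 2, 4, 6, 8, 10}
L = set(range(0, n, 4))   # the subgroup {0, 4, 8}

# (2): collect the distinct cosets h + L with h in H; these cover H.
cosets = {frozenset((h + l) % n for l in L) for h in H}
K = len(cosets)

# (1): some coset of L contains at least |H|/K elements of H.
largest = max(len(H & c) for c in cosets)

print(K, largest, len(H) // K)  # 2 3 3: K = 2 translates of L cover H,
                                # and one coset of L holds |H|/K = 3 elements of H
```

Here K comes out to $|H|/|H \cap L| = 6/3 = 2$, which is the first homomorphism theorem count: the cosets of L that meet H are in bijection with the cosets of $H \cap L$ in H.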

I am hoping that eventually we will be able to “robustify” many other results in group theory (or ring theory, etc.) to extend to approximate groups. For instance, as I mentioned in an earlier post, Freiman’s theorem can be viewed as a robust version of the classification of finite abelian groups as the direct product of cyclic groups (or perhaps, a classification of the finitely generated abelian groups as the direct product of cyclic groups and copies of Z). Ultimately we might hope to manipulate “1%-structured objects” (such as sets which are closed under addition 1% of the time) with almost the same degree of versatility as we currently enjoy while manipulating “100%-structured objects” (such as sets which are closed under addition 100% of the time, i.e. groups). An intermediate stage in this program would be to study “99%-structured objects”, e.g. a finite group with 1% of its elements removed and another 1% of garbage elements added. Here we seem to have a very good theory, due to the use of majority-vote and similar tools to “clean up” all the noise.

In abstract combinatorial settings you can expect (if things go your way, and you can maintain the assumptions after deleting A’) an additional factor when moving from (1) to (2): some sort of integrality gap. Getting rid of it is at times impossible, at times open, and at times hard. Here, somehow, the additive structure makes (1) and (2) almost the same, and quite easily so. (In combinatorial geometry you also sometimes see similar phenomena.)
