The Chowla conjecture asserts, among other things, that one has the asymptotic $\displaystyle \frac{1}{X} \sum_{n \leq X} \lambda(n+h_1) \dots \lambda(n+h_k) = o(1)$

as ${X \rightarrow \infty}$ for any distinct integers ${h_1,\dots,h_k}$, where ${\lambda}$ is the Liouville function. (The usual formulation of the conjecture also allows one to consider more general linear forms ${a_i n + b_i}$ than the shifts ${n+h_i}$, but for sake of discussion let us focus on the shift case.) This conjecture remains open for ${k \geq 2}$, though there are now some partial results when one averages either in ${x}$ or in the ${h_1,\dots,h_k}$, as discussed in this recent post.
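One can at least probe the ${k=2}$ case numerically. The following sketch (an illustration only, of course not evidence for the conjecture) computes the Liouville function by a least-prime-factor sieve and evaluates the normalised correlation ${\frac{1}{X} \sum_{n \leq X} \lambda(n) \lambda(n+1)}$; the cutoff ${X = 10^5}$ is an arbitrary choice.

```python
# Numerical sanity check of the k=2 Chowla sum (1/X) * sum_{n<=X} lambda(n)*lambda(n+1).
# The Liouville function lambda(n) = (-1)^Omega(n) is computed with a
# least-prime-factor sieve; X = 10**5 is an arbitrary illustrative cutoff.

def liouville_up_to(limit):
    lam = [1] * (limit + 1)          # lam[n] will hold lambda(n); lambda(1) = 1
    lpf = [0] * (limit + 1)          # least prime factor of n
    for p in range(2, limit + 1):
        if lpf[p] == 0:              # p is prime
            for m in range(p, limit + 1, p):
                if lpf[m] == 0:
                    lpf[m] = p
    for n in range(2, limit + 1):
        lam[n] = -lam[n // lpf[n]]   # removing one prime factor flips the sign
    return lam

X = 10**5
lam = liouville_up_to(X + 1)
avg = sum(lam[n] * lam[n + 1] for n in range(1, X + 1)) / X
print(avg)   # small in absolute value, consistent with the conjectured o(1)
```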

A natural generalisation of the Chowla conjecture is the Elliott conjecture. Its original formulation was basically as follows: one had $\displaystyle \frac{1}{X} \sum_{n \leq X} g_1(n+h_1) \dots g_k(n+h_k) = o(1) \ \ \ \ \ (1)$

whenever ${g_1,\dots,g_k}$ were bounded completely multiplicative functions and ${h_1,\dots,h_k}$ were distinct integers, and one of the ${g_i}$ was “non-pretentious” in the sense that $\displaystyle \sum_p \frac{1 - \hbox{Re}( g_i(p) \overline{\chi(p)} p^{-it})}{p} = +\infty \ \ \ \ \ (2)$

for all Dirichlet characters ${\chi}$ and real numbers ${t}$. It is easy to see that some condition like (2) is necessary; for instance, if ${g(n) := \chi(n) n^{it}}$ and ${\chi}$ has period ${q}$, then ${g(n+q) \overline{g(n)} = |\chi(n)|^2 (1 + q/n)^{it}}$, so that ${\frac{1}{X} \sum_{n \leq X} g(n+q) \overline{g(n)}}$ converges to ${\frac{\varphi(q)}{q} > 0}$ as ${X \rightarrow \infty}$ and is in particular bounded away from zero.
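This necessity computation can be checked numerically. The sketch below makes the illustrative choices (not taken from the text) ${\chi}$ the nonprincipal real character mod ${q = 3}$ and ${t = 1}$, and confirms that the normalised sum stays near ${\varphi(3)/3 = 2/3}$.

```python
import cmath, math

# Numerical check that (1/X) * sum_{n<=X} g(n+q) * conj(g(n)) is bounded away
# from zero for g(n) = chi(n) * n^{it}.  Illustrative parameter choices:
# chi is the nonprincipal real character mod q = 3, t = 1, X = 10**5.

def chi(n):                      # nonprincipal character mod 3
    return (0, 1, -1)[n % 3]

q, t, X = 3, 1.0, 10**5
total = 0.0 + 0.0j
for n in range(1, X + 1):
    # chi(n+q) = chi(n) since chi has period q, so each term equals
    # |chi(n)|^2 * (1 + q/n)^{it}, which tends to |chi(n)|^2.
    g_shift = chi(n + q) * cmath.exp(1j * t * math.log(n + q))
    g_conj = chi(n) * cmath.exp(-1j * t * math.log(n))   # chi is real-valued
    total += g_shift * g_conj
avg = total / X
print(abs(avg))   # close to phi(3)/3 = 2/3, hence bounded away from zero
```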

In a previous paper with Matomaki and Radziwill, we provided a counterexample to the original formulation of the Elliott conjecture, and proposed that (2) be replaced with the stronger condition $\displaystyle \inf_{|t| \leq X} \sum_{p \leq X} \frac{1 - \hbox{Re}( g_i(p) \overline{\chi(p)} p^{-it})}{p} \rightarrow +\infty \ \ \ \ \ (3)$

as ${X \rightarrow \infty}$ for any Dirichlet character ${\chi}$. To support this conjecture, we proved an averaged and non-asymptotic version of this conjecture which roughly speaking showed a bound of the form $\displaystyle \frac{1}{H^k} \sum_{h_1,\dots,h_k \leq H} |\frac{1}{X} \sum_{n \leq X} g_1(n+h_1) \dots g_k(n+h_k)| \leq \varepsilon$

whenever ${H}$ was an arbitrarily slowly growing function of ${X}$, ${X}$ was sufficiently large (depending on ${\varepsilon,k}$ and the rate at which ${H}$ grows), and one of the ${g_i}$ obeyed the condition $\displaystyle \inf_{|t| \leq AX} \sum_{p \leq X} \frac{1 - \hbox{Re}( g_i(p) \overline{\chi(p)} p^{-it})}{p} \geq A \ \ \ \ \ (4)$

for some ${A}$ that was sufficiently large depending on ${k,\varepsilon}$, and all Dirichlet characters ${\chi}$ of period at most ${A}$. As further support of this conjecture, I recently established the bound $\displaystyle \frac{1}{\log \omega} |\sum_{X/\omega \leq n \leq X} \frac{g_1(n+h_1) g_2(n+h_2)}{n}| \leq \varepsilon$

under the same hypotheses, where ${\omega}$ is an arbitrarily slowly growing function of ${X}$.

In view of these results, it is tempting to conjecture that the condition (4) for one of the ${g_i}$ should be sufficient to obtain the bound $\displaystyle |\frac{1}{X} \sum_{n \leq X} g_1(n+h_1) \dots g_k(n+h_k)| \leq \varepsilon$

when ${A}$ is large enough depending on ${k,\varepsilon}$. This may well be the case for ${k=2}$. However, the purpose of this blog post is to record a simple counterexample for ${k>2}$. Let’s take ${k=3}$ for simplicity. Let ${t_0}$ be a quantity much larger than ${X}$ but much smaller than ${X^2}$ (e.g. ${t_0 = X^{3/2}}$), and set $\displaystyle g_1(n) := n^{it_0}; \quad g_2(n) := n^{-2it_0}; \quad g_3(n) := n^{it_0}.$

For ${X/2 \leq n \leq X}$, Taylor expansion gives $\displaystyle (n+1)^{it_0} = n^{it_0} \exp( i t_0 / n ) + o(1)$

and $\displaystyle (n+2)^{it_0} = n^{it_0} \exp( 2 i t_0 / n ) + o(1)$

and hence (the phases ${-2it_0/n}$ from ${g_2}$ and ${+2it_0/n}$ from ${g_3}$ cancelling) $\displaystyle g_1(n) g_2(n+1) g_3(n+2) = 1 + o(1)$

and therefore $\displaystyle |\frac{1}{X} \sum_{X/2 \leq n \leq X} g_1(n) g_2(n+1) g_3(n+2)| \gg 1.$
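The counterexample is easy to verify numerically. The sketch below takes the illustrative values ${X = 10^5}$ and ${t_0 = X^{3/2}}$, and evaluates the triple product via the identity ${n^{it_0} (n+1)^{-2it_0} (n+2)^{it_0} = \exp(i t_0 \log \frac{n(n+2)}{(n+1)^2})}$, using `log1p` to compute the tiny logarithm stably.

```python
import cmath, math

# Numerical check of the counterexample: with t0 = X^{3/2}, the triple product
# n^{it0} * (n+1)^{-2it0} * (n+2)^{it0} = exp(i*t0*log(n(n+2)/(n+1)^2)) stays
# near 1 on [X/2, X], so the normalised sum is bounded away from zero.
# X = 10**5 is an arbitrary illustrative cutoff.

X = 10**5
t0 = X ** 1.5
total = 0.0 + 0.0j
for n in range(X // 2, X + 1):
    # n(n+2)/(n+1)^2 = 1 - 1/(n+1)^2; log1p keeps the tiny phase accurate.
    phase = t0 * math.log1p(-1.0 / (n + 1) ** 2)
    total += cmath.exp(1j * phase)
S = total / X
print(abs(S))   # about 1/2, i.e. bounded away from zero
```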

On the other hand one can easily verify that all of the ${g_1,g_2,g_3}$ obey (4) (the restriction ${|t| \leq AX}$ there prevents ${t}$ from getting anywhere close to ${t_0}$). So it seems the correct non-asymptotic version of the Elliott conjecture is the following:

Conjecture 1 (Non-asymptotic Elliott conjecture) Let ${k}$ be a natural number, and let ${h_1,\dots,h_k}$ be integers. Let ${\varepsilon > 0}$, let ${A}$ be sufficiently large depending on ${k,\varepsilon,h_1,\dots,h_k}$, and let ${X}$ be sufficiently large depending on ${k,\varepsilon,h_1,\dots,h_k,A}$. Let ${g_1,\dots,g_k}$ be bounded multiplicative functions such that for some ${1 \leq i \leq k}$, one has $\displaystyle \inf_{|t| \leq AX^{k-1}} \sum_{p \leq X} \frac{1 - \hbox{Re}( g_i(p) \overline{\chi(p)} p^{-it})}{p} \geq A$

for all Dirichlet characters ${\chi}$ of conductor at most ${A}$. Then $\displaystyle |\frac{1}{X} \sum_{n \leq X} g_1(n+h_1) \dots g_k(n+h_k)| \leq \varepsilon.$

The ${k=1}$ case of this conjecture follows from the work of Halász; in my recent paper a logarithmically averaged version of the ${k=2}$ case of this conjecture is established. The requirement to take ${t}$ to be as large as ${A X^{k-1}}$ does not emerge in the averaged Elliott conjecture in my previous paper with Matomaki and Radziwill; it thus seems that this averaging has concealed some of the subtler features of the Elliott conjecture. (However, this subtlety does not seem to affect the asymptotic version of the conjecture formulated in that paper, in which the hypothesis is of the form (3), and the conclusion is of the form (1).)

A similar subtlety arises when trying to control the maximal integral $\displaystyle \frac{1}{X} \int_X^{2X} \sup_\alpha \frac{1}{H} |\sum_{x \leq n \leq x+H} g(n) e(\alpha n)|\ dx. \ \ \ \ \ (5)$

In my previous paper with Matomaki and Radziwill, we showed that the easier expression $\displaystyle \frac{1}{X} \sup_\alpha \int_X^{2X} \frac{1}{H} |\sum_{x \leq n \leq x+H} g(n) e(\alpha n)|\ dx \ \ \ \ \ (6)$

was small (for ${H}$ a slowly growing function of ${X}$) if ${g}$ was bounded and completely multiplicative, and one had a condition of the form $\displaystyle \inf_{|t| \leq AX} \sum_{p \leq X} \frac{1 - \hbox{Re}( g(p) \overline{\chi(p)} p^{-it})}{p} \geq A \ \ \ \ \ (7)$

for some large ${A}$. However, to obtain an analogous bound for (5) it now appears that one needs to strengthen the above condition to $\displaystyle \inf_{|t| \leq AX^2} \sum_{p \leq X} \frac{1 - \hbox{Re}( g(p) \overline{\chi(p)} p^{-it})}{p} \geq A$

in order to address the counterexample in which ${g(n) = n^{it_0}}$ for some ${t_0}$ between ${X}$ and ${X^2}$. This seems to suggest that bounding (5) (which is closely related to the ${k=3}$ case of the Chowla conjecture) could in fact be rather difficult; the estimation of (6) relied primarily on prior work of Matomaki and Radziwill which used the hypothesis (7), but as this hypothesis is not sufficient to control (5), some additional input must also be used.
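The obstruction can be seen concretely: for ${g(n) = n^{it_0}}$ one has ${t_0 \log n \approx t_0 \log x + t_0(n-x)/x}$ on ${[x, x+H]}$, so the choice ${\alpha = -t_0/(2\pi x)}$ cancels the linear part of the phase and makes ${g(n) e(\alpha n)}$ nearly constant there. The following sketch, with the illustrative values ${X = 10^6}$, ${t_0 = X^{3/2}}$, ${H = 10}$ and ${x = X}$, confirms that the inner sum in (5) is then not small.

```python
import cmath, math

# For g(n) = n^{it0} with t0 = X^{3/2}, the choice alpha = -t0/(2*pi*x)
# cancels the linear part of the phase t0*log(n) on [x, x+H], so the
# sup over alpha inside (5) cannot be small.  X, H, x are illustrative.

X = 10**6
t0 = X ** 1.5
H = 10
x = X
alpha = -t0 / (2 * math.pi * x)
total = 0.0 + 0.0j
for n in range(x, x + H):
    # phase of g(n) * e(alpha*n): t0*log(n) + 2*pi*alpha*n = t0*(log(n) - n/x)
    total += cmath.exp(1j * t0 * (math.log(n) - n / x))
print(abs(total) / H)   # close to 1: the normalised short sum is not small
```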