I’ve just uploaded to the arXiv the short note “A remark on primality testing and the binary expansion”, submitted to the Journal of the Australian Mathematical Society. In this note I establish the following result: for any sufficiently large integer n, there exists an n-bit prime p such that the numbers $p \pm 2^j$ for $0 \leq j \leq n-1$ are all composite. In particular, if one flips any one of the bits in the binary expansion of the prime p, one obtains a composite number. As a consequence, one obtains the (rather plausible) conclusion that in order to (deterministically) test whether an n-bit integer is prime or not, one needs (in the worst case) to read all n bits of the prime. (This question was posed to me by my colleague here at UCLA, Yiannis Moschovakis.)
Primes p of the form mentioned in the above result are somewhat rare at first. But in fact, the argument in my note shows that the set of such primes actually has positive relative density inside the set of all primes. (Amusingly, this means that one can apply a theorem of Ben Green and myself to conclude that there are arbitrarily long arithmetic progressions of such primes, although I doubt that there is any particular significance or application to this conclusion.)
The same remark applies to other bases; thus, for instance, there exist infinitely many prime numbers with the property that if one changes any one of the base 10 digits of that number, one obtains a composite number.
(Presumably the first such number can be located by computer search, though I did not attempt to do so.) [Update, Feb 25: see comments.]
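Such a computer search is easy to sketch. The following toy Python implementation is my own (the function names and the brute-force approach are not taken from the note); it checks the slightly weaker "flip any single digit" property and finds the first example in base 2, namely 127 (whose single-bit changes 126, 125, 123, 119, 111, 95, 63 are all composite). The analogous base-10 search takes longer; its outcome is reported in the comments mentioned above.

```python
def is_prime(n):
    """Trial-division primality test (adequate for a small search)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

def is_digit_delicate(p, base=2):
    """True if p is prime but changing any single digit of its base-b
    expansion (including the leading digit) yields a nonprime."""
    if not is_prime(p):
        return False
    digits, q = [], p
    while q:
        digits.append(q % base)
        q //= base
    for i, d in enumerate(digits):
        for d2 in range(base):
            if d2 != d and is_prime(p + (d2 - d) * base**i):
                return False
    return True

def first_digit_delicate(base=2):
    """Smallest prime with the digit-change property in the given base."""
    p = 2
    while not is_digit_delicate(p, base):
        p += 1
    return p

print(first_digit_delicate(2))  # -> 127, i.e. 1111111 in binary
```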
Heuristically, a random n-bit integer has roughly a 1/n chance (up to multiplicative constants such as $\log 2$) of being prime. So amongst the 2n numbers $p \pm 2^j$ for $0 \leq j \leq n-1$ and a fixed prime p, one expects about O(1) of these numbers to be prime on the average. A more refined calculation (using the Hardy-Littlewood prime tuples conjecture) suggests that the expected number of primes exceeds 1 (in fact it should be asymptotically $4\Pi_2/\log 2 \approx 3.81$, where $\Pi_2 = \prod_{p > 2} (1 - \frac{1}{(p-1)^2}) = 0.6601\ldots$ is the twin prime constant), which means that the main result is unlikely to be established purely by the first moment method. Fortunately, a little bit of elementary sieving using, of all things, the prime factors of Mersenne numbers, allows one to reduce the expected number of primes to significantly less than 1, to the point where standard upper bound sieves (in particular, the Selberg sieve) give satisfactory results.
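One can probe this heuristic numerically. The following Monte Carlo sketch is my own (the choice of Miller-Rabin routine, bit length, and sample size are illustrative assumptions, not part of the note): it samples random 30-bit primes p and averages the number of primes among the 2n values $p \pm 2^j$.

```python
import random

def is_prime(n):
    """Miller-Rabin test; deterministic for n < 3,215,031,751 with bases 2,3,5,7."""
    if n < 2:
        return False
    for q in (2, 3, 5, 7):
        if n % q == 0:
            return n == q
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in (2, 3, 5, 7):
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def flipped_prime_count(p, n):
    """Number of primes among the 2n values p +/- 2^j, 0 <= j <= n-1."""
    return sum(is_prime(p + s * 2**j) for j in range(n) for s in (1, -1))

n, samples = 30, 300
random.seed(0)
total = 0
for _ in range(samples):
    p = random.randrange(2**(n - 1), 2**n) | 1  # random odd n-bit number
    while not is_prime(p):                      # step up to the next prime
        p += 2
    total += flipped_prime_count(p, n)
print(total / samples)  # should lie in the vicinity of 4*Pi_2/log 2 ~ 3.81
```

The empirical average being comfortably above 1 is exactly why a first moment argument alone cannot succeed here.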
It is nice for a change to write an elementary paper of eight pages in length; the only moderately advanced tools I use here are the prime number theorem in arithmetic progressions, and the standard upper bound from the Selberg sieve for any two-dimensional sieving problem (in which one is allowed to eliminate up to two residue classes mod q for every prime q).
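To illustrate the parenthetical (a toy sketch of my own, not the Selberg sieve itself): striking out two residue classes mod q for each prime q in a fixed list thins out the integers by a factor close to $\prod_q (1 - 2/q)$, which is the basic density count that a two-dimensional sieve refines into rigorous upper bounds.

```python
# Toy two-dimensional sieving: remove the two residue classes {1, q-1}
# mod q for each odd prime q below, then compare the survivor count in
# [1, N] with the density heuristic N * prod(1 - 2/q).
N = 10**5
primes = [3, 5, 7, 11, 13]
survivors = [m for m in range(1, N + 1)
             if all(m % q not in (1, q - 1) for q in primes)]
prediction = N
for q in primes:
    prediction *= 1 - 2 / q
print(len(survivors), round(prediction))  # the two counts agree closely
```

Since the moduli are pairwise coprime, the Chinese remainder theorem makes the heuristic essentially exact here; the difficulty in actual sieve problems is controlling the error terms when the list of primes q grows with N.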
[Update, Feb 25: It has been pointed out to me that the base 2 result was implicitly obtained by Cohen and Selfridge, and explicitly pointed out by Sun, using the covering congruence method of Erdős, so I have rewritten the paper to focus primarily on the case of general bases (for which it is not entirely clear that a set of covering congruences exists). The earlier version (which focused only on the base 2 case, and so is slightly simpler) can be found here.]
[Update, Feb 26: It should now also be possible to prove the following generalisation: for any base a and any r, there should exist infinitely many primes which become composite whenever one changes one of the digits, and also appends or deletes up to r digits on either end of the digit expansion. My method comes very close to establishing this result, except for one annoying issue arising from the fact that numbers such as $a^j - a^l$ (with j, l = O(1)) can occasionally have a huge number of prime factors. This seems to be a rare event, though, so presumably one should be able to modify the argument to obtain this result. I’ll leave this for someone else to try, though. One should also similarly be able to replace “change one of the digits” with “delete or insert a digit at an arbitrary location”.]