We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. (Donald Knuth, “Literate programming”, paraphrasing Tony Hoare)
After all my other advice on how to write papers, I should add one counterbalancing note: there is a danger in being too perfectionist, and in trying to make every part of a paper as “optimal” as possible. After all the “easy” improvements have been made to a paper, one encounters a law of diminishing returns, in which any further improvements either require large amounts of time and effort, or else require some tradeoffs in other qualities of the paper.
For instance, suppose one has a serviceable lemma that suffices for the task of proving the main theorems of the paper at hand. One can then try to “optimise” this lemma by making the hypotheses weaker and the conclusion stronger, but this can come at the cost of lengthening the proof of the lemma, and obscuring exactly how the lemma fits in with the rest of the paper. In the reverse direction, one could also “optimise” the same lemma by replacing it with a weaker (but easier to prove) statement which still barely suffices to prove the main theorem, but is now unsuitable for use in any later application. Thus one encounters a tradeoff when one tries to improve the lemma in one direction or another. (In this case, one resolution to this tradeoff is to have one formulation of the lemma stated and proved, and then add a remark about the other formulation, i.e. state the strong version and remark that we only use a special case, or state the weak version and remark that stronger versions are possible.)
Carefully optimising results and notations in the hope that this will help future researchers in the field is a little risky; later authors may introduce new insights or new tools which render these painstakingly optimised results obsolete. The only time when this is really profitable is when you already know of a subsequent paper (perhaps a sequel to the one you are already writing) which will indeed rely heavily on these results and notations, or when the current paper is clearly going to be the definitive paper in the subject for a long while.
If you haven’t already written a rapid prototype for your paper, then optimising a lemma may in fact be a complete waste of time, because you may find later on in the writing process that the lemma will need to be modified anyway to deal with an unforeseen glitch in the original argument, or to improve the overall organisation of the paper.
I have sometimes seen authors try to optimise the length of the paper at the expense of all other attributes, in the mistaken belief that brevity is equivalent to simplicity. While shorter papers can be simpler than longer ones, this is generally only true if the shortness of the paper was achieved naturally rather than artificially. If brevity was attained by removing all examples, remarks, whitespace, motivation, and discussion, or by striking out “redundant” English phrases and relying purely on mathematical abbreviations (e.g. “∀” instead of “for all”, etc.) and various ungrammatical contractions, then this is generally a poor tradeoff; somewhat ironically, a paper which has been overcompressed may be viewed by readers as being more difficult to read than a longer, gentler, and more leisurely treatment of the same material. (See also “Give appropriate amounts of detail.”)
On the other hand, optimising the readability of the paper is always a good thing (except when it is at the expense of rigour or accuracy), and the effort put into doing so is appreciated by readers.
7 comments
5 October, 2007 at 1:14 am
Radu Grigore
I think Knuth never wrote something called “Code Complete”.
5 October, 2007 at 7:15 am
Terence Tao
Thanks for pointing that out! I have corrected the reference.
7 November, 2008 at 2:23 am
Anonymous
I feel that I am reading an article written by Bertrand Russell. Very well written!
7 November, 2008 at 7:09 am
Jonathan Vos Post
“After all the ‘easy’ improvements have been made to a paper, one encounters a law of diminishing returns…”
I model this as a logistic curve phenomenon. No writing is ever done; merely asymptotic to done.
The late Dean of American Science Fiction, Robert A. Heinlein, gave a well-known list of 5 Rules for professional authors.
The last rule is, and I paraphrase: after the piece is finished and submitted to the target market(s), and is rejected, don’t assume that you’ve written badly and start another round of rewrites. If an editor is willing to pay for the piece on condition that it is rewritten, that is a necessary and sufficient reason to rewrite. Otherwise, keep resubmitting it to other markets, and put your energy into writing new pieces.
Never underestimate the value of a good editor, even when the editor seems to say bad things.
Of course, Heinlein was mostly talking about commercial fiction and nonfiction sales, and not the strange world of academic publishing in peer reviewed journals.
13 June, 2017 at 1:00 am
Anonymous
Quite so.
24 February, 2011 at 6:02 pm
Advice on writing paper « Success doesn't come overnight
[…] paper, and in particular in selecting good notation and giving appropriate amounts of detail. But one should not over-optimise the […]
15 February, 2017 at 5:49 pm
More links from math professors – Lucy's World of Technical Communication
[…] One about not letting the perfect be the enemy of the good […]