January/February issue of acmqueue


For Want of a Comma, the Meaning Was Lost

What does punctuation have to do with software development?

Jef Raskin, University of Chicago

Odder things have happened in publishing, but not by much. The bestseller list on Amazon (as I write this) is topped by Lynne Truss’s book on... punctuation. It’s outselling Harry Potter. Starting from the title, Eats, Shoots & Leaves (Gotham Books, 2004), which changes meaning drastically when the comma is omitted, the book is an entertaining romp through the advantages of writing correctly and clearly with the proper use of punctuation.

So, why a discussion of a funny book on punctuation, however popular, in ACM Queue? Partly because of the totally wretched state of most software documentation. Clarity and precision are of the utmost importance in our field. As Donald Knuth—one of the finest programmers of our time—pointed out in Literate Programming (Cambridge University Press, 1992), it is advantageous to write our algorithmic intentions in English (or your favorite natural language) before writing in code.

I would be delighted if programmers who work for me and students who study with me knew their “its” from their “it’s” and when to use a dash to join compound adjectives. What is called for is the ability to express unambiguously what you intend. That’s not easy, but studying books on the subject can help. I’ll introduce Ms. Truss’s book (and another) in the form of an open letter to the author.

Dear Ms. Truss,

Your recent book, Eats, Shoots & Leaves: The Zero Tolerance Approach to Punctuation, is admirable and readable. I have always found our population’s poor punctuational prowess puzzling. The popularity of your book will, I hope, ameliorate this problem. Your basic thesis is that good punctuation makes for clear, unambiguous communication—“to prevent awkward misunderstandings between writer and reader.” As important as preventing misunderstandings might be in a mystery novel or magazine article, it is doubly important in technical writing, where “awkward misunderstandings” can transmogrify into expensive mistakes.

It was kind of you to mention a product I created, the Macintosh computer, and I am pleased that you use one. You told of the fact that it was only after you had used the machine for years that you discovered that you could type proper dashes. Had you known that it was designed by a writer (or had it come with a decent, paper, manual), you might have expected to find a full family of dashes from the first. All three varieties of dash (hyphen, en dash, and double or em dash) that you mention in your book are included. I was surprised to see that you omitted one use of the en dash—namely, when it establishes the hierarchy among multiple adjectives when they together modify a noun. But, fortunately, all this information and more is available in another eccentric, authoritative, and occasionally very funny book on writing clearly, and one that is aimed specifically at those of us in the sciences and technology: Lyn Dupré’s superb Bugs in Writing: A Guide to Debugging Your Prose (second edition, Addison-Wesley, 1998).

Oddly, you mention only two uses of the ellipsis, and omit a third, which indicates the continuation of a sequence: Jan, Feb, Mar, ... is an example. You should also, for completeness, tell readers that, if the end of the sequence is shown, a comma is required after the ellipsis: 2, 4, 6, ..., 22. Toward the end of the book you seem to scold computerdom as a place where English is led to the stocks and flagellated until it is raw. But, by way of compensation, today’s computer user has access to a true ellipsis and need not make do with three consecutive full stops (or, as we call them here, periods). To those who know their punctuation, the computer—unlike the obsolescent typewriter—gives them access to a near-professional level of typographical specificity and control. In this way, the computer is an aid rather than a hindrance to proper punctuation.

The downside of this advantage is that some publishers demand that you present “camera-ready copy” according to their specifications. This puts a typesetting burden on the author, which may be unsupportable if the author uses a word processor different from the one the publisher expects.

I very much agree with you that the “American” practice of forcing punctuation inside a close quote is absurd. In my most recent book I insisted on British usage in this regard; it has the very great advantage of being logical. We may have thrown your tea into Boston Harbor, but that is no reason not to accept your punctuation. In my instance, I was allowed to violate the house style rules because I was writing on a subject where precision trumps style: computer-human interfaces.

For example, Microsoft’s Windows software tells you “To stop printing, press Control + .”. I claim that clarity demands both periods that surround the closing quotation mark in the previous sentence.

The point here is that, because of inattention to punctuation, the advice provided by Microsoft is ambiguous. Is one to press the Control key and then the plus key (the case where the reader assumes that the period is there to end the sentence)? I tried that the first time and was disappointed. Or does it mean to press and hold the Control key and then tap a plus sign and then a period? That fails, too. Finally, I guessed that the plus sign means “and while holding it tap”. If it had said, “To stop printing, press and hold Control and tap the period key.”, there would have been little problem.

Perhaps not all programmers are illiterate, just the ones who write instructions.

I would have argued more strongly for “serial” commas, and not taken your laissez-faire attitude toward them. I would say this on the grounds that such commas eliminate potential ambiguity and their omission confers no benefit. Being ignorant of the world of alcoholic beverages, for example, I am not sure if I am being asked for three or four kinds of drinks when directed to “bring us screwdrivers, martinis, and sherry and soda.” You do make this point but fail to insist upon it. Unless the rule is applied consistently, you do not know in any particular case which is meant. My rule is that each item in a list always deserves its own comma; being last should not mean being deprived.

It seems that you occasionally don’t understand the rationale for a computer convention. For example, it is not that people “experiment” with asterisks for *emphasis* instead of italics. It is just that some computer systems will not accurately transmit italics, whereas asterisks flow unimpeded from one machine to another. You say that “new technologies” present information “in a nonlinear way, through an exponential series of lateral associations.” It only looks like that; we linearize our way through the material; we are creatures who live in linear time. It is also not true that when you scroll text your “eyes remain static, while the material flows past.” In fact, you scroll and stop, and only then do you read, and you read in the normal manner.

And when you say that “you can’t write comments in the margin of your screen to be discovered by another reader fifty years down the line,” I imagine that you perhaps have not seen threaded discussions and “Wikis”. Not only can I add marginal notes, but because of the bidirectional nature of the Web, a comment can become incorporated as part of the material that all subsequent readers see, not just the chance reader of my copy. For example, one comment I made on the Stanford Encyclopedia of Philosophy (http://plato.stanford.edu) became incorporated into the work (with due credit). My marginal comments on my paper copy of the OED (Oxford English Dictionary), however, are unlikely to become memorialized so accessibly.

I think it is naive to suggest that the Internet “cannot be used as an instrument of oppression.” There are many countries where it is carefully controlled so as to be just such a tool.

For me, and many others I know, the computer has not been an enemy but a spur to writing well. Perhaps it has made the low state of writing ability more visible, but I do not think that it has made it worse.

One last critique of your book: it has no index. I’d like to take the opportunity to say that a computer can generate one for you.
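The mechanics are simple enough that even a short program can do a serviceable first pass. Here is a minimal sketch of automatic index generation; the page texts and the term list are illustrative assumptions, and a real indexer would of course stem words, handle phrases, and merge page ranges.

```python
from collections import defaultdict

def build_index(pages, terms):
    """Map each term to the sorted list of page numbers on which it appears."""
    index = defaultdict(list)
    for page_number, text in enumerate(pages, start=1):
        lowered = text.lower()
        for term in terms:
            if term.lower() in lowered:
                index[term].append(page_number)
    return dict(index)

# Illustrative pages, not from any real book.
pages = [
    "The comma separates items in a list.",
    "An apostrophe marks possession; the comma does not.",
    "The semicolon joins related clauses.",
]
index = build_index(pages, ["comma", "semicolon", "apostrophe"])
# index["comma"] is [1, 2]: the term appears on pages 1 and 2.
```

A human indexer still decides which terms matter; the machine merely spares her the drudgery of recording where they occur.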

These small problems aside, I hope that many fellow computerists come to obey your wise strictures. It is hard work to make words into good servants who will faithfully carry your intended meaning from your mind to your readers’. Forceful application of well-placed and properly chosen punctuation will always be necessary to keep them doing their duty.



JEF RASKIN, adjunct professor of computer science at the University of Chicago, is best known for his book, The Humane Interface (Addison-Wesley, 2000), and for having created the Macintosh project at Apple. He holds many interface patents, consults for companies around the world, and is often called upon as a speaker at conferences, seminars, and universities. His current project, The Humane Environment (http://humane.sourceforge.net/home/index.html), is attracting interest in both the computer science and business worlds.

© 2004 ACM 1542-7730/04/0700 $5.00


Originally published in Queue vol. 2, no. 5



