

Errors, deceptions, and ambiguity

Stan Kelly-Bootle, Author

Three years ago, to the very tick, my first Curmudgeon column appeared in ACM Queue to the rapturous, one-handed claps of the silent majority. Since then my essays have alternated intermittently with those of other grumpy contributors. With this issue (muffled drumroll), I’m proud to announce a Gore-like climate change in the regime that will redefine the shallow roots of ACJ (agile computer journalism, of which more anon). The astute ACM Queue Management (yes, there is such—you really must read the opening pages of this magazine!) has offered me the chance to go solo. For the next few Queues, at least, I am crowned King Curmudgeon, the Idi Amin of Moaners, nay, Supreme General Secretary of the Complaining Party! “I am Sir Oracle, and when I ope my lips, let no dog bark!”1 Or rather, under the new dispensation, I command you to bark back via [email protected] with your own pet peeves or counter-moans, which I promise to print if printable (subject to as light an editing as the Law dictates).

I also plan to pose posers and ask FUQs (frequently unanswered questions), as was my wont in the Unix Review Devil’s Advocate columns of yore (1984-2000). As then, huge, literally invaluable prizes are offered for your answers and selected responses that meet my unpublished “Rules & Regulations.” Suffice it to say that the customary bribes are encouraged; friends and relations enjoy traditional nepotistic advantages (in the old days my mother inevitably won the white Rolls-Royce convertible); and tedious accuracy scores lower than cunning disinformation. An ongoing challenge goes out to readers who encounter risible misprints and howlers in the computer literature, not excluding my own usually deliberate mishtakes.

Any errors you detect will be judged against the expected authority and inerrancy of the source. Thus, the many marketeering deviations from the untrampled snow-white truth will seldom rate highly unless, say, Gates or Jobs drops a real whopper. I allow new retrospective findings of false prophecies, but not the well-worn ones: at one end we have the quite plausible 1947 prediction by T. J. Watson (three IBM computers will more than meet the world’s needs) and, at the other, the less plausible Bill Gates (“640K ought to be enough for anybody,” 1981), which reflected the sad fact that IBM PC designers spurned the larger, linear-address space of the Motorola MC68000 microprocessor in favor of the Intel 8088. Bill later topped this faux pas: “The Internet? We are not interested in it” (1993). He also made several other ill-timed predictions about OS/2 (optimism unjustified) and Java (pessimism unjustified), but I’m loath to cast bricks: back in 1942 I swore undying love to a certain Nelly Moorcroft in a Liverpool jigger (back alley) while the Nazi bombs were falling...but I digress.

A particular source from which mistakes are sought is the much-cited Wikipedia. Wiki, as in Caesar’s Weni, Widi, Wiki,2 has arrived, looked around, and conquered. It has reached the top 10 in the most-visited site list, a remarkable achievement for a noncommercial project started in 2001.

Wikipedia, and the Web/Internet generally, received glowing praise from UK Education Secretary Alan Johnson as “an incredible source for good in education” for both teachers and pupils. “Wikipedia,” he told a Schoolteachers’ Union conference in April, “enables anybody to access information which was once the preserve only of those who could afford the subscription to Encyclopaedia Britannica and could spend the time necessary to navigate its maze of indexes and content pages.” He’s correct about the cost but rather out of date on the “maze,” since the Britannica is now available online with the usual search and hyperlink features to replace the chore of heavy page turning. Predictably, some teachers groaned at the Wiki endorsement, having suffered from the increasingly blatant plagiarism by students innocently unable to distinguish fact from opinion and deliberate distortion. Cartoons show children boasting A levels in new subjects called “Cut and Paste” and “Drag and Drop.” Well, I suppose they are modern skills to be honed and rewarded. Forget the content, dig the layout!

Debate on Wiki’s accuracy has been growing since the site launched. That has been the fate of all reference works, as Diderot and Lavoisier will confirm. In any “live” and growing corpus (Wiki now has more than 6 million entries), some errors are inevitable. Facts do change, don’t yer know? The problem is how to judge overall reliability from the occasional headline-grabbing “disasters” (usually malevolently planted by the disgruntled), which are uncovered and, claim the pro-Wikimites, promptly corrected.

As with our vast suppositories [sic] of software, however, the notion of “unknown bugs” and “undetected bugs” looms as a Zen demon (if I can mix my creeds). Is there an acceptable level of reliability? If so, can we assess it from sampling? Do a thousand minor Wiki typos count more than, say, an entry on Islam or a map of Africa that may unintentionally provoke violence or threats of violence? When experts disagree, should both sides be represented with balanced word counts? Can the cranks have their day on Wiki? Cranks may be tomorrow’s gurus. The obvious paradox is that in the normal “look-up” situation, we seldom consult Wiki in the domains in which we are reasonably knowledgeable. Maintaining accuracy therefore calls for dedicated specialists with the time to spare for regular and thorough vetting.

The volatility of online data remains a mixed curse/blessing (see this column, April and July/August 2005, ibid.). What was a major manual effort for editors of the Great Soviet Encyclopedia as past heroes were “delisted” (or “unentered”?) is now a few deft clicks away. A remarkable case is reported where the Russian subscribers to the encyclopedia were co-opted into helping the State revisionists: in 1953 they were sent a page entry on the Bering Strait and asked to insert it after cutting out the same-size entry on the disgraced Lavrentiy Beria—thus preserving the pagination.3

Just as a thought experiment, imagine an order from up high to eliminate the entry at STALIN, Joseph. “We need exact 321-page in-situ replacement—any ideas?” Back comes the nervous editor: “We find that STALLMAN, Richard would fit alphabetically and even culturally. His FSF supports our aims in bringing down Microsoft and the other wicked capitalists who steal and sell People’s software. But big problem, Boss—we are having trouble covering Stallman in less than 400 pages.”

Back to Wiki’s weracity, and enter Larry Sanger. A disillusioned Wikipedia co-founder, Sanger has decided to set up a rival online resource, and his venture deserves our careful attention. Enter Citizendium as the Wiki-killer! The choice of name is hardly an encouraging sign, yet, I say, the more references the better. Beware the man browsing one site. Although Sanger expresses concern over the errors in Wikipedia, his main beef is the underlying structure and ethos, which cast doubt on its ability to ensure reliability. He calls the Wiki management community dysfunctional, invoking the crushing term “Rigid Egalitarianism.” In particular, he dislikes the freedom with which anonymous Wiki contributors of unproved credentials can add new information and edit the old. Citizendium will close this loophole by applying strict control over who does what, with transparent accountability. Just like a “real” encyclopedia, you may say. Sanger’s team has much ground to make up, and I wish them well. One might add a sad note: are we seeing another “damn good cause” afflicted with a bad dose of the “schisms”?

If This Be Error...

Returning to the theme of reader cooperation, I offer a brief, yet apposite example of a newly exposed, half-forgotten mis-forecast in the June 24, 1974, Science section of Time magazine in which expert climatologists warned of global cooling. The next Ice Age loometh, and one geo-guru (University of Toronto, no less) was even more precise: “I don’t believe that the world’s present population is sustainable if there are three years like 1972 in a row.”

Here are two examples of how higher marks qualify for my Doryphoric Palme d’Or (recall that a doryphore is “one who takes excessive delight in spotting small errors,” where excessive and small remain undefined) when mistakes are published in authoritative texts.

First, Bill Bryson’s A Short History of Nearly Everything (Broadway Books, 2003) is an excellent introduction to the natural sciences for the laid-back laity (I’ve given it as a prezzie to all my grandchildren), partly because Bryson is a fine writer rather than a trained scientist. He better appreciates the hurdles for those who failed math and physics not through lack of wit but because of poor presentation and motivation. However, he states that “seven one-thousandths” is “0.007 percent” and repeats this deception by offering “six one-thousandths” as “0.006 percent.” I hear the grumpy, unfair reaction: How can we believe anything this Bryson tells us?
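For the record, the slip is a factor of 100: seven one-thousandths is 0.7 percent, not 0.007 percent. A throwaway sketch of the arithmetic (nothing here comes from Bryson beyond his two fractions):

```python
# Converting a fraction to a percentage means multiplying by 100.
def as_percent(numerator, denominator):
    """Express numerator/denominator as a percentage."""
    return numerator / denominator * 100

# Bryson's "seven one-thousandths" and "six one-thousandths":
print(f"{as_percent(7, 1000):g} percent")  # 0.7 percent, not 0.007 percent
print(f"{as_percent(6, 1000):g} percent")  # 0.6 percent, not 0.006 percent
```

A thousandth is 0.1 percent, so the book's figures are two orders of magnitude too small.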

Second, because the author/editor/commentator of God Created the Integers (Running Press, 2005)4 is famed cosmologist Stephen Hawking, the reviewer, John Stillwell (American Mathematical Monthly, Mathematical Association of America, March 2007), can hardly resist a smirk in finding several “more or less serious errors together with other distinctly misleading statements.” Hawking writes, “Riemann recognized that in spaces of nonconstant curvature bodies may move about without stretching” (page 820). This is a topic well within the author’s domain of competence, yet my readers will surely spot the mistake noted by Stillwell. The word nonconstant should be constant! Is this the Orwellian peak of misspeak? For YES read NO? For FALSE read TRUE? As with my love for Nelly Moorcroft. But reversed. My declared constancy proved inconstant.

Proof that Hawking simply wrote carelessly is revealed later on the same page, where Riemann himself (genuflect, genuflect) is quoted with the correct proposition (bodies can move without stretching in spaces of constant curvature). Later, Hawking goes wrong again: “Of course, space need not be flat, it need not even be of constant curvature as it must be for the sum of angles of a triangle to be constant.” Wrong or very misleading, claims Stillwell. Constant curvature does not imply that the sum of angles is invariant! The very sphere (idealized) most of us inhabit has constant curvature, but we all know (wake up at the back) that angle-sums vary with area. What Hawking should have said was that “zero curvature” guarantees angle-sum invariance.
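Stillwell’s point is the classical one, and worth a line of mathematics. On a sphere of radius R (constant positive curvature), Girard’s theorem relates the angle sum of a triangle to its area A:

```latex
\[
  \alpha + \beta + \gamma = \pi + \frac{A}{R^{2}}
\]
```

The angle sum thus varies with area even though the curvature is constant; the excess A/R² vanishes for every triangle only when the curvature is zero, which is precisely the “flat” case Hawking should have named.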

Jack Be Agile

I hope agile is still the in-vogue programmers’ paradigmatic predicate. Writing a few months ahead of publication has always been a hazard in our fair but unfairly volatile trade. I see signs of the nimble overtaking the agile, presumably by changing lanes and ignoring the speed limits.

I know that Joshua E. Smith designed an XML-compatible language called Nimble in 1999, yet this name seemed based on nimble as a folksy synonym for agile. Nobody sings, “Jack be agile, Jack be quick,” do they? But can we expect Nimble programming to become a more widely entrenched general concept crowned with the accolade Methodology? Incidentally, wordsmiths will notice that Nimble is billed not as XML-compatible or XML-conforming but as XML-conformant. Readers are invited to submit their definitions of these three terms and explain how they might differ.

Title Theme

The financial consultants Deloitte splash the banner,


proving, if proof were needed, that reading undelimited “words” can be a pain. In fact, it can lead to dire ambiguity, as in “man’s laughter” and “manslaughter.” The comic’s straight man described his sex life as “infrequent,” to which Henny Youngman responded, “Is that one word or two?” And how many see the connection between “atone” and “at one”? Reader prizes for similar examples.

My collapsed, self-referential headline to this column, “Alloneword,” is now embedded in computer newspeak, and further borrowed for a rock band. It will be familiar to all those who have ever had to dictate or speech-spell their e-mail or Web addresses. That branch of mankind must include all my readers and, indeed, a large, ever-growing proportion of those who are wired into our Brave New World of Web. (The participle wired remains a quaint synonym for connected, even when that nirvana is achieved wirelessly.) Thus, we announce, “I’m joethejollyblogger, alloneword, at discountmousepads, alloneword, dot see-oh dot you-kay.”

By the way, don’t rush to register the confusing domain alloneword. It’s been “took!” (The site now houses the illustrated Figures of Speech by Mervyn Peake. Worth a visit.)

Less worthwhile (I’m scarce able to mention it) is Twitter. Twitterers (or twits, as I prefer to call them) are a global community of underemployed addicts with sub-blog attention spans. Once registered (there don’t seem to be any tests for literacy or sanity), twits can submit realtime biographical sound “bytes” describing what they claim to be doing at that very moment. On the bright side, the maximum allowed burst of narcissism is 140 characters per twitter. On the dark side, you are invited to read what other twits are up to. My entry, dated “now”: “Just got out of bed. Marmite butty as per usual. Logged in. Slashing my wrists. Bye-bye all.”


  1. Shakespeare, W. Merchant of Venice (act 1 scene 1). Incidentally, this play may cast light on the Bard’s sexual preferences: “My ventures are not in one bottom trusted.”
  2. In my school, Latin v’s were pronounced as w’s. In German and Polish we were taught to reverse this rule—most confusing. Caesar’s declaration has been much parodied: “Veni, vidi, volo in domum redire” (I came, I saw, I wanna go home) and “Veni, vidi, Visa” (I came, I saw, I shopped).
  3. I was reminded of this ancient anecdote by an entry in Wiki! Strangely, I now feel less inclined to believe it. It has that too-telling-to-be-true, urban-mythic feel. Why draw attention so openly to Beria’s downfall while Russia is going through the motions of mourning Stalin? Reader input will be rewarded.
  4. From Leopold Kronecker (1823-1891) and his famous dictum, “God made the integers; all else is the work of Man.” This is one of those many slick aphorisms that defy reasonable analysis (if you spot the pun on analysis, a branch of mathematics, forgive me). Both theists and atheists have problems with Kronecker. Theists can claim that a Real God, by definition, is perfectly capable of extending the discrete integers to the Real number continuum. Atheists say that “God don’t enter into the equation.” Early man invented the integers (how else can one impose taxes or count slaves?); later, after much trial and error, came the rationals or fractions (how else to divide the pie?), followed by the badly named irrationals, made fully Real and reasonable by Richard Dedekind and other demigods in the late 19th century.

STAN KELLY-BOOTLE, born in Liverpool, England, read pure mathematics at Cambridge in the 1950s before tackling the impurities of computer science on the pioneering EDSAC I. His many books include The Devil’s DP Dictionary (McGraw-Hill, 1981), Understanding Unix (Sybex, 1994), and the recent e-book Computer Language—The Stan Kelly-Bootle Reader. Software Development Magazine has named him the first recipient of the new annual Stan Kelly-Bootle Eclectech Award for his “lifetime achievements in technology and letters.” Neither Nobel nor Turing achieved such prized eponymous recognition. Under his nom-de-folk, Stan Kelly, he has enjoyed a parallel career as a singer and songwriter.


Originally published in Queue vol. 5, no. 4


© ACM, Inc. All Rights Reserved.