Formal and informal approaches to C++ mastery
Excuses, excuses, excuses!
An inquiry into contracts and the Next Big Thing
Dedesignating and other linguistic hazards
Puns and allusions
And the perils of indecision. The latest musings of Stan Kelly-Bootle.
Buyer (and seller) beware
Is there an "out there" out there? There are always anniversaries, real or concocted, to loosen the columnist's writer's block and/or justify the intake of alcohol. I'll drink to that to the fact that we are blessed with a reasonably regular solar system providing a timeline of annual increments against which we can enumerate and toast past events. Hic semper hic. When the drinking occurs in sporadic and excessive bursts, it becomes known, disapprovingly, as "bingeing." I'm tempted to claim that this colorful Lincolnshire dialect word binge, meaning soak, was first used in the boozing-bout sense exactly 200 years ago.
"A lot of what, and about where?" I hear you cry. One question at a time, I reply. First, there's too much of everything these days, and, second, it's happening all over. Furthermore, everybody's doing it. As a contemporary Wordsworth might say: "The Web is too much with us, late and soon, getting and browsing we lay waste our powers." There is a glut of unfiltered information proving more dangerous than Alexander Pope's "A Little Learning" where "shallow draughts intoxicate the brain."
I've told you a googol times or more: Don't exaggerate! And, less often, I've ever-so-gently urged you not to understate. Why is my advice ignored? Why can't you get IT... just right, balanced beyond dispute? Lez Joosts Mildews, as my mam was fond of sayin', boxing both my ears with equal devotion. Follow the Middle Way as Tao did in his Middle Kingdom. Or "straight down the middle," as golfer Bing Crosby used to croon. His other golf song was "The Wearing of the Green," but such digressions run counter to my straight, plow-on-ahead advice.
By the time these belles-lettres reach you, a brand new year will be upon us. Another Year! Another Mighty Blow! as Tennyson thundered. Or as Humphrey Lyttelton (q.v.) might say, "The odious odometer of Time has clicked up another ratchette of entropic torture." Less fancifully, as well as trying hard not to write 2007 on our checks, many of us will take the opportunity to reflect on all the daft things we did last year and resolve not to do them no more. Not to mention all the nice things we failed to do. I have in mind the times when I missed an essential semicolon, balanced by the occasions when inserting a spurious one was equally calamitous.
My 'aphorisme du jour' allows me to roam widely in many directions, some of which, I hope, will be timely and instructive for Queue readers.
No, I’m not cashing in on that titular domino effect that exploits best sellers. The temptations are great, given the rich rewards from a gullible readership, but offset, in the minds of decent writers, by the shame of literary hitchhiking. Thus, guides to the Louvre become The Da Vinci Code Walkthrough for Dummies, milching, as it were, several hot cows on one cover. Similarly, conventional books of recipes are boosted with titles such as The Da Vinci Cookbook—Opus Dei Eating for the Faithful. Dan Brown’s pseudofiction sales stats continue to amaze, cleverly stimulated by accusations of plagiarism and subsequent litigation (Dan found not guilty).
You may well expect from my title that I’m about to plumb the depths of Nassim Nicholas Taleb’s theories on catastrophe and quasi-empirical randomness. I, in turn, expect that you’ve already read (or certainly read of) Taleb’s best-selling The Black Swan—The Impact of the Highly Improbable (Allen Lane, 2007), dealing with life’s innate uncertainties and how to expect or even cope with the unexpected.
Errors, deceptions, and ambiguity
Three years ago, to the very tick, my first Curmudgeon column appeared in ACM Queue to the rapturous, one-handed claps of the silent majority. Since then my essays have alternated intermittently with those of other grumpy contributors. With this issue (muffled drumroll), I'm proud to announce a Gore-like climate change in the regime that will redefine the shallow roots of ACJ (agile computer journalism, of which more anon). The astute ACM Queue Management (yes, there is such - you really must read the opening pages of this magazine!) has offered me the chance to go solo. For the next few Queues, at least, I am crowned King Curmudgeon, the Idi Amin of Moaners, nay, Supreme General Secretary of the Complaining Party!
My sermon-text this grumpy month is Matt Barton’s article “The Fine Art of Computer Programming” (http://www.freesoftwaremagazine.com/articles/focus-software_as_art), in which he extols the virtues of what is widely called literate programming. As with the related terms literary and literature, we have ample room for wranglings of a theological intensity, further amplified by disputes inherent in the questions: “Is computer programming science or art?” and “What do programmers need to know?” Just as we must prefer agile to clumsy programming, it’s hard to knock anything literate.
Taking measure of measurement
The Texas rancher is trying to impress the English farmer with the size of his property. "I can drive out at dawn across my land, and by sundown I still haven't reached my ranch's borders." The Englishman nods sympathetically and says, "Yes, yes, I know what you mean - I have a car like that, too."
Adopting this architectural style is no cure-all.
When asked which advances in computing technology have most dazzled me since I first coaxed the Cambridge EDSAC 1 into fitful leaps of calculation in the 1950s, I must admit that Apple’s iPod sums up the many unforeseen miracles in one amazing, iconic gadget.
Not a day goes by that a large amount of spam doesn’t get past the two filters that I have in place (one on the server and one on my mail client). Most of this e-mail is annoying and some of it dangerous. But I have finally made my peace with spam, and it no longer bothers me. How did I do that, you ask? I have learned to respect, even love, spam’s malicious beauty. I want to share my journey to inner peace, hopeful that you will find happiness too.
Chasing citations through endless, mislabeled nodes
Many are said to have said, "If I can't take it with me, I'm not going!" I've just said it, but that hardly counts. Who, we demand, said or wrote it first? It's what I call (and claim first rights on) a FUQ (frequently unanswerable question, pronounced fook to avoid ambiguity and altercation). Yogi Berra's famous advice was "You can look it up," meaning, in fact, "Take my word on this." He knew quite well that few had the means or patience to wade through the records. Nowadays, of course, as we quip in Unix, it's easier done than sed.
Compliance. The mere mention of it brings to mind a harrowing list of questions and concerns. For example, who is complying and with what? With so many standards, laws, angles, intersections, overlaps, and consequences, who ultimately gets to determine if you are compliant or not? How do you determine what is in scope and what is not? And why do you instantly think of an audit when you hear the word compliance?
Is it just a matter of semantics?
A call to Transylvania may be needed.
Dominic Behan once asked me in a rare sober moment (for both of us): “What’s the point of knowing something if others don’t know that you know it?”1 To which I replied with the familiar, “It’s not what you don’t know that matters, it’s what you know that ain’t so.” I was reminded of these dubious epistemological observations while reading Stephen Sparkes’ interview with Steve Ross-Talbot in the March 2006 issue of ACM Queue.2 In promoting Robin Milner’s pi-calculus as the provably reliable backbone for BPM (business process management), Ross-Talbot eases our fears of the arcane, abstract pi-calculus axiomatics by stressing that the layman/programmer “would never need to see the algorithms...never need to read the literature, unless you were having trouble sleeping at night.”
We work in an industry that prides itself on “changing the world,” one that chants a constant mantra of innovation and where new products could aptly be described as “this year’s breakthrough of the century.” While there are some genuine revolutions in the technology industry, including cellphones, GPS (global positioning system), quantum computing, encryption, and global access to content, the vast majority of new product introductions are evolutionary, not revolutionary. Real technical breakthroughs are few and far between. Most new products are just a recycling of an earlier idea.
A persistent rule of thumb in the programming trade is the 80/20 rule: “80 percent of the useful work is performed by 20 percent of the code.” As with gas mileage, your performance statistics may vary, and given the mensurational vagaries of body parts such as thumbs (unless you take the French pouce as an exact nonmetric inch), you may prefer a 90/10 partition of labor. With some of the bloated code-generating meta-frameworks floating around, cynics have suggested a 99/1 rule—if you can locate that frantic 1 percent.
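If you want to know where your own program's 80 (or 99) percent actually lives, measure rather than guess. Here is a minimal C++ timing sketch; hot_loop and cold_path are invented stand-ins for the frantic few and the idle many:

    #include <chrono>
    #include <cstdio>

    // Invented stand-ins: one routine doing almost all the work, one doing little.
    static long hot_loop()  { long s = 0; for (long i = 0; i < 50000000; ++i) s += i % 7; return s; }
    static long cold_path() { long s = 0; for (long i = 0; i < 1000000; ++i)  s += i % 7; return s; }

    // Time one call using the steady (monotonic) clock.
    static double seconds(long (*f)()) {
        auto t0 = std::chrono::steady_clock::now();
        volatile long sink = f();   // volatile keeps the optimizer honest
        (void)sink;
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration<double>(t1 - t0).count();
    }

    int main() {
        double hot = seconds(hot_loop), cold = seconds(cold_path);
        std::printf("hot %.3fs, cold %.3fs: hot share %.0f%%\n",
                    hot, cold, 100.0 * hot / (hot + cold));
    }

A real profiler does this for every function at once, but the moral is the same: whatever ratio you get is an empirical fact about your program, not a law of nature.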
Isn’t it a shame the way the term realtime has become so misused? I’ve noticed a slow devolution since 1982, when realtime systems became the main focus of my research, teaching, and consulting. Over these past 20-plus years, I have watched my beloved realtime become one of the most overloaded, overused, and overrated terms in the lexicon of computing. Worse, it has been purloined by users outside of the computing community and has been shamelessly exploited by marketing opportunists.
I dedicate this essay in memoriam to Jef Raskin (March 9, 1943 - February 26, 2005). Many more authoritative tributes than I can muster continue to pour in, and no doubt a glorious Festschrift will be forthcoming from those who admired this remarkable polymath. “Le don de vivre a passé dans les fleurs” (“The gift of living has passed into the flowers”).
I’m sick of hearing all the whining about how outsourcing is going to migrate all IT jobs to the country with the lowest wages. The paranoia inspired by this domino theory of job migration causes American and West European programmers to worry about India, Indian programmers to worry about China, Chinese programmers to worry about the Czech Republic, and so on. Domino theorists must think all IT jobs will go to the Republic of Elbonia, the extremely poor, fourth-world, Eastern European country featured in the Dilbert comic strip.
In the past few years people have convinced themselves that they have discovered an overlooked form of data. This new form of data is semi-structured. Bosh! There is no new form of data. What folks have discovered is really the effect of economics on data typing—but if you characterize the problem as one of economics, it isn’t nearly as exciting. It is, however, much more accurate and valuable. Seeing the reality of semi-structured data clearly can actually lead to improving data processing (in the literal meaning of the term). As long as we look at this through the fogged vision of a “new type of data,” however, we will continue to misunderstand the problem and develop misguided solutions to address it.
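To make the economics concrete, here is a toy C++ sketch (the Invoice fields are invented for the example): the same record can carry its schema in the type system, paid for up front, or defer it to run-time convention, paid for by every reader.

    #include <map>
    #include <string>

    // Schema paid for up front: the compiler checks every use of every field.
    struct Invoice {
        int         number;
        std::string customer;
        double      total;
    };

    // Schema deferred: "semi-structured" storage where structure lives in convention.
    using LooseRecord = std::map<std::string, std::string>;

    int main() {
        Invoice typed{42, "ACME", 99.95};

        LooseRecord loose;
        loose["number"]   = "42";
        loose["customer"] = "ACME";
        loose["total"]    = "99.95";   // every reader must re-parse and re-validate

        return (typed.number == 42 && loose.count("number") == 1) ? 0 : 1;
    }

Nothing about the second form is a new species of data; it is the first form with the typing bill left unpaid.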
Multicore is the new hot topic in the latest round of CPUs from Intel, AMD, Sun, etc. With clock speed increases becoming more and more difficult to achieve, vendors have turned to multicore CPUs as the best way to gain additional performance. Customers are excited about the promise of more performance through parallel processors for the same real estate investment.
The Ninth World Multiconference SCI (Systematics, Cybernetics, and Informatics) 2005 has attracted more attention than its vaporific title usually merits by accepting a spoof paper from three MIT graduate students. The Times (of London, by default, of course) ran the eye-catching headline, “How gibberish put scientists to shame” (April 6, 2005). One of the students, Jeremy Stribling, explains how they had developed a computer program to generate random sequences of technobabble in order to confirm their suspicions that papers of dubious academicity were bypassing serious, or indeed, any scrutiny. In fact, the students claim ulterior, financial motives behind this lack of proper peer review.
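The students' generator reportedly worked from a full context-free grammar of computer-science boilerplate; the cartoon below (word lists and sentence template are my own inventions) shows the principle in a few lines of C++:

    #include <cstdlib>
    #include <iostream>
    #include <string>
    #include <vector>

    // Random draws from word lists, glued into a fixed syntactic template:
    // the crudest possible cousin of a grammar-driven paper generator.
    int main() {
        std::vector<std::string> adj  = {"scalable", "stochastic", "Bayesian", "amphibious"};
        std::vector<std::string> noun = {"epistemologies", "checksums", "hash tables", "archetypes"};
        std::vector<std::string> verb = {"synthesize", "refute", "visualize", "harness"};

        std::srand(42);   // fixed seed: the same gibberish every run
        auto pick = [](const std::vector<std::string>& w) {
            return w[std::rand() % w.size()];
        };

        for (int i = 0; i < 3; ++i)
            std::cout << "We " << pick(verb) << ' ' << pick(adj) << ' '
                      << pick(noun) << " using " << pick(adj) << ' '
                      << pick(noun) << ".\n";
    }

Whether such output shames its reviewers depends, of course, on whether anyone reviews it at all.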
User-defined overloading is a drug. At first, it gives you a quick, feel-good fix. No sense in cluttering up code with verbose and ugly function names such as IntAbs, FloatAbs, DoubleAbs, or ComplexAbs; just name them all Abs. Even better, use algebraic notation such as A+B, instead of ComplexSum(A,B). It certainly makes coding more compact. But a dangerous addiction soon sets in. Languages and programs that were already complex enough to stretch everyone’s ability suddenly get much more complicated.
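For the record, here is that first fix sketched in C++ (the toy Complex type is mine; Abs and ComplexSum are the names used above):

    #include <cmath>

    // A toy complex type for illustration.
    struct Complex {
        double re, im;
    };

    // One name, many types: no more IntAbs, FloatAbs, DoubleAbs, ComplexAbs.
    int    Abs(int x)            { return x < 0 ? -x : x; }
    double Abs(double x)         { return std::fabs(x); }
    double Abs(const Complex& z) { return std::sqrt(z.re * z.re + z.im * z.im); }

    // Algebraic notation: A + B instead of ComplexSum(A, B).
    Complex operator+(const Complex& a, const Complex& b) {
        return Complex{a.re + b.re, a.im + b.im};
    }

    int main() {
        Complex a{3.0, 4.0}, b{1.0, -2.0};
        Complex c = a + b;   // compact and pretty...
        return Abs(c) > 0 ? 0 : 1;
    }

The addiction shows in that last line of main: a + b reads beautifully, but the reader must now chase the types to learn which + (and which Abs) is actually being called.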
Please allow me the pleasure of leading you on an 'office safari', so to speak. On today's journey we'll travel the corridors of computerdom in search of the widespread but elusive mal managerium, or bad manager, in common parlance. They will be difficult to spot because we will be, in a sense, looking for that most elusive creature of all: ourselves. That is to say, it's quite possible that many of us share some qualities with the various types of bad managers we shall encounter. Qualities that we are loath to admit we possess, I might add.
It's been a hard day's night--proving nonexistence!
The thorough use of internal documentation is one of the most-overlooked ways of improving software quality and speeding implementation.
The Jeremiahs of the software world are out there lamenting, "Software is buggy and insecure!" Like the biblical prophet who bemoaned the wickedness of his people, these malcontents tell us we must repent and change our ways. But as someone involved in building commercial software, I'm thinking to myself, "I don't need to repent. I do care about software quality." Even so, I know that I have transgressed. I have shipped software that has bugs in it. Why did I do it? Why can't I ship perfect software all the time?
Is programming language a misnomer?
Many linguists are still busy trying to reconstruct the single ur-language presumed to have evolved over untold millennia into the thousands of human tongues - alive and dead, spoken and written - that have since been catalogued and analyzed. The amazing variety and complexity of known languages and dialects seem, at first parse, to gainsay such a singular seed.
When I was studying French in high school, we students often spoke "Franglais": French grammar and words where we knew them, English inserted where our command of French failed us. It was pretty awful, and the teacher did not think highly of it. But we could communicate haltingly because we all had about the same levels of knowledge of the respective languages.
I usually shun clichés like the plague, but could not resist this oft-quoted slogan that sums up what I like to call Psephological Cynicism. Psephology is the huge and growing branch of mathematics (with frequent distractions from sociologists, psychologists, political scientists, and allied layabouts) that studies the structure and effectiveness of polling and electoral strategies. Related domains include probability and game theory, although, as we'll see, the subject has many far-from-playful implications. Indeed, there are depressing but valid proofs that no voting system fully guarantees "fair play."
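The most famous of those proofs is Arrow's impossibility theorem; its ancestor, Condorcet's voting paradox, can be tallied in a few lines of C++ (the three ballots are the textbook example):

    #include <cstdio>

    // Three voters rank candidates A, B, C; rank[v][c] is voter v's rank
    // of candidate c, where a lower rank means more preferred.
    int main() {
        const int rank[3][3] = {
            {0, 1, 2},   // voter 0: A > B > C
            {2, 0, 1},   // voter 1: B > C > A
            {1, 2, 0},   // voter 2: C > A > B
        };
        const char name[3] = {'A', 'B', 'C'};
        for (int x = 0; x < 3; ++x) {
            int y = (x + 1) % 3, wins = 0;
            for (int v = 0; v < 3; ++v)
                if (rank[v][x] < rank[v][y]) ++wins;
            std::printf("%c beats %c, %d votes to %d\n",
                        name[x], name[y], wins, 3 - wins);
        }
        // Prints: A beats B, B beats C, C beats A - a majority cycle.
    }

Majority preference runs in a circle, so whichever candidate wins, some majority would rather have had somebody else.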
N-streak, 1-streak, worra streak
Science fiction seems to have spawned two divergent subgenres. One, which is out of favor, paints a bright future for us, assuming an optimistic, Darwinian "perfectibility." These scenarios project an ever-expanding (or rather, a never-imploding) cosmos with ample time for utopian evolutions.
I've been responsible for hiring many software engineers. I tend to ask lots of elaborate technical questions so I can really get to know how the candidate thinks and works with me while solving hard problems. QA (quality assurance) engineers will appreciate this one (it's a "negative test" for intellectual honesty): "Explain the relative strengths and weaknesses of FreeBSD, Windows NT, Solaris, and Linux."
Nowadays, when you find yourself utterly disgusted by "American Idol," or any other of the latest "reality" shows on TV, you may decide, "What the heck, time to seek a slightly less horrible form of punishment: let's get on the Web."
I remind you, first, that "damnéd" has two syllables, calling for a Shakespearean sneer as sneered by Olivier strutting his King Richard III stuff.
OK, so I admit it - not only am I a total closet gamer geek, I admit that I actually care enough to be bitter about it. Yep, that's right - this puts me in the "big-time nerd" category.
You know what I hate about spam filtering? Most of what we do today hurts the people who are already being hurt the most. Think about it: Who pays in the spam game? The recipients. That's what's wrong in the first place - the wrong folks pay for this scourge.
Respected technology commentators say that they now prefer instant messaging (IM) over e-mail as their medium of choice for computer-mediated communication.1 The main reasons are that e-mail has become an overloaded channel for readers and that you can’t be sure to get a timely response from the recipients of your e-mail.
You know what bugs me about wireless networking? Everyone thinks it's so cool and never talks about the bad side of things. Oh sure, I can get on the 'net from anywhere at Usenix or the IETF (Internet Engineering Task Force), but those are _hostile_ _nets_. Hell, all wireless nets are hostile. By their very nature, you don't know who's sharing the ether with you. But people go on doing their stuff, confident that they are OK because they're behind the firewall.