Mache Creeger's general pessimism1 about IT's status quo rests on his perception that HiTech (the character- and tree-saving token for High Technology, somewhat, if not totally, vitiated by this long-winded, unnecessary explanation) is not quite Hi enough. IT relies too much on dreary, evolutionary gradualism rather than on the exciting Kuhnian discontinuities that spell revolution and paradigm shifts.2
I have no qualms about Creeger's observation that the marketeers, both commercial and academic (if such categories can be distinguished in these pursy PC times), are fond of paint jobs - coloring the most modest upgrades with claims of major, must-have breakthroughs. This is an ancient and, alas, effective promotional ploy in other trades. I recall one cornflake manufacturer who was forced to confess that what was "new" about its latest product was the bold slogan "NEW" on the package.
Whether there is, in fact, as Creeger implies, a true technical dichotomy between evolution and revolution depends on your definitions and to which of the diverse "philosophies of science" you subscribe. We must first clear our minds of those old misleading devils, etymologies, known to us serious linguists as "just-so stories." Stanford professor Seth Lerer3 tells of the misguided medieval scholars who derived intrinsic properties of the "real world" from the terms used to describe such properties. They saw deep significance in the Latin words for "wood" (lignum) and "fire" (ignis), from which any pre-Baconian fool can see that the essence of fire is physically trapped inside the wood. More playfully, my favorite 20th-century novelist, Vladimir Nabokov,4 supports his world view by noting that "comic" is a major component of "cosmic"! James Joyce played the same game in Finnegans Wake (no apostrophe, please) by transforming "funeral" to sound like "fun-for-all."
Semantically, "evolution" is similarly misleading as a major substring of "revolution." The two words do, however, share a genuine common Latin root: volvere, "to roll," with related idiomatic meanings such as "to roll over; to wrap up; to grovel."5 The derived verb evolvere ("to unroll; to unfold") is helpful in fixing our current usages, but with revolvere, the original prime meanings, "to roll back, to rotate," are somewhat contrary to "revolution" in our current context. Certainly, in the familiar political arena, the true revolution will achieve its curdling share of "rolling back," to remove traces of previously accrued errors and evils (the editor's equivalent of UNDO6). Yet, much "rolling forward" is also expected, building new dreams on the ashes of the old. Contrast this with the cynic's view of Fabianism, a movement seeking a "gradualist," nonrevolutionary, bloodless path to Socialism: "We'll change the country bit by bit, so nobody will notice it!" ("Pink Flag," Leon Rosselson).
Creeger seems to follow the rather loose, everyday, contrastive use of "evolution" (slow refinement of the old) and "revolution" (new, unexpected breakthrough with wide impact). He bemoans the lack of genuine CS and IT "innovation" since he graduated some 30 years ago: "... the vast majority of new products are evolutionary, not revolutionary...most new products are just a recycling of an earlier idea. The [R&D] effort seems to go into refinement, not into changing the core paradigm."7
Paradigm! Now there's a word to arouse ye starveling etymologists from your slumbers.8 It's a while since "paradigm" lost its fashionable edge through misuse and overexposure. (Some words, like shameless guests, just seem to outstay their welcome, as when the Queen, entertaining after a late, royal dinner, whispered to the King, "Don't these people have palaces to go to?") Certainly by the 1980s the term was attracting abuse and satire, as with the parody, "Brother, Can You Spara Digm?"9 I invented, unfairly, the SI (International System of Units) unit swaine to measure the density of "paradigm" occurrences per square centimeter of the printed page. Extra marks were awarded for "multiparadigmatic." Michael Swaine, still flaming brilliantly for Dr. Dobb's, took the hint in good spirit. The record, in passing, belongs to Gerry Reid of IBM. His 13-page booklet, "Changing Perceptions - the Power of Paradigms," has 127 mentions of the dreaded eight-letter word.
Part of the paradigm-usage problem stems from its semantic drift over the years, ending, ironically, in a dramatic shift triggered by (whom else?) Kuhn himself.10 The original Greek is innocent enough: simply the showing of things side by side, whence the use of "paradigm" as a model or pattern for grammatical declensions and conjugations.
More sinister cognates creep into the Koine Greek of the New Testament, where we meet the verb paradeigmatizo. Here the sense is of setting a bad example or holding up to scorn ("Because Joseph her husband was a righteous man and did not want to expose her [Mary] to public disgrace..." Matthew 1:19; see also Hebrews 6:6).
Kuhn's "paradigm" subsumes the notion of "scientific model," but he refines it in order to exegete a new philosophy of scientific change. He contrasts the older tradition of science gradually and rationally converging/evolving to some objective "truth" with what he observed as the way science actually progresses via revolutionary discontinuities known as "paradigm shifts." You can sniff a whiff of the self-referential controversy. For example, Nicholas Wade's review (Science magazine) of Kuhn's magnum opus:
"...Kuhn wastes little time on demolishing the logical empiricist view of science as an objective progression toward the truth. Instead he erects from the ground up a structure in which science is seen to be heavily influenced by nonrational procedures, and in which new theories are viewed as being more complex than those they usurp but not as standing any closer to the truth."
But the sting is in Wade's conclusion:
"Since Kuhn does not permit truth to be a criterion of scientific theories, he would presumably not claim his own theory to be true."
Followed by the un-sting (which earned Wade a place in the book-jacket's blurbs):
"But if causing a revolution is the hallmark of a superior paradigm, [this book] has been a resounding success."
It should be clearer, after these digressions, why the use of "paradigm" was oft berated where a less pretentious "model" would serve. Your Ptolemaic and Einsteinian paradigms are admissible in Kuhnian dissertations, but elsewhere...
"I have recently shifted from the 8088-with-640K-RAM paradigm to the 80386-with-4-Mb-RAM paradigm, scorning all intermediate paradigms. If you shop around, you'll find that some emporia are willing to sell you the equipment and throw in the appropriate paradigm at no extra charge. However, don't expect much of a trade-in on your old paradigm: 'I hope you won't take offense, Sir, but, really...O dear me...we haven't seen one of those in years...I doubt if it's even listed in Kuhn's Blue Book. Have you tried the Smithsonian?'"11
Ella Fitzgerald's old torch song, "There'll Be Some Changes Made," received a timely HiTech makeover. Formerly scatted as
There'll be a change in the weather, a change in the sea.
From now on, there'll be a change in me.
I'll change my way of walking, my talk and my name.
Nothin' about me will be quite the same ...
O Lord, there'll be some changes made today.
There'll be some changes made!
it now goes with some scansional improvisation:
There'll be a meteorological paradigm shift.
Similarly, an oceanographic paradigm shift.
As of this memo, please note that paradigms are also to be shifted in the ambulatory, sociolinguistic, and onomastic environments.
The class of paradigmatically unshifted objects will soon be empty.
O Lawd, there'll be some paradigm shifts implemented today! Hey, hey!
We also found the chipmakers responding at the instruction level:
LSPR Dm,#n  Logical Shift Paradigm Right - shift the paradigm in register Dm n places to the [political] right
with similar instructions for left shifts and rotations. Note that careless shifting may lose significant bits of your paradigm, and in the worst (best) case you could lose your paradigm completely.
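For readers who take their jokes literally, the imaginary LSPR instruction can be sketched in a few lines of Python. The name, semantics, and "register width" here are all the column's conceit, not any real instruction set; the code merely demonstrates the genuine bit-level point that a logical right shift discards low-order bits:

```python
# A literal rendering of the column's imaginary LSPR instruction:
# logically shift a "paradigm" (here, just a bit pattern) n places
# to the right within a fixed register width, discarding whatever
# falls off the end.

def lspr(paradigm: int, n: int, width: int = 8) -> int:
    """Logical Shift Paradigm Right within a width-bit register."""
    mask = (1 << width) - 1          # confine the paradigm to the register
    return (paradigm & mask) >> n    # shifted-out bits are lost for good

p = 0b10110101                       # an 8-bit paradigm
print(bin(lspr(p, 3)))               # low-order (significant?) bits are gone
print(lspr(p, 8))                    # shift far enough and the paradigm vanishes
```

As the text warns, shift by the full register width and you lose your paradigm completely: the result is zero.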
We can now return to Creeger's argument that we urgently need a hefty shift in our "core paradigm" rather than hyped tinkering with features and packaging. His vantage point deserves respect. As a tech industry veteran-turned-consultant, he now meets first-hand and advises people on many potential innovations before they hit the fan or return to the drawing board.
Creeger's company name is significant: Emergent Technology Associates. "Emergent" has more class and clout than the clichéd "Emerging." "Emergent" carries a hint of early-stage HiTech promise, while "Emerging" seems more like a careless rush for the exit.
But are there CS or IT "core paradigms" awaiting shifts in the Kuhnian sense? And can these shifts be hurried along from a pulpit? The prime example, from Newtonian to Relativistic, came about not from a sermon that poor old boring Sir Isaac was past his shelf-life, but as a result (albeit reluctantly) of accrued, measured discrepancies between theory and actual observations. One would need, perhaps, widespread dissatisfaction - nay, a pocketbook rebellion among users against their present gently improving gadgets - in order to trigger a true "core paradigm" shift. What we do see is fierce competition based on refinement and "featurism" and some customer reluctance against over-rapid change and early obsolescence. I note that "future-proof" has reached the IT lexicon!
Creeger lists some genuine recent revolutions, "including cellphones, GPS, quantum computing, encryption, and global access to content." Each of these, I reckon, in its own way, could be viewed as an evolutionary refinement of earlier technologies but with unpredicted, revolutionary impacts on our modus operandi. I would add nanotechnology and, just this month, as it were, the emergent "potentially revolutionary" science of synthetic biology (Google the iGEM, not IGEM, project).
Finally, to remove all confusion, consider "evolution" as the major scientific "revolution." Darwin wrote near the end of The Origin of Species:
"Although I am fully convinced of the truth of the views given in this volume... I, by no means expect to convince experienced naturalists whose minds are stocked with a multitude of facts all viewed, during a long course of years, from a point of view directly opposite to mine... But I look with confidence to the future - to young and rising naturalists, who will be able to view both sides of the question with impartiality."
STAN KELLY-BOOTLE (http://www.feniks.com/skb/, http://www.sarcheck.com), born in Liverpool, England, read pure mathematics at Cambridge in the 1950s before tackling the impurities of computer science on the pioneering EDSAC I. His many books include The Devil's DP Dictionary (McGraw-Hill, 1981), Understanding Unix (Sybex, 1994), and the recent e-book Computer Language - The Stan Kelly-Bootle Reader (http://tinyurl.com/ab68). Software Development Magazine has named him as the first recipient of the new annual Stan Kelly-Bootle ElecTech Award for his "lifetime achievements in technology and letters." Neither Nobel nor Turing achieved such prized eponymous recognition. Under his nom-de-folk, Stan Kelly, he has enjoyed a parallel career as a singer and songwriter.
Originally published in Queue vol. 4, no. 6.