Curmudgeon


From This Moment On

Divining the future of computers with computers

Stan Kelly-Bootle, Author

Science fiction seems to have spawned two divergent subgenres. One, which is out of favor, paints a bright future for us, assuming an optimistic, Darwinian “perfectibility.” These scenarios project an ever-expanding (or rather, a never-imploding) cosmos with ample time for utopian evolutions.

On the other hand, the more popular sci-fi projections have been dubbed “Things will just keep getting worse.” Here, the authors spot a nasty trend or two and extrapolate away to Malthusian doom, ignoring the glut of counterexamples. Oft cited is the confident prediction made by Victorian planners in 1830 that by 1930 London street traffic would be bogged [sic] down under 25 feet of horse manure. The modern eco-seers are unfazed. In 1971 best-selling author Richard A. Falk warned us, “The risk of disaster will rise continually in the years ahead, and barring some colossal human intervention, a disaster of catastrophic proportions is likely to occur in the 1990s.”1 Cynics might claim that the decade’s Java pandemic confirms Falk’s forebodings. Undeterred by past performance, my study of spam growth indicates that by next year 80 percent of all e-mails will invite you to smuggle $2 million out of Nigeria. The remaining 20 percent will help improve your love life with the smuggling profits.

I should also mention a bizarre sci-fi category that combines the two extremes of disaster and optimism: the popular Left Behind series by Tim LaHaye and Jerry B. Jenkins, with titles such as Desecration and Tribulation Force. These books retell the apocalypses of Daniel and Revelation, leading, after much gnashing and lamentation, to victory for the Elect over Antichrist after Armageddon. ACM Queue readers, regardless of cherished doctrine, will enjoy the role of computers as the End Times loom. Each side has the latest [sic] communications technology, and there are both evil and good hackers intercepting and disrupting the other’s e-mail and Web logs. No explicit URLs are given, leaving us to guess the Holy and Unholy Domains: Are they god.org, savior.net, mammon.biz?

The success of Left Behind confirms Stephen Mitchell’s comment: “There is such a thing as nostalgia for the future. Both Judaism and Christianity ache with it.”2 And Jaroslav Pelikan notes the nervous laugh that betrays the deepest anxieties of any society: there’s no topic like eschatology (from Greek ta eschata, “the last things”) to trigger angst3—to the extent that some German scholars, anxious to avoid the subject, have subjected the scriptures to what Pelikan calls Enteschatologisierung (or less impressively in English, “de-eschatologization”). Did Kantian software theologians support the GoToLess crusade waving an Entunternehmangenausstreckenisierung banner?

Speaking of Evil Empires reminds us of some misguided prophecies in our own fair IT trade (which I prefer to pronounce as the monosyllabic “IT,” a showbiz synonym for sex and oomph). An example in the news disproves the dinosaur-mainframe predictions, some of which emerged within IBM itself when it launched the PC in 1981, violating Grosch’s law.4 Big Blue is celebrating the 40th anniversary of the panoramic 360 with ever-healthy sales of its many successors. Indeed, the Seven Dwarfs we joked about in the 1950s and ’60s (Univac as Doc; CDC as Grumpy; ICL as Sleepy; your choice of Dopey) are fatally diminished, leaving IBM with a virtual (and profitable) monopoly of Big Iron. Alas, it let its share of the PC market slip away into the hands of upstarts. Further, by one of those unpredictable quirks of fate, IBM’s choice of a fledgling OEM MS-DOS bred a monster. As Mae West once said, “I used to be Snow White but I kinda drifted.”

Moving from the anecdotal, amateur “cherry-picking”5 approach to futurology, we address the logical foundations of “what’s next?” We can, for a change, avoid the arid domain of formal logic. Nolt’s Informal Logic,6 for example, provides various argument-diagramming techniques (not unlike the charting tricks of Unified Modeling Language) that help you to spot faulty logic. In this “practical” study of everyday deduction and induction, we go beyond the classical syllogisms of Aristotle (Some non-crows are non-white!) but shun the paradoxical netherworlds of Smullyan (I cannot tell a lie: all my answers are false!). Rather, Nolt offers delightful examples that you must analyze and reduce to the standard argument structure: “a sequence of declarative sentences, one of which, called the conclusion, is intended to be evidentially supported by the others, called premises.” (Resist the old quip that neighbors can never agree since they are arguing from different premises.) Thus, out with your templates and diagram: “If ever, oh ever a Wiz there was / The Wizard of Oz is one becoz / Becoz, becoz, becoz, becoz, becoz / Becoz of the wonderful things he does.”7
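For readers who like their premises machine-readable, here is a minimal sketch of that standard structure in Python. Be warned that the Argument class, its diagram method, and the one-premise reduction of Harburg’s lyric are my own illustrative inventions, not Nolt’s notation:

# A minimal sketch (assumptions mine, not Nolt's) of the standard argument
# structure: a sequence of declarative sentences, one of which, the
# conclusion, is evidentially supported by the others, the premises.
from dataclasses import dataclass
from typing import List

@dataclass
class Argument:
    premises: List[str]
    conclusion: str

    def diagram(self) -> str:
        # Premises above the rule, conclusion below: the textbook layout.
        rows = [f"P{i}: {p}" for i, p in enumerate(self.premises, start=1)]
        conclusion_row = f"C:  {self.conclusion}"
        rule = "-" * max(len(r) for r in rows + [conclusion_row])
        return "\n".join(rows + [rule, conclusion_row])

# Harburg's lyric, stripped of its redundant becozes:
oz = Argument(
    premises=["The Wizard of Oz does wonderful things."],
    conclusion="The Wizard of Oz is a Wiz.",
)
print(oz.diagram())

Run it and the lyric collapses, as Nolt’s reduction demands, to a single premise doing all the evidential work; the four spare becozes supply rhythm, not support.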

We can certainly sharpen our deductive powers and more readily spot the daily political non sequiturs. But in our complex, nondeterministic world, can we do better than the various clairvoyant rituals with the fancy “-mancy” suffix? Plucking and blowing his nose-hairs did well for “Nostrildamus.” Less painful but more expensive is our modern bitomancy, whereby computer models divine the future from the fleeting state of our registers.

REFERENCES

1. Falk, R. A. This Endangered Planet. Vintage Books, New York, NY, 1971.

2. Mitchell, S. The Gospel According to Jesus. HarperCollins, New York, NY, 1991.

3. Pelikan, J. The Melody of Theology: A Philosophical Dictionary. Harvard University Press, Cambridge, MA, 1988, p. 74.

4. Formulated by Herbert R. Grosch in the 1940s: computing power is proportional to the square of the system cost. Mainframers cunningly validate P = kC² for microprocessors by setting k = 0 for all values of C.

5. Gaining popularity: cherry-pick, vt, “to select those propositions from a large corpus that support your prejudices.”

6. Nolt, J. E. Informal Logic: Possible Worlds and Imagination. McGraw-Hill, New York, NY, 1984.

7. Arlen, H., and Harburg, E. Y. “We’re Off to See the Wizard,” 1939. Some of the “becoz’s” are redundant.


STAN KELLY-BOOTLE, born in Liverpool, England, read pure mathematics at Cambridge in the 1950s before tackling the impurities of computer science on the pioneering EDSAC I. His many books include The Computer Contradictionary (MIT Press, 1995) and Understanding Unix (Sybex, 1994). Under his nom-de-folk, Stan Kelly, he has enjoyed a parallel career as singer and songwriter.

© 2004 ACM 1542-7730/04/0600 $5.00


Originally published in Queue vol. 2, no. 4
