Curmudgeon

As Big as a Barn?

Taking measure of measurement

Stan Kelly-Bootle, Author

The Texas rancher is trying to impress the English farmer with the size of his property. “I can drive out at dawn across my land, and by sundown I still haven’t reached my ranch’s borders.” The Englishman nods sympathetically and says, “Yes, yes, I know what you mean—I have a car like that, too.”

A joke should seldom be subjected to explanatory dissection, but here I seek your indulgence. As with all but the most blood-crazed lepidopterists, a few thoraxes must be pinned and a few wings plucked—some butterflies, even the rarest Red Admirables,1 must pay the ultimate price in the cause of entomological advancement. The etherized joke has two obvious clichéd elements followed by an unexpected punch line. The stereotypical Texan boasting of ownership and size is always good for a cheap laugh in smaller, jealous states and nations. The proof of size, though, is not revealed in normal units of distance or area, but as the time taken to traverse the property. Since d = v × t (distance = velocity × time), the inference is that for a given constant v, the journey time, t, does in fact provide a reasonable, comparative unit for the distance d. This notion is suddenly overturned, however, by that annoyingly smug, modest, ironic, pipe-puffing Brit. The Texas hype is literally inverted: the equation is rewritten as t = d/v, and the implied reason it takes 12 hours to cross the ranch is not the huge, boastful value of d but the pathetically low value of v, the top speed of the car!

Having drained the joke dry—and wiped any residual smile from your chops—we can proceed to the main thrust of my sermon: the curse of metrics and the dangers of their bastard offspring, units. How long is a piece of string? The answer in the arithmetic register is 42. But 42 what? Name your units: meters, inches, rods, poles, or perches?2 (If it’s a string of characters, length 42 may be a pure unit-less count, possibly “off by one!”) The choice of units may not matter if you are interested only in comparing string lengths. If your choice of units is consistent, equality and greater/less-than are unit-free relations (with the usual caveats on precision). Texas hats are bigger regardless of hat size! Elsewhere, interpreting 42 as 42 particular units of whatever is vital, and, you’ll hardly need to be reminded, careless arithmetic on mixed units is known to be a danger to life and budget.
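
If you will indulge one small demonstration, here is a toy Python sketch (my own invented lengths, nothing official) of the point just made: the ordering of two lengths survives any consistent change of unit, while careless mixed-unit arithmetic yields a number that means nothing at all.

  # Two pieces of string, measured consistently: the comparison is unit-free.
  INCHES_PER_METER = 39.3701
  a_m, b_m = 1.0668, 42.0            # 42 inches and 42 meters, both expressed in meters
  print(a_m < b_m)                   # True
  a_in, b_in = a_m * INCHES_PER_METER, b_m * INCHES_PER_METER
  print(a_in < b_in)                 # still True: the ordering is preserved in any single unit

  # Careless arithmetic on mixed units: 42 inches "plus" 42 meters.
  print(42 + 42)                     # 84 of nothing in particular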

The SI (Système International d’Unités) has created an almost universal set of standard units starting with the three basic elements of distance, mass, and time, often known as the metric or MKS system after the units: meter (m), kilogram (kg), and second (s). Most physical properties can be derived from these three. Thus, force, as mass times acceleration, boils down to a unit characterized dimensionally as N = kg·m/s², known as the newton to honor a major force in physics. Older unit systems survive, such as the nonmetric Imperial (think British Empire) based on the foot-pound-second, as the God who made us mighty intended. Resistance to change is not entirely chauvinistic—there are billions of nuts, bolts, matching spanners, and marked sports fields out there. There are also prosodic objections, since Shakespeare’s “Full fathom five thy father lies” loses in the scanning and alliterative departments as “Full meters 9.144 thy father lies” (a fathom being six feet). Tennyson’s call to the Light Brigade becomes “2.414016 kilometers, 2.414016 kilometers onward, All in the valley of Death, Rode the six hundred.” (Half a league is 1.5 miles.)

The dramatic and costly dangers of mixing units emerged in October 1999. Even after much embarrassing publicity and analysis, the failure of the Mars Climate Orbiter remains a startling example of a blindingly simple error creeping past millions of dollars’ worth of attention to detail. No doubt every warning and trap discussed over the millennia in the Peter G. Neumann (PGN) Risks Forum was familiar to the teams at Lockheed Martin and Caltech. You can’t even say that one team was right and the other wrong. All that was lacking was someone or something to ensure coordination over possibly a single line of code buried deep in the program’s nested bowels.

I’ve been profitably reading PGN since before the big bang, and I’ve often thought, “How bleeding obvious—nobody could possibly have done that!” Yet, I suppose PGN never thought it necessary to write, “When sending a signal to apply N units of force to correct the orbit of a $125 million Mars Climate Orbiter, double-, treble-check that the units of force referred to by sender and receiver are the same.” There are subtle ironies. Although one newton is 7.2330 poundals, which sounds like a fatal discrepancy, the forces needed for orbital velocity adjustments of millimeters/second were actually quite tiny. On this longer-than-usual mission the tiny discrepancies built up, yet the ship was, I’m told, only a hairsbreadth (a kuntzhaar, as the Afrikaner metrologists say) off its proper path before it disintegrated in the Martian atmosphere.
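
For the curious, here is a hedged Python sketch of how a newton/poundal confusion of the sort just described can slip through an untyped interface, and how even a crude unit tag at the boundary would shout before any harm is done. The numbers and function names are invented for illustration; this is emphatically not the actual flight code.

  POUNDALS_PER_NEWTON = 7.2330             # the conversion cited above

  def thruster_impulse():                  # hypothetical: one module reports poundal-seconds
      return 3.6

  def correct_orbit(impulse_newton_s):     # hypothetical: the other module assumes newton-seconds
      return impulse_newton_s              # ...and feeds it to the trajectory model

  raw = thruster_impulse()
  wrong = correct_orbit(raw)                         # silently off by a factor of 7.233
  right = correct_orbit(raw / POUNDALS_PER_NEWTON)
  print(wrong / right)                               # ~7.233: small per burn, fatal over months

  # A cheap safeguard: carry the unit with the value and check it at the boundary.
  def correct_orbit_checked(value, unit):
      assert unit == "N*s", "expected newton-seconds, got " + unit
      return value

  correct_orbit_checked(raw / POUNDALS_PER_NEWTON, "N*s")   # fine
  # correct_orbit_checked(raw, "pdl*s")                     # would fail loudly, as it should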

Another possible unit mismatch is harder to verify. Seismologist Zhonghao Shou claimed a better private prediction (via observed geothermal activity) for an Alaskan earthquake that occurred on May 19, 2000. The Pasadena station disagreed with Shou’s calculations, a problem exacerbated by differences in the way earthquake databases are set up at different sites. Shou defended his results by showing that his data matched those of the Berkeley station. To which the Pasadena spokesdoctor replied: “Berkeley are wrong. Their data are always bigger than ours.” (See http://quake.exit.com/A000521.html.)

Digging deeper but briefly, there’s more to metrics than tape measures, weighing machines, and clocks. Not all topological spaces are strictly metrizable, and even when a space is proved metrizable, the metric may be unknown.3 Measuring a programmer’s real output or a program’s structural complexity, for example, oft stretches the mathematical meaning of metric. Counting the occurrences of the string metric in a brochure, however, seems to offer a sound metric for the product’s marketability.

Returning to our Texan tale, consider the related model whereby we measure astronomical distances in light-years: we take v as the constant (as far as we know) c, the velocity of light in vacuo, and then reckon t in the years it takes light to travel distance d. Amusingly (or not, depending on your level of seriousness), the light-year is the source of many a slip. In at least two sci-fi novels by well-versed scientists (cosmologist Fred Hoyle and mathematician Doug Whittaker) we find this spatial unit treated as a temporal one. Thus, we encounter people waiting a given number of light-years for something to happen!

Of course, the “year” is a pain to pin down. Even if we confine ourselves to terrestrial years, the subject of close scrutiny over the millennia, we end up with the arbitrary IAU (International Astronomical Union) choice of the Julian calendar year of 365.25 days, each day being taken as exactly 24 hours. This mishmash of ancient units gives us the “reference year” as exactly 31,557,600 seconds, hardly a fundamental constant in the mind of God or even Stephen Hawking. The value of c, the speed of light, is known with uncanny accuracy; indeed, since 1983 it has been fixed by definition at exactly 299,792,458 meters per second. This magically fundamental c is just a constant occurring in Maxwell’s equations that happens to have the dimension d/t (velocity). Eddington used to claim that its value could be calculated by a blind-from-birth physicist working in a coal cellar. Anyway, given c and the reference year, the standard light-year comes out as 9.460730472581 Pm (petameters, where peta means 10¹⁵).
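
The arithmetic, at least, is easy to check; a quick Python sketch using the defined value of c and the IAU reference year:

  c = 299_792_458                    # meters per second, exact by definition
  reference_year = 365.25 * 24 * 60 * 60
  print(int(reference_year))         # 31557600 seconds, as above

  light_year_m = c * reference_year
  print(light_year_m)                # 9460730472580800.0 meters
  print(light_year_m / 1e15)         # 9.4607304725808 petameters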

In fact, the SI unit of time is now based on a more precisely measurable natural phenomenon, the behavior of the cesium-133 atom. The second is defined as the duration of 9,192,631,770 periods of the radiation corresponding to a particular hyperfine transition of the cesium-133 atom. And, importantly, the meter no longer depends on the earth’s size or the length of a platinum-iridium bar stored in Paris, but is given as the distance light travels in a specified fraction of a second—that second being the aforementioned standard second. These refinements do not actually change the values of the second or the meter, but provide more accurate and stable comparisons. The SI unit of mass remains old-fashioned: a 117-year-old lump of platinum-iridium alloy called the IPK (International Prototype Kilogram). But not for much longer. Two new, more fundamental methods of defining the kilogram are emerging (see “Weighty Matters” by Ian Robinson, Scientific American, December 2006).

So, the meter is now patterned on the light-year model (distance traveled by light in a given time). In fact, this notion plays a role in all decent histories of computing: consider the amazing Grace Hopper’s famous nanosecond trick. The admirable U.S. Navy Commander (later Rear Admiral) and Cobol pioneer would hand out foot-long (12 inches) pieces of wire during lectures, explaining that each represented a light-nanosecond, the distance light travels in one billionth of a second. (Yes, I did meet the formidable lady while I was with Univac, but I never directly saw the wire trick.)
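
Her prop was only the gentlest of roundings, as a short Python check confirms:

  c = 299_792_458                 # meters per second
  light_ns = c * 1e-9             # distance light covers in one nanosecond
  print(light_ns)                 # 0.299792458 meters, about 30 centimeters
  print(light_ns / 0.0254)        # ~11.8 inches: near enough to Hopper's foot of wire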

The vagueness of the light-year’s derivation (what’s a year?) has led to a preference in astronomy for measuring distances in parsecs. The parsec is derived from the more natural (at least for astronomers) angular measure of an object’s parallax (its apparent shift against the background sky as the observer moves around). Thus, a star one parsec away has a parallax of one arc second (where second is, confusingly, an angular measure with strong etymological ties to time). The poor parsec is a mere 3.262 light-years; the more convenient unit is the familiar megaparsec.
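
The derivation fits on the back of an envelope: a parsec is the distance at which one astronomical unit subtends an angle of one arc second. A Python sketch, taking the standard value of the astronomical unit:

  import math

  AU = 149_597_870_700                   # astronomical unit in meters
  one_arcsec = math.radians(1 / 3600)    # one arc second in radians

  parsec_m = AU / math.tan(one_arcsec)   # small-angle division would do just as well
  light_year_m = 299_792_458 * 31_557_600

  print(parsec_m)                        # ~3.0857e16 meters
  print(parsec_m / light_year_m)         # ~3.2616 light-years, the figure quoted above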

It’s time to scrap the boring SI système with its hateful French Bonapartist influence and spelling. To rub salt in the wound, we propose a more historically colorful variation of the Imperial units. The FFF (furlong/firkin/fortnight) system replaces MKS with the following easy-to-remember table:

1 furlong = 1/8 mile (201.168 meters)

1 firkin = the mass of 9 Imperial gallons of water (40.91481 kilograms)

1 fortnight = 14 days of 24 hours (1,209,600 seconds)

The speed of light becomes a memorable 1.8 terafurlongs/fortnight.
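
The claim survives scrutiny; a quick Python check against the table above:

  c = 299_792_458                   # meters per second
  furlong = 201.168                 # meters
  fortnight = 14 * 24 * 60 * 60     # 1,209,600 seconds

  c_fff = c * fortnight / furlong
  print(c_fff)                      # ~1.8026e12 furlongs per fortnight
  print(c_fff / 1e12)               # ~1.80 terafurlongs/fortnight, as promised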

Elsewhere, there’s no end of fanciful units to distract us. Areas are given as so many “football fields,” while British heights are based on Lord Nelson’s Column in Trafalgar Square. The unit of beauty depends on Christopher Marlowe’s Helen, whose sweet face “launched a thousand ships.” The milliHelen is therefore the amount of beauty that can launch one ship. A recent elaboration is that ugliness can be expressed as negative Helen values, as when we say his or her face could sink a battleship. Less familiar might be the Finnish unit of computer-system support, the mikrotuki, coined at Nokia’s Helsinki labs. This relies on the verbal quirk that mikrotuki, meaning “microcomputer support,” could be read as one millionth of a tuki. Thus, a pikotuki is a thousandth of a nanotuki, etc.

But, saving the best for the last, I put forth the unit of effective cross-sectional area during nuclear collisions known as the barn, which is equal to 10⁻²⁸ square meters. It’s the only unit name I know of that was initially classified by the U.S. Department of Defense! My chemist friend Ken Ashcroft cited the barn as proof that nuclear physicists were endowed with a whimsical weirdness, to which I replied: The term barn does seem to have a touch of whimsy for such an apparently tiny area, but the quirk runs in the opposite direction. We happen to know the two guys who coined the unit name, where, when, and why. And the story has enormous significance in world history, for it spells out the birth of the atomic bomb and the development of nuclear energy—which is why the term was not declassified until 1948.

I claim that no physical unit has ever been so appropriately named as the barn! In fact, the barn represents an incredibly huge area in terms of other particle-physics units—so much larger than anticipated that, if anything, the barn can be rated as a deliberate understatement!

Purdue physicists Marshall Holloway and Charles Baker, the coiners of the term, reported their internal debates over dinner in the Purdue Memorial Union before settling for the barn4 to describe the effective cross-section area of the uranium nucleus during collisions. Earlier discarded options were the Oppenheimer, the Bethe, and the Manley, after well-known members of the Los Alamos group. The decision was made when one of them, looking at the number 10⁻²⁴ square centimeters, exclaimed that for nuclear processes that really was as “big as a barn!”

To stress the size of the barn (as in the insult, “He couldn’t hit a barn with a bleedin’ banjo”), note that the most common subunit in practice is the femtobarn (10⁻⁴³ square meters), which is unimaginably smaller! To get a femtobarn, just divide the barn by 1,000,000,000,000,000.
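
And for the scale of the thing, the conversions in one short Python sketch:

  barn = 1e-28                      # square meters, by definition
  print(barn * 1e4)                 # ~1e-24 square centimeters: the "big as a barn" figure
  femtobarn = barn * 1e-15
  print(femtobarn)                  # ~1e-43 square meters
  print(barn / femtobarn)           # ~1e15: the divisor spelled out above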

References

  1. Vladimir Nabokov’s favorite Vanessa atalanta. Whether Red Admirable is a playful version of Red Admiral or vice versa is uncertain.
  2. Gotcha! The rod, pole, and perch each represents the same, presumably important length, namely 16.5 feet, 5.5 yards, or 0.25 chains. The chain is a fundamental, universal constant—namely, the length of a standard cricket pitch. Cricket is a game formerly played in the UK but now confined to her colonies.
  3. http://mathworld.wolfram.com/MetrizableTopology.html. Incidentally, the overblown use of the term topology when describing simple network layouts is not strictly incorrect but still damned irritating to real topologists.
  4. Holloway, M., Baker, C. 1972. How the barn was born. Physics Today 25(7): 9. The barn is not the kind of area with which to measure your curtains. In fact, B = R/I, where I is the number of incident particles per unit time per unit area, and R is the number of reactions per unit time per nucleus. R/I thus has dimension t⁻¹/(t⁻¹·d⁻²) = d² (area), where t is time and d is distance.

STAN KELLY-BOOTLE (http://www.feniks.com/skb/; http://www.sarcheck.com), born in Liverpool, England, read pure mathematics at Cambridge in the 1950s before tackling the impurities of computer science on the pioneering EDSAC I. His many books include The Devil’s DP Dictionary (McGraw-Hill, 1981), Understanding Unix (Sybex, 1994), and the recent e-book Computer Language—The Stan Kelly-Bootle Reader (http://tinyurl.com/ab68). Software Development Magazine has named him as the first recipient of the new annual Stan Kelly-Bootle Eclectech Award for his “lifetime achievements in technology and letters.” Neither Nobel nor Turing achieved such prized eponymous recognition. Under his nom-de-folk, Stan Kelly, he has enjoyed a parallel career as a singer and songwriter.

 


Originally published in Queue vol. 5, no. 2