The Software Industry IS the Problem

The time has come for software liability laws.

by Poul-Henning Kamp | September 8, 2011

Topic: Privacy and Rights

One score and seven years ago, Ken Thompson brought forth a new problem, conceived by thinking, and dedicated to the proposition that those who trusted computers were in deep trouble.

I am, of course, talking about Thompson's Turing Award lecture, "Reflections on Trusting Trust."2 Unless you remember this piece by heart, you might want to take a moment to read it if at all possible.

The one sentence in Thompson's lecture that really, REALLY matters is: "You can't trust code that you did not totally create yourself."

This statement is not a matter of politics, opinion, taste, or in any other way a value judgment; it is a fundamental law of nature, which follows directly from pure mathematics in the general vicinity of the works of Turing and Gödel. If you doubt this, please (at your convenience) read Douglas Hofstadter's classic Gödel, Escher, Bach, and when you get to the part about "Mr. Crab's record player," substitute "Mr. Crab's laptop."


Gödel, Escher, Bach

That Hofstadter's book was originally published in 1979 does not in any way detract from Ken Thompson's fame, if, indeed, his lecture was inspired by it. But 1979 was a long time ago, and it is possible that not every reader knows of, much less has read, the book; my editor proposed that I summarize or quote from it to make things clearer for such readers.

Considering that Gödel, Escher, and Bach are all known for their intricate, multilayered works, and that Hofstadter's book is a well-mixed stew not only of their works but also of the works of Cantor, Church, Turing, and pretty much any other mathematician or philosopher you care to mention, I will not attempt a summary beyond: "It's a book about how we think."

The relevant aspect of the book here is Gödel's incompleteness theorem, which, broadly speaking, says that no finite mathematical system can resolve, definitively, the truth value of all possible mathematical conjectures expressible in that same mathematical system.
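
For readers who want the precise form, one standard statement of the first incompleteness theorem runs roughly as follows (the informal paraphrase above is all that the argument here needs): for any consistent, effectively axiomatizable formal system F that is strong enough to express elementary arithmetic, there is a sentence G_F in the language of F such that neither F ⊢ G_F nor F ⊢ ¬G_F holds; that is, F can neither prove nor refute G_F.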

In the book this is illustrated with a fable about Mr. Crab's "perfect record player," which, because it can play any and all sounds, can also play sounds that make it resonate and self-destroy—a vulnerability exploited on the carefully constructed records of Mr. Crab's adversary, Mr. Tortoise.

Mr. Crab tries to protect against this attack by preanalyzing records and rearranging the record player to avoid any vulnerable resonance frequencies, but Mr. Tortoise just crafts the sounds on his records to the resonance frequencies of the part of the record player responsible for the rearrangement. This leaves Mr. Crab no alternative but to restrict his record playing to only his own, preapproved records, thereby severely limiting the utility of his record player.

Malware-scanning programs try to classify executable code into "safe" and "unsafe," instead of mathematical conjectures into "true" and "false," but the situation and result are the same: there invariably is a third pile called "cannot decide either way," and whatever ends up in that pile is either a security or a productivity risk for the computer user.
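
To make the parallel concrete, here is a minimal sketch, in Python, of the standard diagonalization argument; every name in it is invented for illustration, and the "perfect" classifier is assumed into existence only so that a program can be built to contradict it:

    # Assume, for the sake of argument, a perfect and total classifier exists.
    def is_malicious(source: str) -> bool:
        """Hypothetical classifier: True for every harmful program,
        False for every harmless one.  No such total, correct function can exist."""
        raise NotImplementedError("stand-in for the assumed perfect scanner")

    def do_harm() -> None:
        """Stands in for whatever behavior the classifier is supposed to flag."""
        print("misbehaving...")

    def diagonal_program() -> None:
        # Read our own source code and ask the "perfect" classifier about it.
        with open(__file__, encoding="utf-8") as f:
            own_source = f.read()
        if is_malicious(own_source):
            pass          # verdict "unsafe", yet the program behaves: verdict wrong
        else:
            do_harm()     # verdict "safe", yet the program misbehaves: verdict wrong

    # Whatever the classifier answers about this program, the answer is falsified,
    # so no classifier can be both total and correct; some programs necessarily
    # land in the third pile, "cannot decide either way".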

Amusingly, malware scanners almost unfailingly classify malware-scanner programs, including themselves, as malware, and therefore contain explicit exemptions to suppress these "false" positives. These exemptions are of course exploitable by malware—which means that the classification of malware scanners as malware was correct to begin with. "Quis custodiet ipsos custodes?" (Who will guard the guards themselves?)


Back to Thompson

In 1984, the Thompson lecture evoked wry grins and minor sweating for Unix system administrators at universities, because those were the only places where computers were exposed to hostile users who were allowed to compile their own programs. Apart from sporadic and mostly humorous implementations, however, no apocalyptic horsemen materialized in the sky.

In recent years, there have been a number of documented instances where open source projects were broken into and their source code modified to add backdoors. As far as I am aware, none of these attacks so far has reached further than the lowest rung on Ken Thompson's attack ladder in the form of a hardcoded backdoor, clearly visible in the source code. Considering the value to criminals, however, it is only a matter of time before more advanced attacks, along the line Thompson laid out, will be attempted.
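
For readers unfamiliar with the distinction, that "lowest rung" looks roughly like the deliberately simplified Python sketch below (the function name and the magic string are invented). The point is that this kind of backdoor is at least visible to anyone who reads the source; the compiler-level attack Thompson described survives even a line-by-line source review, because the compromise is reinserted at build time.

    import hashlib
    import hmac

    def verify_password(password: str, salt: bytes, stored_hash: bytes) -> bool:
        # The planted backdoor: a hardcoded magic password accepted for any account.
        # Crude, but an auditor reading this source can spot it.
        if password == "letmein-backdoor":
            return True
        # The legitimate check: derive a key from the offered password and compare
        # it to the stored hash in constant time.
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                        salt, 100_000)
        return hmac.compare_digest(candidate, stored_hash)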

The security situation with commercial closed-source software is anyone's guess, but there is no reason to think—and no credible factual basis for a claim—that the situation is any different or any better than it is for open source projects.

The now-legendary Stuxnet malware incident has seriously raised the bar for just how sophisticated attacks can be. The idea that a widely deployed implementation of Java is compiled with a compromised compiler is perfectly reasonable. Outsourced software development does not make that scenario any less realistic, likely, or scary.


We Have to Do Something

We have to do something that actually works, as opposed to accepting a security circus in the form of virus or malware scanners and other mathematically proven insufficient and inefficient efforts. We are approaching the point where people and organizations are falling back to pen and paper for keeping important secrets, because they no longer trust their computers to keep them safe.


What Can We Do?

Ken Thompson's statement that "you can't trust code that you did not totally create yourself" points out a harsh and inescapable reality. Just as we don't expect people to build their own cars, mobile phones, or homes, we cannot expect secretaries to create their own text-processing programs, nor accountants to create their own accounting systems and spreadsheet software. In strict mathematical terms, you cannot trust a house you did not totally create yourself, but in reality most of us will trust a house built by a suitably skilled professional, usually more than the one we might have built ourselves, and even when we have never met the builder or the builder is long dead.

The reason for this trust is that shoddy construction has had negative consequences for builders for more than 3,700 years: "If a builder builds a house for someone, and does not construct it properly, and the house which he built falls in and kills its owner, then the builder shall be put to death." (Hammurabi's Code, approx. 1700 BC)

Today the operant legal concept is "product liability," and the fundamental formula is "if you make money selling something, you'd better do it properly, or you will be held responsible for the trouble it causes." I want to point out, however, that there are implementations of product liability other than those in force in the U.S. For example, if you burn yourself on hot coffee in Denmark, you burn yourself on hot coffee. You do not become a millionaire or necessitate signs pointing out that the coffee is hot.

Some say the only two products not covered by product liability today are religion and software. For software, that has to end; otherwise, we will never get a handle on the security madness unfolding before our eyes almost daily in increasingly dramatic headlines. The question is how to introduce product liability, because just imposing it would instantly shut down any and all software houses with even a hint of a risk-management function on their organizational charts.


A Software Liability Law

My straw-man proposal for a software liability law has three clauses:

Clause 0. Consult criminal code to see if any intentionally caused damage is already covered.

I am trying to impose a civil liability only for unintentionally caused damage, whether a result of sloppy coding, insufficient testing, cost cutting, incomplete documentation, or just plain incompetence. Intentionally inflicted damage is a criminal matter, and most countries already have laws on the books for this.

Clause 1. If you deliver software with complete and buildable source code and a license that allows disabling any functionality or code by the licensee, then your liability is limited to a refund.

This clause addresses how to avoid liability: license your users to inspect and chop off any and all bits of your software they do not trust or do not want to run, and make it practical for them to do so.

The word disabling is chosen very carefully. This clause grants no permission to change or modify how the program works, only to disable the parts of it that the licensee does not want. There is also no requirement that the licensee actually look at the source code, only that it was received.

All other copyrights are still yours to control, and your license can contain any language and restriction you care to include, leaving the situation unchanged with respect to hardware locking, confidentiality, secrets, software piracy, magic numbers, etc. Free and open source software is obviously covered by this clause, and it does not change its legal situation in any way.

Clause 2. In any other case, you are liable for whatever damage your software causes when used normally.

If you do not want to accept the information sharing in Clause 1, you would fall under Clause 2 and have to live with normal product liability, just as manufacturers of cars, blenders, chainsaws, and hot coffee do. How dire the consequences and what constitutes "used normally" are for the legislature and courts to decide.

An example: A salesperson from one of your longtime vendors visits and delivers new product documentation on a USB key. You plug the USB key into your computer and copy the files onto the computer. This is "used normally" and should never cause your computer to become part of a botnet, transmit your credit card number to Elbonia, or send all your design documents to the vendor.

The majority of today's commercial software would fall under Clause 2. To give software houses a reasonable chance to clean up their acts and/or to fall under Clause 1, a sunrise period would make sense, but it should be no longer than five years, as the laws would be aimed at solving a serious computer security problem.

And that is it, really. Software houses will deliver quality and back it up with product liability guarantees, or their customers will endeavor to protect themselves.


Would It Work?

There is little doubt that my proposal would increase software quality and computer security in the long run, which is exactly what the current situation calls for.

It is also pretty certain that there will be some short-term nasty surprises when badly written source code gets a wider audience. When that happens, it is important to remember that today the good guys have neither the technical nor the legal ability to know if they should even be worried, as the only people with source-code access are the software houses and the criminals.

The software houses would yell bloody murder if any legislator were to introduce a bill proposing these stipulations, and any pundits and lobbyists they could afford would spew their dire predictions that "this law will mean the end of computing as we all know it!"

To which my considered answer would be: "Yes, please! That was exactly the idea."


References

1. Hofstadter, D. 1999. Gödel, Escher, Bach. Basic Books.

2. Thompson, K. 1984. Reflections on trusting trust. Communications of the ACM 27 (8): 761-763; http://m.cacm.acm.org/magazines/1984/8/10471-reflections-on-trusting-trust/pdf.



Poul-Henning Kamp (phk@FreeBSD.org) has programmed computers for 26 years and is the inspiration behind bikeshed.org. His software has been widely adopted as "under the hood" building blocks in both open source and commercial products. His most recent project is the Varnish HTTP accelerator, which is used to speed up large Web sites such as Facebook.

© 2011 ACM 1542-7730/11/0900 $10.00

Originally published in Queue vol. 9, no. 9


Comments

  • James Phillips | Fri, 30 Sep 2011 21:29:22 UTC

    Revised version of my comment removing hyperlink, quoting relevant text, just over 2000 chars.
    
    After that voting machine hack a few years ago, making use of return-oriented programming, I came to the conclusion that the computer revolution has not happened yet. Currently with ACTA being signed in days, the computer industry is actually regressing: users are allowed less and less information about the hardware they are running. To ignore unproven, untrusted hardware is to ignore the elephant in the room.
    
    Quoting myself:
    I have an alternate vision of a "trusted computer": one that can be completely trusted and verified by software development houses. Things like keyboard controllers and card readers would have socketed ROM. The ROM would be burned using a ROM burner programmed using hard-wired toggle switches and other discrete components (including a CRC checksum to guard against data-entry error). The "known good", formally proven correct, code would be stored using analog means such as microfiche. All serious compilers would either be deterministic, or support a "--deterministic" switch to compare binaries compiled on different machines. The average person would not be expected to burn their own ROM chips, but they would expect the majority of their software and hardware to be formally proven correct. If an error is later found, the hardware manufacturer or software company would be sued: by a class-action lawsuit if necessary. The lawsuit would succeed because, if the software was formally proven correct, any errors must be deliberate sabotage.
    
    We are nowhere near that yet. The computer industry is still in its infancy. CPU time is seen as cheaper than programmer's time, so software bloat is tolerated. Even if you release a secure OS based on the L4 microkernel, you still have to trust all the hardware that goes into your machine. Modern computers are so complex that nobody knows them from top to bottom. This means it is impossible to secure a computer system without proving each abstraction layer implements its advertised interface. Any unpublished quirks can lead to abstraction leakage, which can lead to bugs, which can lead to security exploits.
    
  • Marcus Chen | Fri, 30 Sep 2011 23:00:28 UTC

    In one breath Mr. Kamp decries the immunity that software vendors have for their lack of liability and in the next he attempts to grab that same immunity for open source vendors.  Apparently he believes that a developer is free to compromise the quality as far as he likes and risk the data of the user in any way he likes, in short, to perpetuate the ills of the current software industry, so long as he provides them the source code.  And he believes this despite the full knowledge that the end-user is virtually guaranteed not to have the ability or be able to afford the expertise to evaluate said source code for hazards (let alone to modify the source code).  Now, he may perhaps be forgiven for this belief since we in the industry all know that a developer who releases their source code magically has conferred upon him/her the sublime wisdom of the gods and an iron-hard moral rectitude that puts the saints to shame, but I cannot help but imagine that legislators may be the teeniest, tiniest bit skeptical of this loophole.
    
    There is one point on which we agree: the time has indeed come for software liability.  I respectfully invite Mr. Kamp to join us when he's serious about it instead of using it as a cloak  to push his own agenda.
  • Gary Benner | Sat, 01 Oct 2011 01:58:23 UTC

    We have evolved into an environment where software is distributed when "good enough". This is a state where its use brings more benefits than downsides. This allows users to contribute to the final product by providing feedback and bug reports.
    
    This affects the end user cost of software. If any software development company was working under the liability terms discussed herein, then the cost of (proprietary) software would be astronomical, and the options limited.
    
    As developers we work in a specific layer, mostly application level, however the functioning of our application can be affected by updates / upgrades of compiled libraries, the OS, hardware drivers, and perhaps even the hardware and bios. 
    
    Trying to determine liability here is fraught with legal intricacies that would make a lawyer salivate with joy.
    
    It is not a simple world we live in ( software world that is ), and trying to overlay simplistic legal solutions will not work easily. End users require simple solutions, and that is perhaps better achieved in other ways, such as engaging service providers who can, using reasonable endeavours, manage their customers' use of, and difficulties experienced with, technology as a whole.
     
    
    
  • Poul-Henning Kamp | Sat, 01 Oct 2011 12:37:51 UTC

    @Marcus Chen (and others): First of all, you seem to think that this is about software sold in Best Buy to consumers. It is not. It is about all software sold, and therefore also about your Government buying databases or Alcometers. And no, I don't expect everybody to be able to read the source code, but I know the capitalistic society well enough to know that a new market for software reviews will spring into life, and that certification companies like UL, TUV and Norwegian Veritas would love to sell certifications for sensibly written software. We need either accountability or transparency; today we have neither.
  • Kirk | Sat, 15 Oct 2011 13:49:02 UTC

    "Your argument is totally bogus:  FOSS software comes with source code, so I can actually look at it and decide if it is competently written, and I have a license to modify it as I wish. "
    
    While you "can" look at the code and decide, the reality is you don't. There is no way I would believe that you have examined and understood the entirety of the OS that your computer is running. In fact, I have to revise again to say, "No you can't. There is simply too much there."
    
    The simple reality is that you don't like closed source software and have decided that not being content with simply not using such you believe that you should prevent me from doing so as well. Which also leads me to believe that the reality is that you really hope that such a law would force open software that you have not been smart enough to mimic. Which then also shows why the company should have the right to not show you the source.
    
    Your argument serves quite well at exposing your motives. Thank you.
  • tifkap | Sat, 15 Oct 2011 21:30:48 UTC

    Software liability is long overdue. The quality of most code is terrible, and the end of out-sourcing to the cheapest code-monkey is not yet in sight. Limited-liability clauses have no value if they are not legal.
    
    The only way to structurally change security is to let liability in, and let the market sort it out.
     
  • Art | Thu, 03 Nov 2011 20:18:37 UTC

    Here's a more painful way to improve software quality: Immediate disclosure of vulnerability details, including proof-of-concept/working exploits. The situation now is bad, but not quite bad enough for users (customers) to demand change.
    
    Also +1 for the comment about being liable for not having a rapid and competent response to vulnerabilities. That's a nice refinement.
    
    And recall that users (customers) share some (perhaps a lot of) responsibility, e.g. page 14 of the Microsoft SIR vol 11.
  • Stephan Wehner | Mon, 12 Dec 2011 21:22:13 UTC

    The "definition-problem" has been described in earlier comments: it's difficult to describe what is proper use of software, what is improper use, and which unwanted outcomes can in fact be blamed on the software and not the user; even more so when you look at how many different environments software runs under, by how many different people. 
    
    The definition of the functionality of many software packages would often be a huge and unwieldy document, no doubt having its own mistakes and therefore maintenance and update schedule.
    
    A very small step, and one that looks more doable, may be to require vendors to offer a guarantee for some aspect of the software, however small, that they choose themselves ("This software will not reboot your PC," "This software will not access your sound system"). Once some blurb is required, meaningful guarantees may be made in time. When the blurb looks weak, one knows that the product is immature.
    
    Oh well,
    
    Stephan
  • Paul E. Bennett | Sat, 24 Dec 2011 12:00:22 UTC

    As one who is entirely in the bespoke embedded systems realm, I know how much effort has to be invested in getting as near perfect a product as is possible to achieve. Many of the systems I have dealt with could, by their failure, lead to fatal outcomes. That they haven't in my 40+ years of being in the industry shows that the "good enough" on my part, and my colleagues', has held true. Some of the systems have run for 25 years or more without needing patches or updates of any kind (nature of the industries served).
    
    In the UK, the "Consumer Protection Act" has far-reaching teeth for the companies that are proven to be negligent in their practices. That negligence would extend to not using "Current Best Practice" or being "Cognizant of Current and Emerging Standards". What the article proposes is, therefore, already covered in the UK as far as embedded systems are concerned.
    
    Of course, the embedded systems people have good knowledge of the hardware platform, environment of deployment, and interface possibilities that would be incorporated in their remit. We will know our systems' relationship to standards (such as IEC 61508) and how well they measure up to the client requirements.
    
    I don't know many in the desktop software world who will have done a Safety or Security Risk Assessment for their applications before they begin their coding. So perhaps the legislation should demand that all software (other than Open Source) have some risk-of-failure assessment, in a standardised format, written into the guarantees. Open Source is only excluded from this by the nature that such is open to modification and adaptation by anyone who feels they are capable of doing so (whether they are or not).
    
  • Mathias Holmgren | Tue, 28 Feb 2012 15:10:56 UTC

    You are either a troll or insane.
    
    "I am trying to impose a civil liability only for unintentionally caused damage, whether a result of sloppy coding, insufficient testing, cost cutting, incomplete documentation, or just plain incompetence."
    
    there are so many false assumptions behind this statement, I can't even begin...
    
    bottom line, what would happen?
    
    - No customer would be willing to pay for the explosion of prices to cover hypothetical, unintended applications of software in situations the customer was not willing to pay to test for in the first place
    
    => no software would get built because nobody could afford it
    
    you are a complete fool