We work in an industry that prides itself on “changing the world,” one that chants a constant mantra of innovation and where new products could aptly be described as “this year’s breakthrough of the century.” While there are some genuine revolutions in the technology industry, including cellphones, GPS (Global Positioning System), quantum computing, encryption, and global access to content, the vast majority of new product introductions are evolutionary, not revolutionary. Real technical breakthroughs are few and far between. Most new products are just a recycling of an earlier idea.
The effort seems to go into refinement, not into changing the core paradigm. When I was in graduate school some three decades ago, I believed the world would have changed dramatically by now. Pure research in computer science was introducing new concepts at a breathtaking pace. Yet looking back on that time, things have not changed all that much. Most of the technologies I studied in graduate school are still widely used, and the way work gets accomplished is relatively unchanged. For the most part, we see the same recycled concepts presented over and over with shiny new paint jobs, touted as ground-breaking new solutions. It seems to me that the pace of innovation has actually slowed. As private investment poured into R&D (research and development), dwarfing government funding, the vast majority of intellectual property became privately held. As a result, we have seen less, not more, genuine innovation. Let me give you some examples.
A while back I met a guy who was launching a company to do corporate accounting as a service over the Web. He had made a lot of money during the course of his career and was confident that he could repeat his success in this new venture.
Why was he so confident? He had implemented the same basic business over and over again, changing only the technology platform to fit what was in vogue. By taking this approach, he had developed successful accounting businesses on minicomputers, workstations, PCs, and client/server systems, and at that time he was working to become an ASP (application service provider). He was confident that he would be successful because the benefits provided to the customer remained relatively unchanged. Only the technology platform and price point differed.
For those of us old enough to remember, the Internet is more than 30 years old. Yes, it has changed and evolved, but the basic idea of internetworking individual computers into a large WAN is old stuff. Many of the networking constructs in use today are 20 to 30 years old. I expected by now to be using a much richer set of control structures than what is currently termed “state-of-the-art.”
TCP/IP started out in the early 1980s and is relatively unchanged. FTP, which has been around since the mid-1970s, hasn’t changed much at all. Telnet, now mostly superseded by its encryption-enabled twin SSH (Secure Shell), is still in heavy use for character-based terminal emulation modeled on an obsolete product from a company that no longer exists (Digital Equipment Corporation’s VT100). Moreover, if you look at the basic architecture of a Web server, you will see a lot in common with the FTP and Telnet protocols, with the addition of graphics rendering and some added automation. Similar observations can be made about e-mail, with its POP and SMTP protocols, and about RPC (remote procedure call) as it evolved through NFS (Network File System), CORBA, and SOAs (service-oriented architectures). With a small set of exceptions, the basic control structures available for computer networking have changed little in the past 20 years.
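To make that resemblance concrete, here is a minimal sketch (my illustration, not part of the original column) of fetching a Web page by hand: an HTTP/1.0 request written straight onto a TCP socket. The host name is just a documentation placeholder, but the point stands on its own: the exchange is the same line-oriented, text-over-TCP conversation that FTP and Telnet clients were having decades earlier.

```python
# Illustrative sketch: HTTP is still a line-oriented text protocol over TCP,
# much like its FTP and Telnet ancestors. "example.com" is a placeholder host.
import socket

HOST = "example.com"

with socket.create_connection((HOST, 80)) as sock:
    # The request is plain ASCII text, one command line plus headers,
    # each terminated by CRLF -- structurally similar to typing commands
    # at an FTP or Telnet prompt.
    request = (
        f"GET / HTTP/1.0\r\n"
        f"Host: {HOST}\r\n"
        f"\r\n"
    )
    sock.sendall(request.encode("ascii"))

    # The response comes back as text headers followed by the body;
    # an HTTP/1.0 server closes the connection when it is done.
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.decode("latin-1").splitlines()[0])  # e.g. "HTTP/1.0 200 OK"
```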
In graduate school I studied the IBM 360/67 and CP-67, late-1960s technology that showed how a single physical machine could support multiple virtual machines. Now this technology has become red-hot as a way to compartmentalize a large number of application services on a single large server. On the other end of the spectrum, companies are borrowing the HPC (high-performance computing) architectures of supercomputers to link multiple generic servers via Gigabit Ethernet, dynamically creating clusters that support application services. These mature paradigms have been rediscovered as a result of the continual drive toward increased IT efficiency.
The decline of pure research in computer technology is to blame. With the majority of R&D undertaken for commercial purposes, overall risk is minimized while ROI (return on investment) is maximized. The commercial aim is to produce a solution that the customer can buy and integrate right now, not to rethink the entire approach. As industry continues to pick off low-hanging fruit, true innovation, which by definition is high risk, becomes increasingly rare. As a result, things start to stagnate, and people like me bemoan the lack of true innovation.
One example is the failure of the industry to leverage the increase in desktop computational resources over the past 25 years. While we have gone from a 4-MHz CPU with a 360-KB floppy disk to a 4-GHz CPU with 500-GB disks—a thousandfold increase in CPU and a millionfold increase in storage—the end user benefits remain relatively unchanged. People still struggle to debug spreadsheet formulas.
The massive accrual of business-method and algorithm patents has further exacerbated the problem. Large patent portfolios, a strategy pioneered by IBM, have become competitive weapons and profit centers, fencing off vast tracts of intellectual property from other would-be innovators. Large companies can afford to enter into cross-license agreements with each other, sharing patent portfolios. These tactics, however, lock out innovation by smaller companies and research institutions that have no portfolios to trade or cannot afford the license fees.
What to do? To grow the innovation pie, industry participants have to recognize two fundamental facts. First, they must manage R&D budgets as investment portfolios, with the majority of projects in a low-risk, moderate-return category and a select set of projects in a higher-risk, higher-return category (see the sketch below). If companies recognize that it is OK to fail sometimes, they can reap huge rewards when a high-risk venture succeeds. Second, something has to happen to break the stifling effect of large intellectual property portfolios. Either patent law must be reformed or companies should consider donating patents to the public domain to increase innovation across the board. Both of these ideas are being explored.
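The portfolio arithmetic is easy to see with a back-of-the-envelope sketch. The allocations, success probabilities, and payoff multiples below are purely hypothetical numbers of my own choosing, not figures from any study, but they show the shape of the argument: even if nine out of ten high-risk bets fail, the occasional large payoff can still lift the return of the whole R&D portfolio.

```python
# Hypothetical illustration of the R&D-as-portfolio argument.
# All numbers are invented for the example, not real data.

def expected_return(allocation, success_prob, payoff_multiple):
    """Expected payoff (as a multiple of total budget) from one project category."""
    return allocation * success_prob * payoff_multiple

# Most of the budget in low-risk, moderate-return work...
low_risk = expected_return(allocation=0.85, success_prob=0.90, payoff_multiple=1.5)
# ...and a small slice in high-risk bets that usually fail but pay off big.
high_risk = expected_return(allocation=0.15, success_prob=0.10, payoff_multiple=20.0)

print(f"Low-risk slice:  {low_risk:.2f}x of budget")
print(f"High-risk slice: {high_risk:.2f}x of budget")
print(f"Portfolio total: {low_risk + high_risk:.2f}x of budget")
```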
I hope something happens soon. I hate discussing start-ups and business initiatives with my peers like we are pitching a new movie in LA—a small twist on yet another sequel.
MACHE CREEGER (email@example.com) is a 30-year technology industry veteran based in Silicon Valley. He is the principal of Emergent Technology Associates, business development consultants to technology companies worldwide.
Originally published in Queue vol. 4, no. 3.