We're a venture capitalist and a communications researcher, and we come bearing bad news: optical computers and all-optical networks aren't going to happen anytime soon. All those well-intentioned stories about computers operating at the speed of light, computers that would free us from Internet delays and relieve us from the tyranny of slow and hot electronic devices were, alas, overoptimistic. We won't be computing or routing at the speed of light anytime soon. (In truth, we probably should have told you this about two years ago, but we only recently met, compared notes, and realized our experiences were consistent.)
You see, building an optical computer or router entails a critical step called optical regeneration, which nobody knows how to do. After at least two decades of research and well over a billion dollars of venture-capital spending on promising potential breakthroughs, it is pretty clear that we've tried all the obvious ways and a fair number of the nonobvious ways to do optical regeneration. It appears that solving the problem is not a matter of top-class engineering; rather, it's beginning to look like Nobel-Prize-winning physics, the sort of breakthrough, like high-temperature superconductivity, in which ingenuity leads to a completely new approach. Those kinds of results are rare and unpredictable, and thus all signs suggest optical computing is an innovation that will likely have to wait a generation or two or three to come to fruition.
The past two decades have seen a profusion of optical devices that can serve as memories, comparators, and similar bits of logic that we would need to build an optical computer. Much like traditional silicon logic, this optical logic suffers from signal loss—that is, in the process of doing the operation or computation, some number of decibels is lost. In optics, the loss is substantial—an optical signal can traverse only a few circuits before it must be amplified.
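The arithmetic behind "only a few circuits" is a simple power budget. The sketch below uses entirely hypothetical numbers (launch power, per-stage loss, and receiver sensitivity are our own illustrative values, not measurements from any real device) just to show how quickly decibel losses accumulate:

```python
# Illustrative optical power budget: how many lossy logic stages a
# signal can traverse before it falls below a receiver's sensitivity
# floor. All figures here are hypothetical, chosen only to show the
# arithmetic; real losses vary widely by device.

launch_power_dbm = 0.0     # hypothetical launch power (0 dBm = 1 mW)
loss_per_stage_db = 3.0    # hypothetical insertion loss per optical circuit
sensitivity_dbm = -20.0    # hypothetical minimum detectable power

stages = 0
power = launch_power_dbm
while power - loss_per_stage_db >= sensitivity_dbm:
    power -= loss_per_stage_db
    stages += 1

print(stages)  # -> 6: a handful of stages, then the signal needs help
```

With a 20-dB budget and 3 dB lost per stage, the signal survives only six operations. That is the gap amplification (and eventually regeneration) has to fill.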
We know how to optically amplify a signal. Indeed, optical amplification is one of the great innovations of the past 20 years and has tremendously increased the distances over which we can send an optical signal.
Unfortunately, amplifying a signal adds noise. After a few amplifications, we need to regenerate the signal: we need a device that receives a noisy signal and emits a crisp clean signal. Currently the only way to build regenerators is to build an OEO (optical-electronic-optical) device: the inbound signal is translated from the optical domain into a digitized sample; the electronic component removes the noise from the digitized sample and then uses the cleaned-up digitized sample to drive a laser that emits a clean signal in the optical domain. OEO regenerators work just fine, but they slow us down by forcing us to work at the speed of electronics.
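The electronic middle of an OEO regenerator is conceptually simple, which is part of what makes the all-optical version so frustrating. This toy sketch (function name, threshold, and signal levels are our own illustrative choices) shows the core idea: decide each noisy sample against a threshold, then re-emit crisp levels to drive the outgoing laser:

```python
# Toy model of the electronic stage of an OEO regenerator.
# The optical receiver has already turned light into noisy samples;
# we threshold each sample to recover the bit, then synthesize the
# clean levels that would drive the outgoing laser. Purely illustrative.

def regenerate(noisy_samples, threshold=0.5):
    bits = [1 if s > threshold else 0 for s in noisy_samples]  # decide each bit
    return [1.0 if b else 0.0 for b in bits]                   # re-emit clean levels

print(regenerate([0.9, 0.2, 0.65, 0.05]))  # -> [1.0, 0.0, 1.0, 0.0]
```

The decision step is trivial in electronics and, so far, intractable to do entirely in the optical domain; that single conversion is what pins the whole system to electronic speeds.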
There has been no shortage of attempts to create all-optical regenerators. Many approaches have shown some promise in the laboratory, but ultimately, to date, all have failed the transition from promise to product. As we observed in the introduction, all the likely—and many unlikely—approaches have been tried.
So, if we are going to assemble the wonderful optical circuits into an optical computer, right now and for the foreseeable future, for every handful of circuits we will need a regenerator, and the only regenerators we have require slow electronics. Oops!
Compounding the frustration is that the photonic logic community has recently achieved breakthroughs in PICs (photonic integrated circuits). Until recently, each optical circuit was its own chip (much as we relied on individual transistors in electronic devices in the 1960s), but now we can lay out densely packed optical chips. We can envision replacing electronic chips with optical chips—but the optical chips won't run any faster because every few circuits, inside the chips, we'll have to do electrical regeneration.
Similar problems crop up in building all-optical networks. Any such network needs optical switches to direct the data, and the optical logic in those switches has the same problem as optical computers: every few circuits you need OEO regeneration.
Many people had been anticipating a future of all-optical computers connected via all-optical networks, a nirvana of high performance combined with low error rates and lower power consumption and heat dissipation. We're sorry to be naysayers, but you can stop holding your breath waiting for this to happen.
All the same, one should not lose heart. There are plenty of opportunities to exploit the wonderful characteristics of optics in a hybrid electronic-optical world.
First, PICs have unleashed a tremendous surge in innovation. In 2005, the optical research community wrote a report for the National Science Foundation on research problems for the next five years and next 10 years. Three years later, some of those research problems are solved and in products! As a result, the amount of data we can push through an individual fiber is increasing sharply. We're also able to manage that capacity with increasing sophistication. These results are probably only the low-hanging fruit of what PICs have enabled, and we're likely to see more innovation in coming years. If your biggest concern is getting lots of bandwidth with low error rates, the future looks very good indeed.
Second, optical logic continues to develop new capabilities. We offer two examples to show the range of work. At Harvard a few years ago, researchers were able to slow and then stop (hold stationary) a pulse of light. The immediately visible opportunities are for better optical memories and to manage data rates inside a device. More opportunities will no doubt appear. A much more concrete effort is the DARPA-funded OAWG (Optical Arbitrary Waveform Generation) program. OAWG seeks to build radically improved optical transceivers, capable of producing optical pulses that are more coherent and have less noise. These transceivers would allow us to pack more optical channels into a fiber, because we would need smaller gaps between channel frequencies to protect ourselves from cross-channel noise.
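The payoff from cleaner transceivers is easy to quantify. Taking the C-band as roughly 4,400 GHz of usable spectrum (an approximate figure we supply for illustration) and the standard ITU grid spacings, halving the channel spacing doubles the channel count:

```python
# Why narrower channel spacing matters: a rough count of WDM channels
# that fit in the C-band at a few ITU grid spacings. The 4,400-GHz band
# width is an approximation supplied for illustration.

usable_band_ghz = 4400  # approximate usable width of the C-band

for spacing_ghz in (100, 50, 25):
    channels = usable_band_ghz // spacing_ghz
    print(spacing_ghz, "GHz spacing ->", channels, "channels")
# 100 GHz spacing -> 44 channels
# 50 GHz spacing -> 88 channels
# 25 GHz spacing -> 176 channels
```

Transceivers with less noise and better coherence are what make the tighter spacings usable in practice, since adjacent channels bleed into one another less.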
In summary, the future of optical technology is bright. It just isn't taking the path to the future that many of us imagined or hoped for.
CHARLES BEELER is a venture-capital investor with El Dorado Ventures, where his primary focus is on companies with the potential to radically change the capabilities, cost/performance, and energy efficiency of data centers and enterprise computing environments. Beeler has also served as a partner at Piper Jaffray Ventures, helping to manage technology funds, and at Scripps Ventures. He received a bachelor's degree in economics from Colby College and an MBA from the University of Pennsylvania's Wharton School.
DR. CRAIG PARTRIDGE is a former chair of ACM SIGCOMM and, long ago, was editor-in-chief of ACM Computer Communication Review. An ACM Fellow, he is chief scientist for networking research at BBN Technologies.
© 2009 ACM 1542-7730 /09/0200 $5.00
Originally published in Queue vol. 7, no. 3.