Virtualization

Vol. 6 No. 1 – January/February 2008

Network Virtualization: Breaking the Performance Barrier:
Shared I/O in virtualization platforms has come a long way, but performance concerns remain.

The recent resurgence in popularity of virtualization has led to its use in a growing number of contexts, many of which require high-performance networking. Consider server consolidation, for example. The efficiency of network virtualization directly impacts the number of network servers that can effectively be consolidated onto a single physical machine. Unfortunately, modern network virtualization techniques incur significant overhead, which limits the achievable network performance. We need new network virtualization techniques to realize the full benefits of virtualization in network-intensive domains.

by Scot Rixner
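
Rixner's point is easier to appreciate with a concrete picture of where the overhead lives. In the shared-ring, split-driver model that paravirtualizing hypervisors such as Xen popularized, the guest posts packet descriptors to a ring shared with a back-end driver and then issues a notification that traps out of the guest. The C sketch below is illustrative only: the names (tx_ring, tx_desc, hypercall_notify) are hypothetical, not any real hypervisor's API. The point it makes is that publishing a descriptor is cheap, while the notification is a world switch, which is why per-packet notifications dominate the cost and batching them matters so much for throughput.

    /* A minimal sketch of the shared-ring, split-driver I/O model used by
     * paravirtualizing hypervisors. All names here are illustrative, not
     * the API of Xen or any other real system. */
    #include <stdint.h>

    #define RING_SIZE 256               /* power of two for cheap masking */

    struct tx_desc {
        uint64_t guest_frame;           /* guest-physical page holding the packet */
        uint32_t offset;                /* packet offset within that page */
        uint32_t len;                   /* packet length in bytes */
    };

    struct tx_ring {
        volatile uint32_t prod;         /* advanced by the guest (producer) */
        volatile uint32_t cons;         /* advanced by the back-end (consumer) */
        struct tx_desc desc[RING_SIZE];
    };

    /* Stub: in a real system this traps into the hypervisor to wake the
     * back-end driver, and that world switch is a large part of the cost. */
    static void hypercall_notify(int event_channel) { (void)event_channel; }

    /* Guest side: publish one packet descriptor, then notify the back-end.
     * Returns 0 on success, -1 if the ring is full. */
    static int guest_send(struct tx_ring *r, int evtchn,
                          uint64_t frame, uint32_t off, uint32_t len)
    {
        uint32_t p = r->prod;
        if (p - r->cons == RING_SIZE)   /* back-end has fallen behind */
            return -1;
        r->desc[p & (RING_SIZE - 1)] = (struct tx_desc){ frame, off, len };
        __sync_synchronize();           /* descriptor visible before index bump */
        r->prod = p + 1;
        hypercall_notify(evtchn);       /* the expensive step */
        return 0;
    }

    int main(void)
    {
        static struct tx_ring ring;
        return guest_send(&ring, /*evtchn=*/1, 0x1000, 0, 64) ? 1 : 0;
    }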

A Conversation with Jason Hoffman:
A systems scientist looks at virtualization, scalability, and Ruby on Rails.

Jason Hoffman has a Ph.D. in molecular pathology, but to him the transition between the biological sciences and his current role as CTO of Joyent was completely natural: "Fundamentally, what I’ve always been is a systems scientist, meaning that whether I was studying metabolism or diseases of metabolism or cancer or computer systems or anything else, a system is a system," says Hoffman. He draws on this broad systems background in the work he does at Joyent providing scalable infrastructure for Web applications. Joyent’s cloud-computing infrastructure has become the foundation for many of the increasingly popular applications developed to feed into the social-networking site Facebook.com.

Beyond Server Consolidation:
Server consolidation helps companies improve resource utilization, but virtualization can help in other ways, too.

Virtualization technology was developed in the late 1960s to make more efficient use of hardware. Hardware was expensive, and not much of it was available; processing was largely outsourced to the few places that did have computers. On a single IBM System/360, one could run several environments in parallel that maintained full isolation and gave each customer the illusion of owning the hardware. Virtualization was time sharing implemented at a coarse-grained level, and isolation was the key achievement of the technology. It also made it possible to manage resources efficiently, as they could be assigned to virtual machines such that deadlines were met and a certain quality of service could be achieved.

by Werner Vogels

How OSGi Changed My Life:
The promises of the Lego hypothesis have yet to materialize fully, but they remain a goal worth pursuing.

In the early 1980s I discovered OOP (object-oriented programming) and fell in love with it, head over heels. As usual, this kind of love meant convincing management to invest in this new technology and, most important of all, to send me to cool conferences. So I pitched the technology to my manager. I sketched for him the rosy future, how one day we would create applications from ready-made classes. We would get those classes from a repository, put them together, and voilà, a new application would be born. Today we take objects more or less for granted, but if I am honest, the pitch I gave to my manager in 1985 never really materialized. The reuse of objects never achieved the levels foreseen by people such as Brad Cox with his software-IC model, and many others, including myself. Still, this Lego hypothesis remains a grail worth pursuing.

by Peter Kriens

Meet the Virts:
Virtualization technology isn’t new, but it has matured a lot over the past 30 years.

When you dig into the details of supposedly overnight success stories, you frequently discover that they’ve actually been years in the making. Virtualization has been around for more than 30 years, since the days when some of you were feeding stacks of punch cards into very physical machines, yet in 2007 it tipped. VMware was the IPO sensation of the year; in November 2007 no fewer than four major operating system vendors (Microsoft, Oracle, Red Hat, and Sun) announced significant new virtualization capabilities; and among fashionable technologists it seems virtual has become the new black.

by Tom Killalea

Poisonous Programmers:
A koder with attitude, KV answers your questions. Miss Manners he ain’t.

Dear KV, I hope you don’t mind if I ask you about a non-work-related problem, though I guess if you do mind you just won’t answer. I work on an open source project when I have the time, and we have some annoying nontechnical problems. The problems are really people, and I think you know the ones I mean: people who constantly fight with other members of the project over what seem to be the most trivial points, or who contribute very little to the project but seem to require a huge amount of help for their particular needs. I find myself thinking it would be nice if such people just went away, but I don’t think starting a flame war on our mailing lists over these things would really help. Any thoughts on this nontechnical problem?

by George Neville-Neil

The Cost of Virtualization:
Software developers need to be aware of the compromises they face when using virtualization technology.

Virtualization can be implemented in many different ways. It can be done with or without hardware support. The guest operating system can be modified in preparation for virtualization, or it can be expected to run unchanged. Regardless, software developers must strive to meet the three goals of virtualization spelled out by Gerald Popek and Robert Goldberg: fidelity, performance, and safety.

by Ulrich Drepper
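
For readers who haven't met Popek and Goldberg's criteria before, the classic trap-and-emulate structure makes them concrete: guest code runs natively until it executes a sensitive instruction, which traps to the VMM for emulation. The C sketch below is a toy, assuming hypothetical exit codes and a stubbed run_guest() rather than any real VMM interface; each of the three goals shows up as a property of this loop.

    /* A sketch of the trap-and-emulate loop behind Popek and Goldberg's
     * model. The exit codes and run_guest() are hypothetical; real VMMs
     * (and hardware extensions such as Intel VT-x) are far more involved. */
    enum vmexit { EXIT_CPUID, EXIT_IO, EXIT_HLT, EXIT_SHUTDOWN };

    struct vcpu {
        unsigned long regs[16];         /* guest register file */
        int online;
    };

    /* Stub so the sketch compiles; a real VMM would enter guest mode here
     * and return only when the guest executes a sensitive instruction. */
    static enum vmexit run_guest(struct vcpu *v) { (void)v; return EXIT_SHUTDOWN; }

    static void vmm_loop(struct vcpu *v)
    {
        v->online = 1;
        while (v->online) {
            switch (run_guest(v)) {     /* performance: guest code runs natively */
            case EXIT_CPUID:            /* fidelity: return what real hardware would */
                v->regs[0] = 0;         /* illustrative emulation only */
                break;
            case EXIT_IO:               /* safety: device access is mediated and
                                           validated before touching hardware */
                break;
            case EXIT_HLT:              /* idle guest: yield the physical CPU */
                break;
            case EXIT_SHUTDOWN:
                v->online = 0;
                break;
            }
        }
    }

    int main(void)
    {
        struct vcpu v = { {0}, 0 };
        vmm_loop(&v);                   /* with the stub, exits after one "run" */
        return 0;
    }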

All Things Being Equal?:
New year, another perspective

By the time these belles-lettres reach you, a brand-new year will be upon us. Another Year! Another Mighty Blow! as Tennyson thundered. Or as Humphrey Lyttelton (q.v.) might say, "The odious odometer of Time has clicked up another ratchette of entropic torture." Less fancifully, as well as trying hard not to write 2007 on our checks, many of us will take the opportunity to reflect on all the daft things we did last year and resolve not to do them no more. Not to mention all the nice things we failed to do. I have in mind the times when I missed an essential semicolon, balanced by the occasions when inserting a spurious one was equally calamitous. Surely any half-decent computer language should know where my statements are meant to terminate, and then properly redistribute the punctuation provided? The smarter Lisps became good at DWIM (do what I mean), balancing those damned, spurious parentheses. But I digress, having planted a topic known to incite reader feedback.

by Stan Kelly-Bootle

The Ever-Expanding Ecosystem for Embedded Computing:
Mike Vizard from ACM Queue talks with Oracle’s Mike Olson about the changing architecture of network-enabled applications. Olson explains the thinking behind the company’s new focus on embedded database and middleware technology. He explores the technical, business, and economic forces shaping this fast-growing market. Tune in to learn how Oracle plans to serve customers way outside the enterprise.
