Emulators

Vol. 8 No. 4 – April 2010

Avoiding Obsolescence

Overspecialization can be the kiss of death for sysadmins.

Dear KV,

What is the biggest threat to systems administrators? Not the technical threat (security, outages, etc.), but the biggest threat to systems administrators as a profession?

by George Neville-Neil

Articles

Simplicity Betrayed

Emulating a video system shows how even a simple interface can be more complex—and capable—than it appears.

George Phillips


An emulator is a program that runs programs built for a computer architecture different from that of the host platform supporting the emulator. Approaches differ, but most emulators simulate the original hardware in some way. At a minimum the emulator interprets the original CPU instructions and provides simulated hardware-level devices for input and output. For example, keyboard input is taken from the host platform and translated into the original hardware format, so the emulated program "sees" the same sequence of keystrokes. Conversely, the emulator translates the original hardware screen format into an equivalent form on the host machine.

An emulator is similar to a program that implements the JVM (Java Virtual Machine). The difference is merely one of degree. The JVM is designed to enable efficient and tractable implementations, whereas an emulator's machine is defined by real hardware that generally imposes undesirable constraints on the emulator. Most significantly, the original hardware may be fully described only in terms of how existing software uses it. The JVM tends to be forward-looking, with the expectation that new code will be written and run under it to increase its portability. Emulators tend to be backward-looking, expecting only to make old code more portable.
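
The interpretation loop described above can be sketched in a few lines. The Python below is only an illustration under invented assumptions: the one-byte opcodes (LOAD, ADD, STORE, HALT) and the single accumulator register are made up for this example and are not the hardware discussed in the article, which would also have to model the video system, keyboard, and timing.

```python
# A minimal fetch-decode-execute loop, the core of most emulators.
# The instruction set here is invented for illustration only.

def run(memory, max_steps=1000):
    """Interpret a tiny, made-up instruction set stored in `memory`."""
    pc = 0   # program counter into emulated memory
    acc = 0  # single accumulator register
    for _ in range(max_steps):
        opcode = memory[pc]
        if opcode == 0x00:            # HALT
            break
        elif opcode == 0x01:          # LOAD imm: acc = next byte
            acc = memory[pc + 1]
            pc += 2
        elif opcode == 0x02:          # ADD imm: acc = (acc + next byte) mod 256
            acc = (acc + memory[pc + 1]) & 0xFF
            pc += 2
        elif opcode == 0x03:          # STORE addr: memory[next byte] = acc
            memory[memory[pc + 1]] = acc
            pc += 2
        else:
            raise ValueError(f"unknown opcode {opcode:#04x} at {pc:#04x}")
    return acc, memory

# LOAD 5; ADD 7; STORE 0x10; HALT  ->  memory[0x10] holds 12 afterwards
program = [0x01, 5, 0x02, 7, 0x03, 0x10, 0x00] + [0] * 32
acc, mem = run(program)
print(acc, mem[0x10])  # 12 12
```

The fetch-decode-execute shape is the same for a full instruction set; the article's point is that faithfully reproducing the surrounding hardware is where the real complexity lives.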

by George Phillips

Why Cloud Computing Will Never Be Free

The competition among cloud providers may drive prices downward, but at what cost?


Dave Durkee, ENKI


The last time the IT industry delivered outsourced shared-resource computing to the enterprise was with timesharing in the 1980s, when it evolved to a high art, delivering the reliability, performance, and service the enterprise demanded. Today, cloud computing is poised to address the needs of the same market, based on a revolution of new technologies, significant unused computing capacity in corporate data centers, and the development of a highly capable Internet data communications infrastructure. The economies of scale of delivering computing from a centralized, shared infrastructure have set the expectation among customers that cloud-computing costs will be significantly lower than those incurred from providing their own computing. Together with the reduced deployment costs of open source software and the perfect competition characteristics of remote computing, these expectations set the stage for fierce pressure on cloud providers to continuously lower prices.

This pricing pressure results in a commoditization of cloud services that deemphasizes enterprise requirements such as guaranteed levels of performance, uptime, and vendor responsiveness, much as has been the case with the Web-hosting industry. Notwithstanding, it is the expectation of enterprise management that operating expenses be reduced through the use of cloud computing to replace new and existing IT infrastructure. This difference between expectation and what the industry can deliver at today's near-zero price points represents a challenge, both technical and organizational, that will have to be overcome to ensure large-scale adoption of cloud computing by the enterprise.

by Dave Durkee

Principles of Robust Timing over the Internet

The key to synchronizing clocks over networks is taming delay variability.


Julien Ridoux and Darryl Veitch, University of Melbourne


Everyone, and most everything, needs a clock, and computers are no exception. Clocks tend to drift off if left to themselves, however, so it is necessary to bring them to heel periodically through synchronizing to some other reference clock of higher accuracy. An inexpensive and convenient way to do this is over a computer network.

Since the early days of the Internet, a system collectively known as NTP (Network Time Protocol) has been used to allow client computers, such as PCs, to connect to other computers (NTP servers) that have high-quality clocks installed in them. Through an exchange of packet timestamps transported in NTP-formatted packets over the network, the PC can use the server clock to correct its own. As the NTP clock software, in particular the ntpd daemon, comes packaged with all major computer operating systems, including Mac OS, Windows, and Linux, it is a remarkably successful technology with a user base on the order of the global computer population.
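
The timestamp exchange described above yields four values per request/response round trip, from which the client can estimate its clock offset. The sketch below shows the standard NTP-style offset and round-trip delay calculation; the numeric timestamps are invented for illustration, and the offset estimate is unbiased only if the outbound and return network delays are equal, which is precisely the delay-variability issue the article examines.

```python
# Standard NTP-style offset/delay estimate from one request/response exchange.
# Timestamp values below are invented for illustration.

def ntp_offset_and_delay(t1, t2, t3, t4):
    """
    t1: client clock when the request leaves the client
    t2: server clock when the request arrives at the server
    t3: server clock when the reply leaves the server
    t4: client clock when the reply arrives back at the client

    Returns (offset, round_trip_delay) in the same units as the inputs.
    The offset formula assumes the outbound and return network delays
    are equal; variable, asymmetric delay is what makes this hard.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Hypothetical timestamps in seconds: the client clock runs ~50 ms behind.
offset, delay = ntp_offset_and_delay(t1=100.000, t2=100.060, t3=100.061, t4=100.021)
print(offset, delay)  # ~0.050 s offset, ~0.020 s round-trip delay
```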

by Julien Ridoux, Darryl Veitch

A Tour through the Visualization Zoo

A survey of powerful visualization techniques, from the obvious to the obscure

Thanks to advances in sensing, networking, and data management, our society is producing digital information at an astonishing rate. According to one estimate, in 2010 alone we will generate 1,200 exabytes, or 60 million times the content of the Library of Congress. Within this deluge of data lies a wealth of valuable information on how we conduct our businesses, governments, and personal lives. To put the information to good use, we must find ways to explore, relate, and communicate the data meaningfully.

by Jeffrey Heer, Michael Bostock, Vadim Ogievetsky