Vol. 8 No. 4 – April 2010

Emulators

Simplicity Betrayed:
Emulating a video system shows how even a simple interface can be more complex—and capable—than it appears.

An emulator is a program that runs programs built for a computer architecture different from that of the host platform on which the emulator runs. Approaches differ, but most emulators simulate the original hardware in some way. At a minimum, the emulator interprets the original CPU instructions and provides simulated hardware-level devices for input and output. For example, keyboard input is taken from the host platform and translated into the original hardware format, resulting in the emulated program "seeing" the same sequence of keystrokes. Conversely, the emulator will translate the original hardware screen format into an equivalent form on the host machine.
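
To make that idea concrete, here is a minimal sketch in C of such an interpreter loop for a hypothetical 8-bit machine; the two opcodes and the memory-mapped keyboard and video addresses are invented for illustration and are not those of the system discussed in the article.

/* A minimal emulator core for a hypothetical 8-bit machine.  The opcodes
   and the memory-mapped keyboard/video addresses below are invented for
   illustration; they are not those of any real system. */
#include <stdint.h>
#include <stdio.h>

#define MEM_SIZE   0x10000
#define KEY_PORT   0xF000    /* hypothetical memory-mapped keyboard register */
#define VIDEO_BASE 0xF800    /* hypothetical start of memory-mapped video RAM */

static uint8_t  mem[MEM_SIZE];
static uint16_t pc;          /* program counter of the emulated CPU */
static uint8_t  acc;         /* accumulator of the emulated CPU */

/* Input translation: a host keystroke becomes a byte the emulated
   program reads from its keyboard port. */
static uint8_t read_key(void) {
    int c = getchar();
    return (c == EOF) ? 0 : (uint8_t)c;
}

static uint8_t load(uint16_t addr) {
    if (addr == KEY_PORT)        /* reads of the key port come from the host */
        return read_key();
    return mem[addr];
}

/* Output translation: writes to video RAM are echoed to the host display. */
static void store(uint16_t addr, uint8_t val) {
    mem[addr] = val;
    if (addr >= VIDEO_BASE)
        putchar(val);
}

/* Interpret the original CPU instructions one at a time. */
static void run(void) {
    for (;;) {
        uint8_t  op = mem[pc++];
        uint16_t addr;
        switch (op) {
        case 0x01:               /* LOAD addr:  acc = mem[addr] */
            addr = mem[pc] | (mem[pc + 1] << 8);  pc += 2;
            acc = load(addr);
            break;
        case 0x02:               /* STORE addr: mem[addr] = acc */
            addr = mem[pc] | (mem[pc + 1] << 8);  pc += 2;
            store(addr, acc);
            break;
        default:                 /* treat anything else as HALT */
            return;
        }
    }
}

int main(void) {
    /* Tiny test program: read one key, write it to video RAM, halt. */
    const uint8_t program[] = { 0x01, 0x00, 0xF0,   /* LOAD  KEY_PORT   */
                                0x02, 0x00, 0xF8,   /* STORE VIDEO_BASE */
                                0x00 };             /* HALT             */
    for (unsigned i = 0; i < sizeof program; i++) mem[i] = program[i];
    pc = 0;
    run();
    return 0;
}

The shape of the loop is the point: the original CPU is interpreted one instruction at a time, and loads and stores to special addresses are where host input and output are translated to and from the emulated hardware's format.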

by George Phillips

Avoiding Obsolescence:
Overspecialization can be the kiss of death for sysadmins.

Dear KV, What is the biggest threat to systems administrators? Not the technical threat (security, outages, etc.), but the biggest threat to systems administrators as a profession?

by George Neville-Neil

Principles of Robust Timing over the Internet:
The key to synchronizing clocks over networks is taming delay variability.

Everyone, and most everything, needs a clock, and computers are no exception. Clocks tend to drift off if left to themselves, however, so it is necessary to bring them to heel periodically through synchronizing to some other reference clock of higher accuracy. An inexpensive and convenient way to do this is over a computer network.
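
As background for the delay-variability problem the article addresses, here is a minimal sketch in C of the standard four-timestamp client-server exchange (NTP-style); it is not the authors' algorithm. The offset estimate it produces is exact only when the forward and reverse network delays are equal, which is exactly why variable and asymmetric delay must be tamed.

/* The standard four-timestamp exchange for estimating clock offset and
   round-trip delay over a network (NTP-style); shown for background only,
   not the algorithm developed in the article.
   t1: client send, t2: server receive, t3: server send, t4: client receive.
   t1 and t4 are read from the client clock, t2 and t3 from the server clock. */
#include <stdio.h>

static void estimate(double t1, double t2, double t3, double t4,
                     double *offset, double *delay) {
    /* Offset of the server clock relative to the client clock.  The formula
       is exact only when the forward and reverse path delays are equal;
       asymmetric or variable delay shows up directly as offset error. */
    *offset = ((t2 - t1) + (t3 - t4)) / 2.0;
    /* Total round-trip network delay, excluding server processing time. */
    *delay  = (t4 - t1) - (t3 - t2);
}

int main(void) {
    double offset, delay;
    /* Example: the client clock is 50 ms behind the server, each network
       path takes 10 ms, and the server spends 1 ms processing.  Prints
       offset = +0.050 s, delay = 0.020 s. */
    estimate(0.000, 0.060, 0.061, 0.021, &offset, &delay);
    printf("offset = %+.3f s, delay = %.3f s\n", offset, delay);
    return 0;
}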

by Julien Ridoux, Darryl Veitch

A Tour through the Visualization Zoo:
A survey of powerful visualization techniques, from the obvious to the obscure

Thanks to advances in sensing, networking, and data management, our society is producing digital information at an astonishing rate. According to one estimate, in 2010 alone we will generate 1,200 exabytes, 60 million times the content of the Library of Congress. Within this deluge of data lies a wealth of valuable information on how we conduct our businesses, governments, and personal lives. To put the information to good use, we must find ways to explore, relate, and communicate the data meaningfully.
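
As a quick sanity check of that comparison (my arithmetic, taking an exabyte as 10^18 bytes), the figures imply a Library of Congress of roughly 20 terabytes:

\[
\frac{1{,}200 \times 10^{18}\ \text{bytes}}{6 \times 10^{7}}
  = 2 \times 10^{13}\ \text{bytes}
  = 20\ \text{TB}.
\]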

by Jeffrey Heer, Michael Bostock, Vadim Ogievetsky

Why Cloud Computing Will Never Be Free:
The competition among cloud providers may drive prices downward, but at what cost?

The last time the IT industry delivered outsourced shared-resource computing to the enterprise was with timesharing in the 1980s, when it evolved to a high art, delivering the reliability, performance, and service the enterprise demanded. Today, cloud computing is poised to address the needs of the same market, based on a revolution of new technologies, significant unused computing capacity in corporate data centers, and the development of a highly capable Internet data communications infrastructure. The economies of scale of delivering computing from a centralized, shared infrastructure have set the expectation among customers that cloud-computing costs will be significantly lower than those incurred from providing their own computing. Together with the reduced deployment costs of open source software and the perfect competition characteristics of remote computing, these expectations set the stage for fierce pressure on cloud providers to continuously lower prices.

by Dave Durkee