Graphics

Modeling People and Places with Internet Photo Collections:
Understanding the world from the sea of online photos

This article describes our work in using online photo collections to reconstruct information about the world and its inhabitants at both global and local scales. This work has been driven by the dramatic growth of social content-sharing Web sites, which have created immense online collections of user-generated visual data. Flickr.com alone currently hosts more than 6 billion images taken by more than 40 million unique users, while Facebook.com has said it grows by nearly 250 million photos every day.

by David Crandall, Noah Snavely | May 11, 2012

6 comments

Interactive Dynamics for Visual Analysis:
A taxonomy of tools that support the fluent and flexible use of visualizations

The increasing scale and availability of digital data provides an extraordinary resource for informing public policy, scientific discovery, business strategy, and even our personal lives. To get the most out of such data, however, users must be able to make sense of it: to pursue questions, uncover patterns of interest, and identify (and potentially correct) errors. In concert with data-management systems and statistical algorithms, analysis requires contextualized human judgments regarding the domain-specific significance of the clusters, trends, and outliers discovered in data.

by Jeffrey Heer, Ben Shneiderman | February 20, 2012

3 comments

A Conversation with Ed Catmull:
The head of Pixar Animation Studios talks tech with Stanford professor Pat Hanrahan.

With the release of Toy Story in 1995, Pixar Animation Studios President Ed Catmull achieved a lifelong goal: to make the world’s first feature-length, fully computer-generated movie. It was the culmination of two decades of work, beginning at the legendary University of Utah computer graphics program in the early 1970s, with important stops along the way at the New York Institute of Technology, Lucasfilm, and finally Pixar, which he cofounded with Steve Jobs and John Lasseter in 1986. Since then, Pixar has become a household name, and Catmull’s original dream has extended into a string of successful computer-animated movies. Each stage in his storied career presented new challenges, and on the other side of them, new lessons.

November 13, 2010

5 comments

Photoshop Scalability: Keeping It Simple:
Clem Cole and Russell Williams discuss Photoshop’s long history with parallelism, and what they now see as the main challenge.

Over the past two decades, Adobe Photoshop has become the de facto image-editing software for digital photography enthusiasts, artists, and graphic designers worldwide. Part of its widespread appeal has to do with a user interface that makes it fairly straightforward to apply some extremely sophisticated image editing and filtering techniques. Behind that façade, however, stands a lot of complex, computationally demanding code. To improve the performance of these computations, Photoshop’s designers became early adopters of parallelism, working to tap the extra power of the cutting-edge desktop systems of the day, which shipped with either two or four processors.

by Clem Cole, Russell Williams | September 9, 2010

3 comments

Software Development with Code Maps:
Could those ubiquitous hand-drawn code diagrams become a thing of the past?

To better understand how professional software developers use visual representations of their code, we interviewed nine developers at Microsoft to identify common scenarios, and then surveyed more than 400 developers to understand the scenarios more deeply.

by Robert DeLine, Gina Venolia, Kael Rowan | July 4, 2010

1 comment

Visualizing System Latency:
Heat maps are a unique and powerful way to visualize latency data. Explaining the results, however, is an ongoing challenge.

When I/O latency is presented as a visual heat map, some intriguing and beautiful patterns can emerge. These patterns provide insight into how a system is actually performing and what kinds of latency end-user applications experience. Many characteristics seen in these patterns are still not understood, but so far their analysis is revealing systemic behaviors that were previously unknown.
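
To make the mechanics concrete: a latency heat map is essentially a two-dimensional histogram, with elapsed time on the x-axis, latency ranges on the y-axis, and the count of I/O events in each cell driving the color. The sketch below is an illustration rather than code from the article; it bins (timestamp, latency) samples into such a grid, written as a CUDA kernel for consistency with the other sketches on this page, and the grid dimensions and bucket width are arbitrary assumptions.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical heat-map dimensions: one column per second of elapsed time,
// one row per latency bucket. These values are illustrative assumptions.
constexpr int   TIME_COLS    = 60;      // 60 one-second columns
constexpr int   LATENCY_ROWS = 50;      // 50 latency buckets
constexpr float BUCKET_US    = 200.0f;  // each row spans 200 microseconds

// Each thread takes one (timestamp, latency) sample and atomically
// increments the matching cell; the per-cell counts later become the
// color intensities of the rendered heat map.
__global__ void binLatency(const float* timeSec, const float* latencyUs,
                           int nSamples, unsigned int* counts) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= nSamples) return;

    int col = min(TIME_COLS - 1,    max(0, (int)timeSec[i]));
    int row = min(LATENCY_ROWS - 1, max(0, (int)(latencyUs[i] / BUCKET_US)));
    atomicAdd(&counts[row * TIME_COLS + col], 1u);
}

int main() {
    // A few fake samples: (seconds since start, I/O latency in microseconds).
    float ts[] = {0.5f, 1.2f, 1.3f, 2.8f};
    float ls[] = {180.0f, 900.0f, 950.0f, 12000.0f};
    const int n = 4;

    float *t, *lat; unsigned int *counts;
    cudaMallocManaged(&t, n * sizeof(float));
    cudaMallocManaged(&lat, n * sizeof(float));
    cudaMallocManaged(&counts, TIME_COLS * LATENCY_ROWS * sizeof(unsigned int));
    cudaMemset(counts, 0, TIME_COLS * LATENCY_ROWS * sizeof(unsigned int));
    for (int i = 0; i < n; ++i) { t[i] = ts[i]; lat[i] = ls[i]; }

    binLatency<<<(n + 255) / 256, 256>>>(t, lat, n, counts);
    cudaDeviceSynchronize();
    printf("cell(row 0, col 0) count = %u\n", counts[0]);  // expect 1
    return 0;
}
```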

by Brendan Gregg | May 28, 2010

16 comments

A Tour through the Visualization Zoo:
A survey of powerful visualization techniques, from the obvious to the obscure

Thanks to advances in sensing, networking, and data management, our society is producing digital information at an astonishing rate. According to one estimate, in 2010 alone we will generate 1,200 exabytes -- 60 million times the content of the Library of Congress. Within this deluge of data lies a wealth of valuable information on how we conduct our businesses, governments, and personal lives. To put the information to good use, we must find ways to explore, relate, and communicate the data meaningfully.

by Jeffrey Heer, Michael Bostock, Vadim Ogievetsky | May 13, 2010

25 comments

A Conversation with Jeff Heer, Martin Wattenberg, and Fernanda Viégas:
Sharing visualization with the world

Visualization can be a pretty mundane activity: collect some data, fire up a tool, and then present it in a graph, ideally with some pretty colors. But all that is changing. The explosion of publicly available data sets on the Web, coupled with a new generation of collaborative visualization tools, is making it easier than ever to create compelling visualizations and share them with the world.

March 23, 2010

0 comments

Future Graphics Architectures:
GPUs continue to evolve rapidly, but toward what?

Graphics architectures are in the midst of a major transition. In the past, these were specialized architectures designed to support a single rendering algorithm: the standard Z-buffer. Real-time 3D graphics has now advanced to the point where the Z-buffer algorithm has serious shortcomings for generating the next generation of higher-quality visual effects demanded by games and other interactive 3D applications. There is also a desire to use the high computational capability of graphics architectures to support collision detection, approximate physics simulations, scene management, and simple artificial intelligence.

by William Mark | April 28, 2008

0 comments

Scalable Parallel Programming with CUDA:
Is CUDA the parallel programming model that application developers have been waiting for?

The advent of multicore CPUs and manycore GPUs means that mainstream processor chips are now parallel systems. Furthermore, their parallelism continues to scale with Moore’s law. The challenge is to develop mainstream application software that transparently scales its parallelism to leverage the increasing number of processor cores, much as 3D graphics applications transparently scale their parallelism to manycore GPUs with widely varying numbers of cores.
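
That transparent scaling comes from writing kernels in terms of the data rather than the machine: a kernel expresses the work for one element, the launch creates one thread per element, and the hardware distributes the resulting blocks across however many cores the particular GPU provides. Below is a minimal SAXPY-style sketch in that spirit, offered as an illustration rather than code reproduced from the article.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per element: the kernel expresses only the per-element work,
// and the hardware scheduler spreads the blocks across however many
// multiprocessors the GPU happens to have.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // The grid is sized from the data, not from the machine, so the same
    // code runs unchanged on GPUs with widely varying numbers of cores.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x); cudaFree(y);
    return 0;
}
```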

by John Nickolls, Ian Buck, Michael Garland, Kevin Skadron | April 28, 2008

1 comment

Data-Parallel Computing:
Data parallelism is a key concept in leveraging the power of today’s manycore GPUs.

Users always care about performance. Although often it’s just a matter of making sure the software is doing only what it should, there are many cases where it is vital to get down to the metal and leverage the fundamental characteristics of the processor.
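
One common example of "getting down to the metal" in data-parallel code is exploiting fast on-chip shared memory. The block-level sum reduction below is a generic illustration, written in CUDA for consistency with the sketch above and not drawn from the article: each block stages a tile of data in shared memory and combines it in a tree, rather than sending every partial sum through slow off-chip memory.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Block-level sum reduction (assumes blockDim.x == 256): each block loads a
// tile into on-chip shared memory, then combines values in a tree, halving
// the number of active threads at each step.
__global__ void blockSum(const float* in, float* blockTotals, int n) {
    __shared__ float tile[256];
    int tid = threadIdx.x;
    int i   = blockIdx.x * blockDim.x + tid;

    tile[tid] = (i < n) ? in[i] : 0.0f;
    __syncthreads();

    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride) tile[tid] += tile[tid + stride];
        __syncthreads();
    }
    if (tid == 0) blockTotals[blockIdx.x] = tile[0];
}

int main() {
    const int n = 1 << 20, threads = 256;
    const int blocks = (n + threads - 1) / threads;
    float *in, *partial;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&partial, blocks * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = 1.0f;

    blockSum<<<blocks, threads>>>(in, partial, n);
    cudaDeviceSynchronize();

    // Finish the last combining step on the host for brevity.
    double total = 0.0;
    for (int b = 0; b < blocks; ++b) total += partial[b];
    printf("sum = %.0f (expect %d)\n", total, n);
    return 0;
}
```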

by Chas. Boyd | April 28, 2008

0 comments

GPUs: A Closer Look:
As the line between GPUs and CPUs begins to blur, it’s important to understand what makes GPUs tick.

A gamer wanders through a virtual world rendered in near-cinematic detail. Seconds later, the screen fills with a 3D explosion, the result of unseen enemies hiding in physically accurate shadows. Disappointed, the user exits the game and returns to a computer desktop that exhibits the stylish 3D look-and-feel of a modern window manager. Both of these visual experiences require hundreds of gigaflops of computing performance, a demand met by the GPU (graphics processing unit) present in every consumer PC.

by Kayvon Fatahalian, Mike Houston | April 28, 2008

0 comments

A Conversation with Kurt Akeley and Pat Hanrahan:
Graphics veterans debate the evolution of the GPU.

Interviewing either Kurt Akeley or Pat Hanrahan for this month’s special report on GPUs would have been a great opportunity, so needless to say we were delighted when both of these graphics-programming veterans agreed to participate. Akeley was part of the founding Silicon Graphics team in 1982 and worked there for almost 20 years, during which he led the development of several high-end graphics systems, including GTX, VGX, and RealityEngine. He’s also known for his pioneering work on OpenGL, the industry-standard programming interface for high-performance graphics hardware.

by John Stanik | April 28, 2008

0 comments

Get Your Graphics On:
OpenGL Advances with the Times

OpenGL, the decade-old mother of all graphics application programming interfaces (APIs), is getting two significant updates to bring it into the 21st century.

by Alexander Wolfe | April 16, 2004

0 comments