BIGGER THAN THE
SUM OF ITS PARTS

Report On Business Magazine
January 28, 2000
by Clive Thompson

There are 43 million computers on the Net right now. What if you could get all of them to work together at once?

Say you're a computer science student with an incredibly complex problem - a massive calculation that even an expensive supercomputer would take years, if not lifetimes, to solve. What do you do? Simple: Break the problem into little pieces and hand them out to your friends on the Net.

That's what Adam L. Beberg did three years ago while in university. As a promotional stunt, RSA Labs, a U.S. data-encryption company, offered $10,000 (U.S.) to anyone who could decode a message encrypted with its industry-standard, 56-bit algorithm. Beberg knew the easiest way to break it - trying one combination after another until one worked - would take decades, even using the fastest supercomputer on earth.

Instead, he tried an approach called "distributed computing." He and a group of programmer friends wrote software that broke the calculation into thousands of small pieces, each of which could be worked out in a few hours on an ordinary desktop computer. When they posted a notice on-line, thousands of people downloaded the software. They hooked up 26,000 computers worldwide, which together ran a staggering seven billion combinations per second. In only 212 days, they cracked the code and won the prize. "We basically assembled a supercomputer from shared resources on the Net," says Beberg, now 26 years old. And they had spent nothing.
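The trick is easy to sketch. A 56-bit key search covers 2^56 possible keys, and a coordinating server simply hands each volunteer a contiguous block of them to test. The Python fragment below is a minimal illustration of that partitioning, not Beberg's actual software; the block size and the decrypts() stub are invented for the example (a real client would run the cipher itself, in fast native code).

    KEYSPACE = 2 ** 56   # every possible 56-bit key
    BLOCK = 2 ** 28      # keys per work unit, chosen here for illustration

    def work_units():
        """Yield (start, end) key ranges for volunteers to search."""
        for start in range(0, KEYSPACE, BLOCK):
            yield (start, min(start + BLOCK, KEYSPACE))

    def decrypts(ciphertext, key):
        """Stub: a real client would decrypt with this key and test the result."""
        return False

    def search(start, end, ciphertext):
        """Try every key in [start, end); return the winning key, or None."""
        for key in range(start, end):
            if decrypts(ciphertext, key):
                return key
        return None

Each volunteer's machine grinds through its assigned range and reports back; the server only has to keep track of which ranges are done.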

Since their success, distributed computing has moved slowly into the mainstream. More and more researchers are discovering that it is a surprisingly viable idea - a form of supercomputing for the masses.

It relies upon a central irony of our digital age: Most computers are woefully underused. Consider my own desktop computer. It's got a 500 MHz Pentium processor and 256 megs of memory, and it can perform hundreds of millions of calculations per second. Yet most of the time, while sitting here typing, I'm using an almost unmeasurably small percentage of its power. In the downtime, it's just sitting there, twiddling its thumbs.

The whole point of distributed computing is to use these wasted resources. Link up those dormant computers, and presto, you can dwarf even the biggest supercomputer in the world - the U.S. government's Blue Pacific, which exists to model nuclear-bomb explosions. It uses 5,856 processors to achieve 3.88 trillion floating-point operations per second. The Net, by comparison, has 43 million computers hooked up to it; to surpass Blue Pacific, only a tiny fraction of their owners would have to donate spare cycles to your computing problem.
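The back-of-the-envelope arithmetic is telling. If a typical desktop of the day manages on the order of 100 million floating-point operations per second - an assumed figure, not one from the article - then matching Blue Pacific takes roughly 38,800 machines, or about 0.09% of the computers on the Net:

    BLUE_PACIFIC = 3.88e12   # floating-point operations per second
    DESKTOP = 1e8            # assumed throughput of one late-1990s desktop
    NET_HOSTS = 43e6         # computers on the Net, per the figure above

    machines_needed = BLUE_PACIFIC / DESKTOP    # about 38,800 machines
    share_of_net = machines_needed / NET_HOSTS  # about 0.0009
    print(f"{machines_needed:,.0f} machines = {share_of_net:.2%} of the Net")

Even if the per-machine estimate is off by a factor of ten, the required share of the Net stays under 1%.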

This sort of computational philanthropy is quickly gaining appeal for high-profile research projects. For example, the SETI@home project - the Search for Extraterrestrial Intelligence, run by the University of California at Berkeley - began using a distributed-computing approach to process signals from radio telescopes, releasing a version of its analysis software that volunteers could run on their own machines. Within months, more than 1.4 million people had signed up to help process the data. Today, participants in 223 countries have done work that would take a single computer 118,766 years. Their distributed project, they say, is the fastest computer on earth.

In fact, there are so many computers involved in SETI@home that the organizers can no longer keep them all busy. They can't feed the data in fast enough. "We are literally overwhelmed with participants," says David Anderson, the director of the project. "We could never have afforded to buy a supercomputer to do this. And yet now we have a surplus of computational power."
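The volunteer's side of a project like SETI@home boils down to a simple loop: fetch a work unit from the central server, crunch it while the machine is otherwise idle, and send the result back. The sketch below shows the shape of that loop only; the server address and all three helper functions are placeholders, not SETI@home's real protocol.

    import time

    SERVER = "http://example.org/workunits"  # placeholder address

    def fetch_work_unit(server):
        """Stub: a real client downloads a chunk of telescope data here."""
        return None

    def analyze(unit):
        """Stub: the project's signal-processing step runs here."""
        return {"unit": unit, "signal_found": False}

    def report_result(server, result):
        """Stub: upload the answer so the server can hand out the next unit."""

    def run_client():
        """Fetch, process and report work units until the server runs dry."""
        while True:
            unit = fetch_work_unit(SERVER)
            if unit is None:
                time.sleep(3600)  # no work available; check back in an hour
                continue
            report_result(SERVER, analyze(unit))

Because each work unit is independent, the volunteers never need to talk to one another - which is exactly why this kind of problem distributes so well.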

Are there any serious commercial applications for distributed computing? Proponents are betting on it. Beberg figures that a distributed approach would appeal to any industry requiring hard-core data-crunching. Pharmaceutical companies do protein modelling on new drugs; financial-services firms do data mining; even Hollywood needs ever larger computers for rendering 3-D animation. Granted, security issues abound. But Beberg points out that large corporations can apply a distributed approach using the hundreds of largely dormant computers on their in-house networks.

Other companies are already on the market. Active Tools, based in San Francisco, Calif., was founded by two Australian professors who used distributed computing to run pollution experiments; 50 linked computers took only three days to do a job that would have occupied one computer for two months. Now they're licensing their distributed-computing software to companies for up to $400 (U.S.) per computer. "This is an excellent way to use a company's computer resources in a new, very efficient way," says Active Tools president and CEO Rok Sosic.

Still, fans of distributed computing are realistic in their expectations. Not all computational problems can be solved this way; the calculation has to break cleanly into chunks that can be worked on independently. So distributed computing probably won't replace supercomputing any time soon. But it's almost more important for what it illustrates about the social dynamics of the Net. Sometimes the whole really is greater than the sum of its parts.

