Wholesale Computation

Scientific American
November, 2000
by Paul Wallich

Companies want to sell your computer's spare processing time. Are there buyers?

The fastest supercomputers in the known universe are virtually free. All you need to beat the performance of a $50-million, massively parallel research machine is a little software and some way to convince 1 percent of the people on the Internet to run it. Unlike a dedicated supercomputer, which generally requires special housing and a staff of attendants to keep it going while it falls rapidly behind the state of the art, the network equivalent increases in power regularly as people upgrade their PCs. And when you're done using the virtual supercomputer, you can stop paying for it. Little wonder, then, that more than a dozen startups should have appeared in the past year, all trying to scoop up spare computing cycles and sell them to the highest bidder.

The best-known example of virtual supercomputing is the volunteer SETI@home project, a search for radio signals from an extraterrestrial intelligence; it has attracted more than two million participants. Following in the footsteps of code-breaking ventures such as distributed.net, SETI@home can run as a screensaver; then it is active only when a machine is not doing anything else. Each chunk of radio-telescope data can be processed independently, so machines don't need to communicate with one another, only with a central server. Other embarrassingly parallel problems include DNA pattern matching, Monte Carlo financial modeling, computer-graphics rendering and, appropriately enough, Web-site performance testing. Genome applications alone, says United Devices CEO Ed Hubbard, could soak up all the Net's spare computing power for the next 50 years.
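The work-unit model described above can be sketched in a few lines of Python. This is an illustrative toy, not SETI@home's actual protocol: the server and the chunk-hashing stand-in for real analysis are invented names, and everything runs in one process rather than over the Internet. The point is only that each chunk is processed with no client-to-client communication.

```python
import hashlib

class WorkServer:
    """Hypothetical central server: splits a data set into independent
    chunks and hands them out to whichever client asks next."""
    def __init__(self, data, chunk_size):
        self.pending = list(enumerate(
            data[i:i + chunk_size] for i in range(0, len(data), chunk_size)))
        self.results = {}

    def get_work(self):
        # Hand out one (chunk_id, chunk) pair, or None when all are taken.
        return self.pending.pop() if self.pending else None

    def report(self, chunk_id, result):
        self.results[chunk_id] = result

def process_chunk(chunk):
    # Stand-in for the real per-chunk analysis (e.g. scanning radio data);
    # here we just hash the chunk to produce some result.
    return hashlib.sha256(chunk.encode()).hexdigest()[:8]

# One simulated client drains the queue; many clients could do the same
# in parallel, since no chunk depends on any other.
server = WorkServer("telescope data " * 50, chunk_size=32)
while (unit := server.get_work()) is not None:
    chunk_id, chunk = unit
    server.report(chunk_id, process_chunk(chunk))

print(len(server.results), "chunks processed independently")
```

Because no result feeds into any other chunk, adding more clients speeds the job up almost linearly, which is exactly what makes these problems "embarrassingly" parallel.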

Only two questions stand between the venture capitalists and enormous profits: Can they get millions of users to surrender CPU time to profit-making organizations, and can they sell the resulting power to enough paying customers? Steve Porter of ProcessTree Network has little doubt that his company can retain the 100,000 people currently donating time to nonprofit computations by offering payments of between $100 and $1,000 a year (depending on processor speed and Internet bandwidth). That, he says, will enable him to sell a standard CPU-year (a 400-megahertz Pentium II operating full-time for 365 days) for about $1,500, or less than a fifth the cost of equivalent time on a supercomputer. Nelson Minar of PopularPower expects that even lesser incentives, say between $60 and $200, would still cut individuals' Internet access bills in half--or add up to a tidy sum for schools and libraries. And at Centrata, business development vice president Boris Pevzner says his company intends to bypass individual recruiting entirely and use its high-powered venture-capital contacts to get computer manufacturers and Internet access providers to build the company's software into their products, where it will operate automatically. Meanwhile Adam L. Beberg, one of the founders of distributed.net and now an independent software developer, predicts that no one will make money reselling computer power--too many sellers, not enough buyers. Completely open distributed computing has intractable security problems that will prevent firms from putting sensitive code and data out on the Internet for everyone to see. "The only market is behind firewalls," he says.
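A quick back-of-the-envelope check shows how Porter's numbers fit together. All the figures below come from the article itself; the margin arithmetic is our own reading of them, not anything the companies published.

```python
# Figures quoted in the article, in dollars per standard CPU-year.
sell_price = 1500                      # ProcessTree's resale price
payout_low, payout_high = 100, 1000    # annual payment range to donors
supercomputer_price = sell_price * 5   # "less than a fifth the cost"

# What the broker keeps per CPU-year, depending on the donor's hardware.
margin_low = sell_price - payout_high   # worst case: top-paid donor
margin_high = sell_price - payout_low   # best case: cheapest donor

print(f"Broker keeps ${margin_low}-${margin_high} per CPU-year")
print(f"Buyer saves at least ${supercomputer_price - sell_price} "
      f"versus supercomputer time")
```

Even in the worst case the broker keeps a third of the sale price, and the buyer's discount is large enough that the pitch does not depend on fine-tuning the payouts.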

Andrew Grimshaw of Applied Meta agrees: "Most businesses won't buy consumer-grade [computing] resources from some Linux hacker's dorm room." Beberg and Grimshaw both argue that the real money is to be made with corporate networks, where tens of thousands of well-administered machines sit idle every night. (Applied Meta currently operates a seamless, secure network of more than 4,000 CPUs for the National Science Foundation.)

Proponents downplay such worries, pointing out that encryption, along with the very decentralized nature of the computing, makes it unlikely that an adversary will be able to piece together more than a tiny bit of the big picture. Porter says that his company is mostly bidding on projects based on publicly available data and algorithms--it's only the computing power that his clients need. Minar points out that there's just as much need to protect PCs from potentially malicious distributed code. His company places programs in a Java-language "sandbox" that isolates them to prevent unauthorized access to a user's own information.

Moreover, it isn't just cycles that will be for sale. Centrata and Applied Meta, for example, both tout their ability to store information on what looks like one enormous disk. (Redundancy and encryption are just the beginning of the techniques required to make sure that the data are consistently available to the owners and inaccessible to anyone else.) Porter and others are also looking forward to trading in bandwidth: a PC with a megabit-per-second Internet connection, typical of cable modems and DSL connections, could cache data from distant Web sites and serve them to neighboring users, reducing the load on Internet backbones. (Companies such as Akamai are already doing a rapidly growing business in such "edge" caches, but their approach requires dedicated hardware.) So in a few years, your computer could be surfing the Net looking for the best bids for its spare resources. But will the ready availability of computing power to handle peak processing loads end up curtailing the rapid increases in CPU speed that make distributed computing attractive, or will the ability to solve problems that were utterly unapproachable only a few years ago whet appetites for yet more power? That issue might not even concern the startups. It's possible that widely disseminated distributed-processing software--such as that recently released by Beberg and his friends--will allow buyers and sellers to work directly, leaving the intermediaries hoping to sell your computer power out in the cold.
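The edge-caching idea mentioned above reduces to a familiar data structure: keep recently requested Web objects locally, serve repeats from the cache, and evict the least recently used object when space runs out. The sketch below is a generic LRU cache in Python under our own simplifying assumptions (no expiry, no consistency checks, single-threaded); it is not Akamai's system or any company's actual software, only the caching core of the idea.

```python
from collections import OrderedDict

class EdgeCache:
    """Hypothetical edge cache on a well-connected PC: serves repeated
    requests locally and counts how often it must go to the backbone."""
    def __init__(self, capacity, fetch_from_origin):
        self.capacity = capacity
        self.fetch = fetch_from_origin  # called only on a cache miss
        self.store = OrderedDict()      # URL -> content, in LRU order
        self.backbone_requests = 0

    def get(self, url):
        if url in self.store:
            self.store.move_to_end(url)     # hit: mark as recently used
            return self.store[url]
        self.backbone_requests += 1         # miss: fetch upstream
        content = self.fetch(url)
        self.store[url] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return content

# Simulated origin server and a short request stream with repeats.
origin = lambda url: f"<page for {url}>"
cache = EdgeCache(capacity=2, fetch_from_origin=origin)
for url in ["/a", "/b", "/a", "/a", "/c", "/b"]:
    cache.get(url)
print("backbone requests:", cache.backbone_requests)
```

Six requests cost only four trips upstream in this run; with realistic request streams, which repeat popular URLs far more heavily, the savings to the backbone are what make a market in consumer bandwidth plausible.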
