In basic terms, grid computing can be described as a network of computers and data storage systems brought together to share computing power. Where a computer is not being used, or is using only a fraction of its power, the grid allows that spare power to be used by someone else.
The concept differs from the World Wide Web, which only enables communication through browsers, because it actually allows access to computer resources. It is also different from peer-to-peer computing, which enables file-sharing between two users, because it allows sharing of resources among many, not just two.
The potential of computer grids is enormous, and when the concept becomes mainstream it holds the promise of transforming the computing power available to the individual. At present a computer user is restricted by the power of his own machine. When the grid comes online there will be no such restriction: the cheapest, oldest model will have access to the computing resources of millions of other computers worldwide.
Grid computing was recently championed by Larry Ellison, CEO of software giant Oracle. "Finally, 40 years after the invention of the mainframe computer, we've something to talk about again in the computing world," he told the recent Oracle World 2003 convention while launching Oracle's '10g Grid Computing Suite', which allows an organisation to pool its computing resources, treat them as a single computer and balance the processing load across them.
However, Carly Fiorina, CEO of Hewlett-Packard, warned the same convention against overhyping grid computing. "It would take three to five years before grid computing can be used as the foundation for companies' payroll and other business systems," she said.
CERN meanwhile, although not responsible for the creation of the grid system, is taking the lead in developing its academic potential. This is not CERN's first foray into new information and communication techniques. In late 1989, Tim Berners-Lee, while working at CERN, devised an internet application to allow scientists at several international sites to access technical documents – and the World Wide Web was born.
The driving force behind CERN's grid is the need to analyse the massive volume of data that will be produced when its latest and largest ever particle accelerator (known as the Large Hadron Collider, or LHC) becomes operational in 2007.
CERN estimates that analysing the data will require the equivalent of 70,000 of today's fastest PCs, and views grid computing as the best way of achieving this.
Les Robertson, CERN's LHC Computer Grid (LCG) project manager, said,
"The LCG will provide a vital test-bed for the new grid computing technologies that are set to revolutionise the way scientists use the world's computing resources in areas ranging from fundamental research to medical diagnosis."
The grid allows the data to be spread between scientific computing centres worldwide. Around a dozen such centres are involved in the first phase of operations, but the number of centres and the complexity of the grid will gradually increase as its builders come to grips with the challenges of operating at such unprecedented scale.
It will, in essence, be a prototype for a global grid network that will be available to everyone.
There are smaller-scale grids currently in operation. The SETI@home project is the most famous of these. It offers free software which runs when a participating PC is in screensaver mode, donating the computer's idle processing power to the search for intelligent life on other planets. Results are returned periodically from each PC to a data centre in exchange for new numbers to crunch.
Also, in February this year researchers from the US, UK and Canada, IBM and others launched a joint project using the processing power of millions of PCs to help scientists develop drugs to combat the smallpox virus, seen as a potential terrorist biological weapon. Again, users participate by downloading a screensaver. The grid analyses billions of molecules in a fraction of the time it would take in a laboratory.
These examples involve breaking big problems into many small problems which can be solved by individual computers. But what happens when the point at issue cannot be reduced into small components?
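The decompose-and-distribute model described above can be illustrated with a short sketch. The example below is illustrative only, not the actual SETI@home or smallpox software: it splits one large job (counting primes below 100,000, a stand-in for any divisible workload) into independent chunks and farms them out to worker processes, much as a grid scheduler farms chunks out to idle PCs, then combines the partial results.

```python
from multiprocessing import Pool


def count_primes(chunk):
    """One independent work unit: count the primes in a range of numbers.

    Because chunks share no state, they can run on any machine in any
    order - the property that makes a problem suitable for a grid.
    """
    lo, hi = chunk
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count


if __name__ == "__main__":
    # Split the big problem into small, self-contained chunks,
    # as a grid scheduler would before handing them to volunteers.
    chunks = [(i, i + 10_000) for i in range(0, 100_000, 10_000)]

    # Here the "grid" is a local pool of processes; on a real grid
    # each chunk would travel to a different participating computer.
    with Pool() as pool:
        partials = pool.map(count_primes, chunks)

    # Recombine the partial answers into the final result.
    print(sum(partials))
```

The key design point is that no chunk depends on another chunk's output, so lost or slow workers cost only a re-sent chunk, not the whole computation. It is exactly this independence that fails for the tightly coupled problems discussed next.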
This difficulty was highlighted at a US congressional hearing on the status of supercomputing in July, which turned into a debate over the respective merits of supercomputers and grid computing.
According to a report on InfoWorld, Vincent Scarafino, manager of numerically intensive computing at Ford, told the hearing that grid computing is not able to carry out every type of analysis; supercomputers are needed too.
Daniel Reed, director of the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, went further. He told the hearing, "Grids, which were pioneered by the high-end computing community, are not a substitute for high-end computing". He added, "Many problems of national importance can only be solved by tightly coupled high-end computers".
The angst in Congress over which type of computing to focus on is partly the result of the US losing the top spot in the supercomputing league in March last year. The world's-fastest-computer title is now held by Japan's NEC Earth Simulator, which runs at a record 36 teraflops. (A teraflop is one trillion floating-point operations per second.)
Concern in Washington was heightened further by an announcement in July that Japan intends to build its own grid-based computer, called the Naregi (National Research Grid Initiative). This is due to be completed in 2007 and boasts an anticipated running speed of 100 teraflops.
Supercomputers do not have to contend with the vagaries of a network. Grid computers, which rely on sharing data and tasks between countless machines, can only be as good as the network on which they run. Projects such as SETI@home or the smallpox research grid are internet-based, and therefore subject to the traffic fluctuations and attacks that afflict all web-based systems.
Other suggested applications for grid computing have included advances in medicine, defence, oil exploration, investment risk analysis, mechanical design and entertainment. The internet lets computers talk together, say the proponents; the grid lets computers work together.