The Grid Pushes Towards the Mainstream

Like a variant of the “infinite monkeys” theorem, grid computing may never produce the complete works of William Shakespeare, but it might just solve some of the world’s problems.

Once a tool of the government, university and medical communities, distributed computing is now attracting mainstream companies looking to make money from it.

Overall, grid computing takes collective advantage of the vast improvements in microprocessor speeds, optical communications, raw storage capacity, the Web and the Internet over the last five years.

The issue was discussed in great detail at the Grid Computing Planet Conference & Expo in San Jose, Calif., this week, as well as in “The Global Grid Computing Report 2002 – Technology and Market Opportunity Assessment,” a new report released this week.

Both the conference and report were presented by INT Media Group, the parent company of Internet.com and this site.

“Grid” is a type of computing in which different components and objects comprising an application can be located on different computers connected to a network. So rather than using a network of computers simply to communicate and transfer data, grid computing taps the unused processor cycles of numerous — sometimes thousands of — computers.
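The mechanics can be sketched in a few lines of Python: a coordinator splits a job into independent work units and farms each one out to whatever workers are idle. The thread pool below is only a stand-in for networked machines, and all names are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze(work_unit):
    # Stand-in for the real computation each donated machine performs.
    return sum(x * x for x in work_unit)

def run_on_grid(work_units, workers=4):
    # The coordinator farms independent units out to idle workers and
    # gathers the results as each "machine" reports back.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze, work_units))
```

Because the units are independent, adding machines scales throughput almost linearly, which is why thousands of otherwise idle PCs can be harnessed at once.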

One highly publicized project is the SETI (Search for Extraterrestrial Intelligence) @Home project, in which PC users worldwide donate unused processor cycles to help the search for signs of extraterrestrial life by analyzing signals coming from outer space.

Now, several companies are turning the technology into a business.

IBM, Sun Microsystems and Compaq Computer (under the management of Hewlett-Packard) have been the best-known companies at the forefront of the technology.

Private companies like Avaki, Data Synapse, Entropia, GridFrastructure, Noemix, Parabon, Platform Computing and United Devices also have a big hand in shaping the sector.

However, grid computing faces obstacles on the way to a bright future.

Some, like IBM Grid Computing General Manager Tom Hawk, see Grid’s promise being fulfilled “much faster than people realize.”

IBM’s ultimate vision for Grid is a utility model over the Internet, where clients draw on compute power much as they do now with electricity. With more than 60 percent of IT budgets dedicated to maintenance and integration — a percentage that Hawk says continues to rise — the need to reduce complexity and management demands is a pressing one.

Call To Action

At the forefront of grid-work in action is the Globus Project — a multi-institutional research and development effort creating fundamental technologies for computational grids.

A primary product of the Globus Project is the open source Globus Toolkit, which is being used in numerous large Grid deployment and application projects in the United States, Europe, and around the world. The Globus Project is based at Argonne National Laboratory and the University of Southern California’s Information Sciences Institute.

In addition to Globus, there is the U.K. National Grid, the Netherlands National Grid, the TeraGrid (funded by the U.S. National Science Foundation), the University of Pennsylvania Grid, the North Carolina BioGrid, and the Department of Energy Science Grid.

On the commercial side, the University of Pennsylvania Grid and the National Digital Mammography Archive (NDMA) are allocating time and resources to private enterprise.

But as Hawk sees it, the main factor driving businesses toward grid computing projects is cost, chiefly the savings offered by the Linux operating system.

“Linux has that cheapness factor,” said Hawk. “It’s amazing how technology has adopted open standards and open source from systems management to serving. Complexity has been too expensive.”

As for the possibility of conflicting projects, Hawk says certain workloads are more germane to the grid than others, and companies working on the grid will sort those problems out through the Global Grid Forum (GGF) and service-level agreements with grid operators.

The GGF is a standards forum of individual researchers and practitioners, formed by the merger of the Grid Forum, the eGrid European Grid Forum, and the Asia-Pacific Grid community. Its initial efforts included five major workshops in the U.S. in 1999 and 2000 and two in Europe during 2000.

It’s that establishment of cross-environment standards that many say makes grid computing the perfect arena for Web Services.


The Grid Is Key For Web Services

Because Web Services core languages like XML and SOAP work across environments, many players see the grid as a way to serve up services without customers having to worry about the details.
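To illustrate the cross-environment point, here is a minimal sketch of a SOAP request built with Python’s standard library. The service namespace and method name are invented for the example, not taken from any real grid API:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope namespace

def soap_request(method, params, service_ns="urn:example-grid"):
    # Wrap a method call in a standard SOAP envelope; any SOAP-aware
    # service, on any platform, can parse the result.
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{service_ns}}}{method}")
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(envelope, encoding="unicode")
```

Because the envelope is plain XML, the client never needs to know what operating system or hardware sits behind the service.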

“There are three critical ways that the grid helps,” said IBM VP of Technology Strategy Irving Wladawsky-Berger. “The first is standards. The second is that systems must become self-managing, and the third is access to services. If I’m making a pizza all by myself, it may take a lot of time. But if I go to the pizza store, then it’s not that complicated. Sometimes I like to make the dough; sometimes the sauce, and I can go and buy those things. In that sense I become the kitchen integrator. As long as there is agreement on the protocols, the important part is that you, the user, can access the tools, whether that virtual service is in your own department or all the way out in a utility on the grid.”

Sun Microsystems, which released its Grid Engine software in September 2000, says it plans to advance its Web Services strategy with it. Sun’s current Grid Engine customers include Ford, Caprion Pharmaceuticals, BP, Cognigen Corporation, Motorola, and Sony Semiconductor and Devices Europe.

IBM is involved in several grid computing projects. For example, in November 2001, IBM announced it is building a computing grid for the University of Pennsylvania designed to bring advanced methods of breast cancer diagnosis and screening to patients. Big Blue is not shy about saying its grid-enhanced solutions help with workload management and dynamic reallocation of resources.

The company says its clients are very interested in managing their resources more effectively given the wake-up call that many businesses faced after the events of September 11.

“Enterprise is looking for resiliency after 9-11 as a way to push workload or change functions without too much human intervention,” said Hawk.

But, perhaps one of the best examples of how enterprise is driving the grid is in the arena of gaming.

Grid for Gamers

Grid computing is also establishing itself in one of the Internet’s main moneymaking business models: online gaming.

The market is currently only a small percentage of that industry, though it is growing. The Korean massively multiplayer online role-playing game (MMORPG) Lineage boasts more than 2.5 million subscribers, and EverQuest, notorious for players who use online auction sites to sell characters and equipment from the game world, is reported to have a virtual economy that would make its setting the 77th largest economy in the world.

The Butterfly Grid is the first grid system capable of processing online video games across a multicast network of server farms, allowing the most efficient use of computing resources for high-performance 3D immersive game worlds.

It’s no surprise that the grid is being heralded as a perfect match for online gaming. A recent survey on the Internetnews.com Web site ranked online gaming as the number one driver of growth in commercial grid computing.

“I wouldn’t say the grid is changing the landscape of the Internet as much as it is exploiting it,” said Hawk. “Now that there is the backbone of the Web, Internet2 and expanded bandwidth, the key here is to lash together these systems and help it evolve to its next level.”

Grid Computing Planet editor Paul Shread contributed to this report.
