RealTime IT News

Supercomputing On-Demand Finds Home at IBM

IBM on Tuesday said it has completed and opened its first facility for delivering supercomputing power to customers over the Internet as their business needs dictate.

Part of the Armonk, N.Y.-based giant's overarching e-business on-demand computing strategy, the new Deep Computing on Demand facility is located in IBM's Poughkeepsie plant, where the original mainframe computer was invented more than 30 years ago. Big Blue considers its new Deep Computing division a crucial unit in its efforts to gain an edge over on-demand computing rivals HP, Veritas and Computer Associates.

The system currently consists of a cluster of IBM eServer xSeries Intel-based Linux systems and pSeries UNIX servers with disk storage, said Dave Turek, vice president of IBM's Deep Computing division. Turek told internetnews.com that customers may access the center from anywhere in the world via a secure virtual private network connection.
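
The article does not describe the actual customer interface, but a minimal sketch of what remote job submission over such a VPN link could look like is shown below. The hostname, credentials and account details are hypothetical, and the use of IBM's LoadLeveler scheduler (whose llsubmit command queues a job script) is an assumption for illustration; any batch scheduler would serve.

    # Hypothetical customer-side job submission to the on-demand cluster.
    # Hostname, username, and scheduler choice are illustrative only.
    import os
    import paramiko

    def submit_job(script_path: str) -> str:
        """Copy a job script to the remote cluster and queue it."""
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        # Assumes the secure VPN tunnel to the facility is already up.
        client.connect("ondemand.example.com", username="customer",
                       key_filename=os.path.expanduser("~/.ssh/id_rsa"))
        sftp = client.open_sftp()
        sftp.put(script_path, "job.cmd")
        sftp.close()
        # llsubmit is LoadLeveler's batch submission command; the
        # scheduler on the real system is not named in the article.
        _, stdout, _ = client.exec_command("llsubmit job.cmd")
        output = stdout.read().decode()
        client.close()
        return output

    if __name__ == "__main__":
        print(submit_job("seismic_job.cmd"))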

Turek said the impetus behind supercomputing on demand is that some companies face unexpected projects that require a high level of computing power. These projects are often temporary -- sometimes lasting just weeks or months -- but arise quickly, forcing IT managers to make swift decisions about how to handle the new business requirements.

Oftentimes, Turek said, a company might buy a costly supercomputer that is more than adequate for such a task but then sits idle when the job is complete. This translates to wasted dollars and resources at a time when the former are scant and the latter are precious. He said IBM's value proposition is to provide a solution that addresses these impromptu projects.
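
As a back-of-the-envelope illustration of that trade-off (every figure below is invented for the example; none comes from IBM or its customers), consider what happens when a machine bought for one short burst project is amortized over a typical service life:

    # Owning vs. renting capacity for a short burst project.
    # All dollar figures and durations are invented for illustration.
    PURCHASE_COST = 2_000_000      # one-time price of a supercomputer ($)
    PROJECT_WEEKS = 8              # the burst workload lasts 8 weeks
    LIFETIME_WEEKS = 3 * 52        # machine depreciated over ~3 years
    RENTAL_PER_WEEK = 25_000       # hypothetical on-demand rate ($/week)

    owned_cost_used = PURCHASE_COST * PROJECT_WEEKS / LIFETIME_WEEKS
    idle_cost = PURCHASE_COST - owned_cost_used  # capacity paid for, unused
    rental_cost = RENTAL_PER_WEEK * PROJECT_WEEKS

    print(f"Owned capacity actually used:  ${owned_cost_used:,.0f}")
    print(f"Owned capacity sitting idle:   ${idle_cost:,.0f}")
    print(f"On-demand rental for project:  ${rental_cost:,.0f}")

Under assumptions like these, the buyer pays for years of capacity it uses for only a few weeks, which is the idle-capacity problem Turek describes.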

"If you think about it, for every company in the world too little or too much computing power is a bad thing," Turek said. "As projects come in, if there is too little power it makes it difficult to juggle priorities. Life may go on, but it's not uncommon for things that exist in priority 1 that don't exist in priority 2. Deep Computing on-demand helps customers build their business accommodate that situation. It can really mitigate financial and technological risk."

Turek said IBM has inked a contract with Houston's GX Technology, a full-service contractor for the oil and gas industry that produces high-resolution subsurface images from large amounts of seismic data. GXT is getting power piped in from an IBM Linux cluster to meet its performance requirements. The company said IBM's Deep Computing on Demand service will help it expand the scope and number of projects it can handle around the world.

Mick Lambert, president and CEO of GX Technology, said companies exploring for oil and gas reserves use GXT's high-resolution subsurface images to significantly reduce their drilling risk.

"Shortening project cycle times allows our clients to gain a significant additional benefit from our services," Lambert said. "IBM's Deep Computing on demand gives us the power to dramatically reduce project cycle times and increase our project capacity, while reducing infrastructure and operating costs."

Other firms with similar peaks and valleys in computing demand include Hollywood studios that use supercomputing power to create animated movies, as well as life sciences companies doing genomic and drug discovery research. Financial services organizations, government agencies and national research laboratories are also likely customers.

Turek also said the facility is expected to incorporate blade technologies and different microprocessor architectures over time, such as IBM's own Power4+ line and AMD's Opteron.

The announcement comes during the same week that organizers of the Top500 list revised their rankings of the world's supercomputers. IBM's supercomputing systems account for 130 teraflops of power (trillions of calculations per second), representing more than 34 percent of the total processing power on the list.