Sun Microsystems and IBM
are once again going head-to-head in their efforts to lure
new customers. Both companies have announced plans to reshape their
high-performance computing offerings through the formation of new business
units.
Sun earlier this month announced the creation of its High Performance and
Technical Computing (HPTC) unit, and IBM this week unveiled its Deep Computing
unit. Both firms want to use these ventures to consolidate their intensive
computing efforts and define them more clearly for customers. They believe
creating HPTC systems that draw resources from their software, services,
sales and storage divisions should help them do that.
HPTC is a market segment that uses complex, large-scale computers to perform
any number of tasks, from gauging weather patterns to running tests in life
sciences organizations. Grid computing, the application of many computers’
resources to a single problem at the same time, is one facet of the HPTC
segment. Until now, both Sun and IBM have
been rolling out separate computer systems piecemeal, but by integrating
them with their other hardware and software product offerings, they hope
they will be better able to market these high-priced systems to prospective
customers in need of serious number crunching.
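As a rough, hypothetical illustration of the grid idea (not either company’s actual technology), the following Java sketch splits one computation into chunks that are processed concurrently; in a real grid, the chunks would be farmed out to separate machines rather than to threads on a single box.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class GridSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical workload: sum a large array by splitting it into chunks,
        // each handled by a separate worker (a thread here, standing in for a grid node).
        double[] data = new double[1_000_000];
        for (int i = 0; i < data.length; i++) data[i] = i * 0.5;

        int nodes = 4;
        int chunk = data.length / nodes;
        ExecutorService pool = Executors.newFixedThreadPool(nodes);
        List<Future<Double>> partials = new ArrayList<>();

        for (int n = 0; n < nodes; n++) {
            final int start = n * chunk;
            final int end = (n == nodes - 1) ? data.length : start + chunk;
            // Each submitted task works on its own slice of the problem at the same time.
            partials.add(pool.submit(() -> {
                double sum = 0.0;
                for (int i = start; i < end; i++) sum += data[i];
                return sum;
            }));
        }

        double total = 0.0;
        for (Future<Double> p : partials) total += p.get(); // combine partial results
        pool.shutdown();
        System.out.println("total = " + total);
    }
}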
Better visibility, they hope, will lead to bigger contracts with educational
institutions, government offices, and even financial organizations, in a
time when competition is cutthroat and rivals such as HP, SGI and
Cray are hungry for pieces of the multi-billion-dollar HPTC pie.
Ed Broderick, principal analyst at the Robert Frances Group, said the move was an obvious one, and that the close timing of the announcements is an example of the “me, too, me, too” competitive stance the firms have struck up over the years.
“They are focusing on a new opportunity, a new initiative to bring their skills and power into one organization for HPTC,” Broderick told internetnews.com. “This is for synergy and they’re saying ‘let’s exploit it.’”
Sun discussed its HPTC plans in a call with reporters and analysts
Wednesday. Clark Masters, executive vice president of Sun’s Enterprise
Systems Products, and Shahin Khan, newly anointed vice president of the HPTC
unit, highlighted where they see areas of growth and expressed confidence
that HPTC will thrive despite some evidence that low-level Linux clusters
are eating away at the high-performance market.
Masters, acknowledging the trend of low-level Linux clusters eating at HPTC
market share, said Sun isn’t concerned because it has been engaged in
contract talks with several government organizations, particularly in the
intelligence community, that have indicated interest in HPTC systems.
Masters said research and development is underway to prepare Java for
high-performance computing work the platform is not currently equipped to
handle. Sun’s labs are experimenting with this, among other HPTC challenges,
and Masters said Java Grande was created to use “Java as more than just a
wrapper.” He also advised the public to expect an “aggressive and complete
Linux offering soon,” but didn’t stray further than that.
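As a rough illustration of the distinction Masters was drawing, here is a minimal, hypothetical Java sketch (not Sun’s Java Grande code): the numeric kernel below runs entirely in Java, rather than Java serving only as a thin wrapper around a native C or Fortran library.

public class DotProduct {
    // Plain-Java numeric kernel: all the floating-point arithmetic happens in the JVM,
    // instead of being delegated to native code through a thin wrapper.
    static double dot(double[] x, double[] y) {
        double sum = 0.0;
        for (int i = 0; i < x.length; i++) {
            sum += x[i] * y[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0};
        double[] y = {4.0, 5.0, 6.0};
        System.out.println(dot(x, y)); // prints 32.0
    }
}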
As for competition, Masters entertained a question about how IBM’s Regatta
stacks up against Sun machines in the HPTC department, pointing out that Sun
has better input/output bandwidth and processing, while IBM enjoys better
floating-point performance per CPU than Sun’s machines.
Masters also said IBM paid Sun a compliment by ratcheting up its own HPTC
interests as Big Blue this week announced the creation of the Deep Computing
business unit, which, like Sun’s HPTC, is intended to link together the
company’s hardware, software and services offerings for intensive computing
projects.
This group will be led by Dave Turek, who most recently focused on grid
computing and Linux clusters. The Deep Computing team will work with IBM’s
research groups to connect relevant offerings from throughout IBM’s
portfolio. Turek said one of the chief reasons for the creation of Deep
Computing is the potential for tapping areas such as cell phone design,
medical simulation, Hollywood animation and fraud detection.
Turek, now vice president of the Deep Computing division, dismissed the
competitive argument and the same-month timing of the announcements, saying
that no other firm can match the scale, scope and breadth of what IBM has to
offer.
“One can go into battle with a cap gun and one can go into battle with a
bazooka,” Turek told internetnews.com. “Each has a gun, but which one
is going to win?”
Through all of this, Turek said IBM wanted to steer the perception of HPTC
away from the traditional server-only line of thinking, and align its
strategy to better fit with the company’s overarching on-demand computing
strategy. To be sure, IBM has already announced supercomputing on demand.
IBM announced Deep Computing amid its pledge of support for AMD’s Opteron
architecture, a launch that has Sun equally excited about its possibilities
for HPTC.
Sun’s Khan offered a frank assessment of the server chip space,
vis-à-vis AMD’s Opteron announcement, saying that it further dilutes
Itanium’s entry into the server market. He said 32-bit architectures are
well established, while the 64-bit space is still up for grabs.
“Right now, there is Itanium, Opteron, and what Intel calls Yamhill, so
there are three different sets of instructions [for server vendors],” Khan
said. “Intel has the right company but the wrong architecture, AMD has the
right architecture but the wrong company,” and it remains to be seen what
Yamhill is. Khan said the combination of Solaris and x86 allows
Sun to go “from smart card to supercomputer.”
Meanwhile, IBM is developing Blue Gene/L,
a supercomputer that will be used to simulate events such as fires
or the aging of materials, one of many large supercomputing contracts the
company has picked up in the last two years.