Analysts Weigh Strategies of On-Demand ‘Hydra’

Representatives of research firm Summit Strategies on Tuesday outlined the
virtualization aspect of on-demand computing and weighed the pros and cons of
HP’s, IBM’s and Sun Microsystems’ respective strategies for the approach.


The analysts took time to discuss how virtualized computing may be used by IT
customers to save money. The practice automates the consolidation and pooling
of IT resources so that server, storage and networking capacity can be managed
as a single pool and allocated on the fly. For Summit Strategies, virtualized
computing fits under the broader umbrella of utility computing.
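
To make the pooling idea concrete, here is a minimal, hypothetical sketch in
Python; the class and every capacity figure are invented for illustration and
are not drawn from any vendor’s actual product.

    # Hypothetical sketch of the pooling idea: capacity from several physical
    # nodes is folded into one pool and carved out on the fly. The class and
    # all capacity figures are invented for illustration.
    class ResourcePool:
        def __init__(self):
            self.capacity = {"cpu": 0, "storage_gb": 0, "bandwidth_mbps": 0}

        def add_node(self, cpu, storage_gb, bandwidth_mbps):
            # Consolidation: fold a physical node's capacity into the pool.
            self.capacity["cpu"] += cpu
            self.capacity["storage_gb"] += storage_gb
            self.capacity["bandwidth_mbps"] += bandwidth_mbps

        def allocate(self, **request):
            # Allocation: hand out capacity on demand if the pool can cover it.
            if any(self.capacity[k] < amount for k, amount in request.items()):
                return None  # not enough headroom in the pool
            for k, amount in request.items():
                self.capacity[k] -= amount
            return dict(request)

    pool = ResourcePool()
    pool.add_node(cpu=8, storage_gb=500, bandwidth_mbps=1000)
    pool.add_node(cpu=16, storage_gb=2000, bandwidth_mbps=1000)
    lease = pool.allocate(cpu=4, storage_gb=250)  # a workload gets a slice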


But the call might be better remembered for the analysts’ discussion of how
the major competitors stack up against one another right now. Summit
Strategies President and Chief Research Officer Tom Kucharvy and Senior
Analyst John Madden took time to explain that a confusing array of terms
currently surrounds the notion of utility or on-demand computing, the idea of
computing resources delivered as needed to accommodate complex, evolving
businesses.


Madden pointed out that every major vendor has a different name for it: IBM
loosely uses the term on-demand computing but describes its framework as the
Utility Management Initiative; HP calls its strategy “adaptive
infrastructure,” though its broader canvas is the Utility Data Center; Sun
calls its strategy N1; and newcomer Microsoft uses the phrase Dynamic Systems
Initiative to describe the harnessing of computing resources on the fly.


Currently, IBM, HP and Sun are all delivering pieces of their respective
plans. IBM offers virtualization across many of its servers (including
Linux-oriented zSeries mainframes); HP offers a virtualized server; and Sun
recently rolled out virtualized blade servers under the auspices of N1.


Kucharvy detailed the strengths and weaknesses he could discern in HP’s,
IBM’s and Sun’s utility computing strategies, calling the sector
“hydra-headed” because, though other firms are embarking on similar ventures,
those three are leading the race.


“Sun’s N1 has a nice technical vision. Overall, it is most compelling for
corporate CTOs,” he said. “However, it has limited services and software, and
a late start. There is also the question of whether they can make the huge
financial investment required.”


Kucharvy said HP has the technology skills in place, as well as the business
skills and ability to measure ROI, but that its marketing prowess is weak in
this vein and it tends to over-rely on partners. Still, UDC is attractive to
CIOs, and HP’s hidden trump card may be its OpenView software platform, which
could give it strong market share in utility computing.


IBM, Kucharvy said, has the distinction of making on-demand ubiquitous
throughout its entire corporate strategy. He said IBM has the integration
strategy, reputation, marketing, software, services and ISV program to get it
done, but that its vulnerability is that, in trying to be everything to
everybody, the bet-the-company strategy might be too “mushy.” This on-demand
foray, he said, is attractive to CEOs.


But there was also discussion of Microsoft’s new Dynamic Systems Initiative,
which Kucharvy said is not only late to the game, but extremely vague.


“Microsoft’s DSI has got a lot of components; they are trying to integrate
new management tools and really support it through Visual Studio .NET,”
Kucharvy said. “But Microsoft is late to market and seems to be falling into
the same problem it did with .NET two years ago of not really outlining a
clear strategy. Microsoft will need to explain its vision to partners, and
get something to market.”


Speaking of ambiguity and confusion, Kucharvy said: “There is a desperate
need for a standards lexicon. Unless a major vendor tries to impose one, it
will be hard to see how customers adopt [utility computing].” He suggested
IBM might have the wherewithal to tackle this issue.


Madden echoed that, noting that the prospect of proprietary lock-in, arising
from the lack of standards to go along with the terminology, is another chief
obstacle keeping the on-demand ideology from reaching full fruition.


Madden also explained that the idea of virtualized computing is not new; it
has long been used to describe a form of data management in storage, or even
IBM’s logical partitioning capabilities in its servers. The practice is,
however, expanding to data centers.


Madden said virtualized computing currently addresses pain points that haunt
IT companies in a financially constrained economy by cutting IT capital and
operations costs; improving poor reliability and availability; and limiting
operations disruption. One obvious example of the cost-cutting is a
virtualized server that does the job of multiple servers.
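
As a back-of-the-envelope illustration of that example, the sketch below
sizes a consolidated environment to carry the load of several lightly used
servers; every figure in it is hypothetical.

    # Hypothetical consolidation math; all figures are invented for illustration.
    import math

    servers_before = 10       # lightly used physical servers (assumed)
    avg_utilization = 0.15    # each is 15% busy on average (assumed)
    cost_per_server = 5000    # annual capital + operations cost, dollars (assumed)

    # Size a virtualized environment to carry the same aggregate load,
    # leaving headroom by targeting 60% utilization.
    target_utilization = 0.60
    servers_after = math.ceil(servers_before * avg_utilization / target_utilization)

    savings = (servers_before - servers_after) * cost_per_server
    print(f"{servers_before} servers -> {servers_after}, saving ${savings:,} a year")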


Madden said the goal of these firms is to automate computing processes and
then make them as simple as possible. To do this, he said, virtualized
offerings will have to work across various systems, and vendors will have to
offer heterogeneous products. At present, Madden said, vendors virtualize
only internally, for their own platforms. This will have to change, he said,
to ensure openness, thereby staving off vendor lock-in.
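
A minimal sketch of what such openness could look like in practice, assuming
a vendor-neutral interface that each platform implements; every name below is
invented for illustration and is not any vendor’s actual API.

    # Sketch of a vendor-neutral virtualization interface; all names are
    # hypothetical, not any vendor's real API.
    from abc import ABC, abstractmethod

    class VirtualizationPlatform(ABC):
        @abstractmethod
        def provision(self, cpus: int, storage_gb: int) -> str:
            """Carve out a virtual server and return its identifier."""

        @abstractmethod
        def release(self, server_id: str) -> None:
            """Return the server's capacity to the pool."""

    class VendorAPlatform(VirtualizationPlatform):
        # Each vendor would supply its own adapter behind the shared contract.
        def provision(self, cpus, storage_gb):
            return f"vendor-a-vm-{cpus}cpu-{storage_gb}gb"

        def release(self, server_id):
            pass  # hand the capacity back in the vendor's own way

    def redeploy(platform: VirtualizationPlatform):
        # Management code written to the interface runs unchanged across vendors.
        vm = platform.provision(cpus=2, storage_gb=100)
        platform.release(vm)

    redeploy(VendorAPlatform())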
