HP: UDC Alive and Well


When HP recently announced changes to its Utility Data
Center (UDC), it was widely reported that the company was abandoning its
utility computing strategy and raising the white flag in defeat.


Nothing could be further from the truth, said HP Utility
Computing Manager Jay Chitnis, even though HP repositioned staffers within the
organization as a result of the change. In a departure from its pre-wired,
prepackaged, Swiss-army-knife data center, HP is turning toward a software-based
approach to fill customers’ needs, he said.


This is something rivals IBM, Sun Microsystems and Veritas Software
have all been pushing with their on-demand or utility
plays, in which customers procure computing resources “by the drink,”
or on a pay-for-use basis.


The term “utility” has worked its way into the IT lexicon, borrowed from the
word used to describe the companies that deliver water, gas and electricity.


Chitnis told internetnews.com that the redefined UDC comprises a blade
system as the hub; a virtual server environment that automatically grows and
shrinks server and storage resources in real time; change and configuration
management software; and managed services.


That’s a considerable departure from its previous, hardware-centric UDC
offering, which included a management rack, a network operations rack, linked
racks of servers, capacity planning and optimization software, and a
storage array.


Chitnis said customers told HP they needed to start smaller in their utility
computing planning.

“This is our third or fourth generation of UDC,” Chitnis
said. “We have learned a lot from our customers, and they have told us they
want the benefits of UDC but in a more modular fashion.”


This means customers no longer have to buy the entire package, the executive
said. For example, users may choose to start with a blade system, which
includes a single console for configuring and allocating resources from a
virtual pool of server, storage and networking capacity, all on a pay-per-use
basis. They may then add other components as their needs grow,
enabling their infrastructure to flex or contract as required.
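
For a rough sense of how such a pay-per-use pool might behave, the toy Python sketch below models allocating and metering capacity from a shared pool. The VirtualPool class, its allocate and release methods, and the numbers are hypothetical illustrations, not HP’s actual software.

    # Illustrative sketch only: a shared pool that hands out capacity
    # and records what each workload uses so it can be billed later.
    class VirtualPool:
        def __init__(self, cpus, storage_gb):
            self.free_cpus = cpus
            self.free_storage_gb = storage_gb
            self.usage_log = []  # (customer, cpus, storage_gb, hours) records

        def allocate(self, customer, cpus, storage_gb, hours):
            # Refuse the request if the shared pool cannot cover it.
            if cpus > self.free_cpus or storage_gb > self.free_storage_gb:
                raise RuntimeError("pool exhausted; add blades or storage to grow it")
            self.free_cpus -= cpus
            self.free_storage_gb -= storage_gb
            self.usage_log.append((customer, cpus, storage_gb, hours))

        def release(self, cpus, storage_gb):
            # Return capacity to the pool when a workload shrinks or finishes.
            self.free_cpus += cpus
            self.free_storage_gb += storage_gb

    pool = VirtualPool(cpus=64, storage_gb=10_000)
    pool.allocate("finance-app", cpus=8, storage_gb=500, hours=24)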


Analysts Weigh In


Forrester Research Principal Analyst Frank Gillett said that, while UDC
seemed like a great idea in 2001 when the company formally introduced it,
the market was not ready for the rip-and-replace proposal.


Gillett said one of the great obstacles the earlier incarnation of UDC
couldn’t overcome was its minimum entry price of more than $1 million. The
price climbed from that baseline as customers added more storage gear
and servers, the analyst said.


The series of racks would then control everything inside them, as well as
external devices, provided those devices were on the compatibility list and
were rewired through the rack.


“That was the gotcha,” Gillett told internetnews.com. “No. 1, it was
a giant purchase. No. 2, you were betting on a bunch of HP proprietary stuff
that was not productized other than in UDC. No. 3, if you wanted the thing
to do all its cool magic, you’d have to rewire your data center through it.


“It was quite visionary but it didn’t match people’s
needs or comfort levels,” he continued. “[HP] only got a handful of sales. To us, moving
towards a software strategy makes a lot more sense. We think the software
incremental approach is going to be much more successful in terms of what
people are willing to buy in terms of size.
People are willing to buy a little bit at a time. Trying out a $50,000 or
$200,000 software package is a lot less scary than buying $1 million worth
of iron. That was the fundamental problem that UDC ran into.”


Gillett and his colleagues at Forrester see the next generation of computing
in something they call Organic IT, which chiefly addresses the problems of
underutilized systems, labor-intensive processes and inflexibility.


This embodies many of the characteristics associated with utility computing
today, including pooled, virtualized resources and automated management.
Those technologies are great on the surface, but, Gillett and Forrester
argue, the trick lies in getting them standardized, abstracted and
integrated within vendors’ products.


Luckily for HP, it has a new software focus, underpinned by the company’s
Adaptive Enterprise strategy. Like IBM’s on-demand vision, the strategy aims to
accommodate changes in business with more intelligent, real-time IT
infrastructure.


“Adaptive Enterprise is about how we allow customers to take advantage of
technology to better address their business needs,” Chitnis said. “Utility
computing at its core is all about paying as you go. And sharing a set of
computing resources with others is one way for companies to be adaptive —
the notion of being able to pay based on usage. At the end of the day, IT
is viewed much more as a service provider than as a cost center.”


Because utility computing is such a nascent market with a lot of unknowns, its
adoption rates are difficult to gauge. Early this year, IDC took a
stab at estimating the market’s growth path, noting that, while
customers spent $1 billion in 2003, it expected that amount to double in
2004 and increase to $4.6 billion in 2007.


But for that to happen, vendors’ product lines have to coalesce better
around their visions. Donna Scott, vice president and distinguished analyst
at Gartner, said few products mean few customers, which translates into slow
adoption.


“None of the vendors have much real product in terms of real-time
infrastructure, meaning that even those that are capable of varying the IT
infrastructure dynamically have very few customers,” Scott told
internetnews.com.


Scott said that while the “rip-and-replace,” or “another repository to be
integrated,” fears have stymied utility computing growth, “IT operation is
based on monitoring a stable environment, and none of the other management
tools have been changed to enable monitoring of a dynamic environment. How
will they know when there is a problem or not?”


Shining Sun


Statistics on utility computing are few and far between, but that doesn’t
prevent HP rivals IBM, Sun and Veritas from wanting a slice of what is
potentially a nice-sized pie.


While IBM declined to comment for this story, the company has been no less
busy ramping up its on-demand computing endeavor and will celebrate the
effort’s second anniversary next week at an event in New York.


Like HP, IBM is aiming to bring real-time software to the fore. The company
is well positioned on both the hardware and software fronts, because its new
Power5 servers are optimized to handle heavily virtualized, on-demand
environments.


Sun has its own special approach, according to Ashif Dhanani, Sun’s director
of utility computing.


With N1 Grid Engine software running on its own servers, Sun is selling
its services in unconventional ways. These include a new $1 per CPU, per hour
pricing model and pay-for-use Sun Utility Computing for Midrange Sun
StorEdge Systems, which starts at 80 cents per Sun Power Unit per month.
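
As a purely illustrative back-of-the-envelope calculation, not Sun’s billing software, plugging those published rates into a short Python function shows how a month’s charges might add up; the usage figures are made up for the example.

    # Illustrative arithmetic only, using the published list rates:
    # $1 per CPU per hour, and 80 cents per Sun Power Unit per month.
    def monthly_bill(cpu_hours, power_units):
        compute_charge = 1.00 * cpu_hours    # $1 per CPU-hour
        storage_charge = 0.80 * power_units  # $0.80 per Sun Power Unit per month
        return compute_charge + storage_charge

    # 10 CPUs busy for 200 hours (2,000 CPU-hours) plus 500 Sun Power Units:
    # 2,000 * $1 + 500 * $0.80 = $2,000 + $400 = $2,400
    print(monthly_bill(cpu_hours=10 * 200, power_units=500))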


Dhanani said the company’s pay-per-use offering, designed as a grid computing
infrastructure, is tooled for markets that require high-performance
computing, such as financial services, digital content creation and life
sciences.


Dhanani also offered his two cents on why the original UDC didn’t pan out.


“There was no right or wrong approach when they started this — but the
approach they took was to get the problem solved in one shot,” he told
internetnews.com. “And the challenges you have in the [utility computing] space is
that customers don’t have a clue of the utilization rates by system, by
project, by user, by application. So they’re really petrified if you say
I’ll take care of everything and send you a bill. Because they have no idea
whether that bill is going to kill them or not.”


Dhanani said Sun has an advantage in having an operating system, Solaris, that
works in concert with N1 Grid software to power anything from the company’s
$2,000 machines to its $4 million systems.
The executive used a restaurant metaphor to describe the differences between
Sun’s current play and HP’s former approach to utility computing.


“You’re at a buffet from HP. You pay $6.99 and you can fill up on whatever
you want from the line. What happens in the buffet is the shrimps go fast,
the steaks go fast, and some of the stuff goes stale on you. But you have to
price it in, right? So it’s the case where you’re pricing an average set of
ingredients but you don’t know what the unified vision is going to look
like.


“What we have is a menu system, where you sit in a restaurant and say I want
the French fries and the cheeseburger. Those cost elements are very easy to
manage. We can do on-time production of that and get it out to you. But on a
buffet line, some stuff gets overused and underused, so you have a challenge
both in flexing infrastructure and getting stuff on time, all the time.”


What’s next for Sun in its utility computing strategy?


Dhanani finished the metaphor: “We’re going to work toward combo meals,”
he said, laughing. “We’ve got a long way until we get towards buffet
style.”

This, he said, is a pure software subscription model.
