Open Source Will Help Advance Utility Computing

SAN FRANCISCO — Open source software will play an important role in the inevitable move to a utility model of
computing, a noted observer said.

Speaking at the Open Source Business Conference here, Nicholas Carr
compared today’s data centers to the individual power plants companies
maintained a century ago to ensure a ready supply of electricity, before
large utility companies made the resource more readily available and far
cheaper than generating it in-house.

Carr, who writes the Rough Type blog and authored the controversial 2003
Harvard Business Review article “IT Doesn’t Matter,” said utility
computing represents the next wave of true network computing, but that
the technology and business models still need to evolve for it to
succeed.

The basic idea of utility computing is to make computing and software
resources available on demand from central servers. Whatever the
licensing model, proponents of utility computing argue that those
resources will be distributed far more efficiently than they are today.

“I think we will see a fundamental shift in the way this critical resource
is supplied,” said Carr. He cited studies by HP and IBM that point to gross
inefficiency in the current client/server model: servers typically run at 10 to 35 percent utilization, PCs at around 5 percent, and storage devices at 25 to 30 percent.
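
The arithmetic behind that argument is straightforward: the lower each dedicated machine’s average utilization, the more hardware a shared pool can eliminate. A back-of-the-envelope sketch in Python (the 10 to 35 percent range comes from the studies Carr cited; the pooled-utilization target and workload figures are hypothetical assumptions for illustration):

    import math

    # Midpoint of the 10-35 percent server utilization range cited above.
    DEDICATED_UTILIZATION = 0.20
    # Assumed target utilization for a shared utility pool (hypothetical).
    POOLED_UTILIZATION = 0.70

    def servers_needed(total_demand: float, utilization: float) -> int:
        """Servers required to carry total_demand units of work when each
        server is, on average, busy for the given fraction of the time."""
        return math.ceil(total_demand / utilization)

    # Hypothetical: 100 companies, each averaging 2 server-units of demand.
    demand = 100 * 2

    print(servers_needed(demand, DEDICATED_UTILIZATION))  # 1000 servers
    print(servers_needed(demand, POOLED_UTILIZATION))     # 286 servers

Under those assumptions, pooling the same workload cuts the hardware footprint by more than two-thirds, which is the efficiency case utility computing proponents make.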

Commercial proprietary software also limits a company’s ability to
customize, Carr said, tying it to the vendor’s upgrade cycle and leaving
it waiting for new features that may or may not meet its needs. Open
source applications, he noted, give companies more flexibility, with
access to the underlying code and the ability to add and extend features.

“Growth in open source software in the enterprise is inevitable,” said
Hal Steger, vice president of marketing with Scalix, which offers a
Linux-based enterprise e-mail and calendaring application. “We get the kind
of instant feedback a proprietary software vendor isn’t going to get.”

The Scalix software is a hybrid of proprietary and open source code.
Steger said Scalix has benefited greatly from customers who offered
suggestions and helped extend the open source portions of its software.

“I see a mix of open source and commercial software [driving utility
computing], and it’s not clear open source will win. But the main thing is
that IT will get cheaper, more flexible services with open source in the
mix,” said Carr.

Open source also promises to give IT staffs the ability to differentiate
their infrastructures for competitive advantage. Carr said 60 percent of
IT labor is devoted to routine support and maintenance rather than to
leveraging the technology for advantage, and that routine maintenance is
a drain on management as well.

“In nearly every company, [the CIOs] are spending 70 to 90 percent on an
undifferentiated infrastructure and only about 20 percent on innovation,”
said Carr. “What you really want is to have those numbers flipped around.”

While Sun, HP, IBM and others tout utility computing and have pilot
projects underway, Carr said the concept and how it works need to become
clearer before significant adoption can be expected. He noted, for
example, that there is confusion about just what utility computing is:
related offerings are steeped in jargon, from grids, adaptive computing
and virtualization to organic computing, SaaS, autonomic systems,
on-demand and SOA.

“Many IT vendors pay lip service to on demand and utility computing, but
their business model would be turned on its head if pricing of IT was tied
to its true usage,” said Carr.
