NEW YORK — Gartner’s cloud expert and infrastructure guru Thomas Bittman said that how IT departments use cloud and virtualization technologies will help determine whether their companies succeed or fail.
“IT is a differentiator. IT does matter,” he said in a presentation at the Wall Street Tech Association’s seminar “Cost Effective IT: The Power of Virtualization.” Bittman’s theme played off that of Nicholas Carr’s book Does IT Matter?, which claimed that IT expertise is a commodity that will not determine a company’s success or failure.
Bittman said there’s one bottom-line metric that IT organizations can use as a self-assessment: what percentage of the budget is spent on maintaining existing infrastructure versus funding new projects. He said he had talked to one organization that was spending 80 percent of its entire IT budget just to stay in place. The norm, he said, is closer to 60 percent, and organizations at that level have room to innovate and can eventually get down to 40 percent.
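To make the metric concrete, here is a minimal sketch, in Python, of how an IT shop might compute its own maintenance-versus-innovation split; the budget figures are hypothetical and are not from Bittman’s presentation.

```python
# Illustrative sketch of the "run vs. grow" self-assessment metric Bittman described.
# All budget figures below are hypothetical examples.

def maintenance_share(maintenance_spend: float, new_project_spend: float) -> float:
    """Return the percentage of the total IT budget spent maintaining existing infrastructure."""
    total = maintenance_spend + new_project_spend
    return 100.0 * maintenance_spend / total

if __name__ == "__main__":
    spend = {"maintenance": 6_000_000, "new_projects": 4_000_000}  # hypothetical dollars
    share = maintenance_share(spend["maintenance"], spend["new_projects"])
    # 80% means running just to stay in place; ~60% is the norm; 40% leaves room to innovate.
    print(f"Maintenance share: {share:.0f}% of the IT budget")
```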
Although companies can improve, the laggards risk falling permanently behind. “The gap between the well managed and the badly managed is growing,” he said.
Gartner has a more complex self-assessment tool called the Infrastructure Opportunities Maturity Model, or IOMM, that Bittman said Microsoft has adopted.
The virtualization factor
Infrastructure maturity starts with virtualization. Bittman said that although only about 15 percent of servers are virtualized now, he expects that figure to grow to over half by 2012. The shift will certainly reshape the virtualization market. “It will change from one vendor in 2008,” he said, undoubtedly referring to market leader VMware, “to a competitive market with one vendor that had a very good foot in the door.”
However, he admitted, some business managers resist the idea of virtualization. Bittman said he knows of at least one company that has implemented virtualization but pretends to its line-of-business managers that their applications still reside on their own dedicated servers. “They implemented virtualization without telling their customers. They left the stickers on the servers and did not tell the line of business,” he said.
Virtualization certainly makes some things more complex. For example, it breaks traditional software licensing. Bittman warned that some software vendors will tout virtualization-friendly licensing, but IT managers should avoid these new pricing models because they are likely to cost more. “You cannot price software based on the power of the box it runs on if that software is flying around the datacenter,” he said.
Anyone who subscribes to these new pricing models will be subject to the whims of software providers who do not yet know how to price in the new market and are offering the models as an experiment. “Be a scientist, not a subject,” he said.
The adoption of virtualization will also hurt hardware sales at first. But as organizations finally begin to fully utilize the servers they have, it will eventually drive demand in the x86 server market to new heights, he said.
That’s because virtualization makes it faster and cheaper to stand up new servers, so business demand for them grows. In fact, server demand rises so quickly that IT administrators will need to keep a very close eye on usage and understand their costs.
He said that IT administrators should try to bill lines of business for the cost of the services they use, but acknowledged that this may not always be practical. “Managers are worried about virtual sprawl and need to create friction to prevent it. The decision to deploy a virtual machine must be a business decision.”
He warned that organizations that fail to control virtualization sprawl will see their virtual environments cost more than the client-server deployments they replace.
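A chargeback (or showback) report of the kind Bittman described could start as simply as the sketch below; the virtual machine inventory, flat monthly rates, and business units are hypothetical, and a real model would also account for storage, licensing, and support.

```python
# Minimal showback sketch: allocate monthly virtual machine costs to lines of business.
# The rates, VM inventory, and business units below are hypothetical examples.

from collections import defaultdict

# Assumed flat monthly rate per VM size.
MONTHLY_RATE = {"small": 150.0, "medium": 300.0, "large": 600.0}

# Hypothetical inventory: (vm_name, size, line_of_business)
VM_INVENTORY = [
    ("web-01", "small", "Claims"),
    ("db-01", "large", "Claims"),
    ("etl-01", "medium", "Underwriting"),
    ("dev-07", "small", "Underwriting"),
]

def monthly_chargeback(inventory):
    """Sum the monthly cost of each line of business's virtual machines."""
    bill = defaultdict(float)
    for _name, size, lob in inventory:
        bill[lob] += MONTHLY_RATE[size]
    return dict(bill)

if __name__ == "__main__":
    for lob, cost in sorted(monthly_chargeback(VM_INVENTORY).items()):
        print(f"{lob}: ${cost:,.2f} per month")
```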
Be on alert for ‘cowboy activity’
Indeed, Howard Fingeroth, vice president of infrastructure architecture at XL Capital (NYSE: XL), said that he brings in his virtualization vendor for periodic health checks to ensure there’s no “cowboy activity” in his company’s deployment — which consists of 500 virtual machines (plus 1,100 physical servers outside the virtual environment) supporting 70 offices in 27 nations.
Bittman said that organizations that succeed will adopt virtualization for the cost savings but keep deploying it for the agility it delivers. Fingeroth said that virtualization has enabled him to deliver high availability for applications that are not cluster-aware and has cut the time required to provision a server from four weeks to 2.5 days.
Fingeroth added that virtualization has let him make better use of storage (his operation includes iSCSI, NAS, and SAN technologies) and of servers, extending their useful life from four years to five. Finally, it has allowed him to cut infrastructure costs and reduce the number of system administrators he employs.
Cloud computing offers a spectrum of choices
Those who deploy virtualization will then go to the cloud, Bittman said, but they will do so in an evolutionary manner over the long term. “Nicholas Carr said that it would be a slow switch except in the title of his book, The Big Switch,” he said.
Bittman further noted that cloud computing is a spectrum of choices between on premises and hosted, between generic and customized — and other variables.
One factor hindering cloud adoption is the lack of interoperability and open standards, a situation not likely to be changed by the Open Cloud Manifesto. “It was well-meaning, but key players were not part of it. The cloud ecosystem is proprietary islands, and standards initiatives are driven by the little guys who need to federate to compete, but the little guys want to differentiate too and may not want open standards,” Bittman said.
Without open standards, most of the money in cloud computing will be made supporting private clouds, “but the private cloud is a stepping stone, not a destination,” he said.
He added that in the long term, organizations will have dozens or hundreds of clouds, each delivering one service, and integrators, who currently focus on supporting hardware and servers, will become service brokers.
IT organizations that thrive in this environment will know the cost of every service and will charge lines of business for what they use, he said.
He noted that line of business managers already have the option of paying for cloud services, and those public cloud prices will compete with an IT organization’s private cloud.
When choosing vendors, IT managers should not assume that the largest cloud will win. “The market will be Darwinian. It will consist of giants surrounded by smaller, hungrier providers,” he said.
The market is still very immature. If you want to do something you couldn’t do within your organization — such as run a batch process on a million servers for a minute — you cannot, because cloud providers cannot yet deliver that kind of service. “Amazon is chunky,” he said. “If you want more than 10 servers, you need to call them. You cannot order that on the Web.”
He concluded that it is already clear whose private cloud will be the biggest. “The government will spend billions of dollars per year and will have the biggest private clouds,” he said.