Enterprises virtualizing their infrastructure to cut costs could be in for a shock: unless they manage the performance of their business-critical applications, system performance may degrade and the savings may fall short of what they hoped for.
But that management is difficult because of a lack of tools that enable companies to look in on a virtualized environment and see what is going on, according to a study by Bojan Simic, an analyst at research firm Aberdeen.
The study, which surveyed 137 enterprises in December, found that companies are far more successful at improving the performance of business-critical applications when those applications run in a physical environment rather than a virtualized one. The reason, Simic told InternetNews.com in an e-mail, is that they lack adequate tools to do the same under virtualization.
“If organizations don’t have capabilities in place for the effective management of application performance, some of the benefits of virtualization could diminish,” Simic said.
As a result, Simic’s findings point to what may be a dark underside to a rare bright spot for IT: spending on virtualization is expected to grow in 2009, thanks to the idea that the technology can help reduce datacenter costs while improving efficiency and utilization.
But the study found that keeping business-critical applications at peak performance may actually be harder in virtualized environments. For example, when using tools to manage business-critical applications in physical environments, 62 percent of the enterprises Simic surveyed reported improved mean time between repairs (MTBR); in virtualized environments, that figure dropped to 32 percent.
Simic also found that enterprises identified performance issues before they impacted end users 85 percent of the time, but only in physical environments; for virtualized environments, the rate was 37 percent.
And 67 percent of respondents saw improved application response times when managing business-critical applications in the physical environment, against 39 percent in virtualized environments.
Wanted: Virtualization-friendly tools
Companies that have begun virtualization projects have complained about the lack of adequate tools — prompting a recent flood of new offerings from major players in the system management space, such as CA (NYSE: CA), IBM (NYSE: IBM), Hewlett-Packard (NYSE: HPQ) and BMC (NYSE: BMC).
But Simic said customers are reporting that many of those products don’t do a good enough job.
“Even though a lot of systems management vendors have capabilities for visibility into the virtual infrastructure, that is not enough to give you full visibility into application performance in virtual environments,” Simic said in an e-mail to InternetNews.com. “End users need capabilities that go beyond server monitoring and focus on application monitoring.”
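The distinction is worth making concrete. Server monitoring watches host-level counters; application monitoring measures what the user actually experiences, end to end. Below is a minimal sketch of the latter in Python, timing a synthetic transaction against a hypothetical endpoint (APP_URL is an assumption for illustration, not something named in the study):

```python
import time
import urllib.request

# Hypothetical endpoint for a business-critical application; any URL
# that exercises a real transaction path would serve the same purpose.
APP_URL = "http://app.example.com/orders/health"

def probe_response_time(url: str, timeout: float = 5.0) -> float:
    """Time one synthetic transaction against the application."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # drain the body so the full round trip is measured
    return time.monotonic() - start

if __name__ == "__main__":
    elapsed = probe_response_time(APP_URL)
    print(f"application response time: {elapsed * 1000:.1f} ms")
```

A host can report comfortable CPU and memory figures while a probe like this shows transactions taking seconds; that gap between the two views is the visibility Simic says end users are missing.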
That inability to look into virtualized environments means enterprises aren’t able to fully understand what is going on in their systems. For 55 percent of the survey’s respondents, this shortcoming represents the main obstacle to optimizing application performance in virtualized environments.
Providers of virtualization management tools agree that the problem is hard to tackle, simply because of the nature of virtualization.
“By decoupling the application from the hardware and having parts of it reside in virtualized environments, virtualization has changed the way we manage applications,” Mike Lough, vice president of marketing at virtualization management tools vendor BlueStripe Software, told InternetNews.com.
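The layer Lough describes shows up directly in the management APIs. A hypervisor can report, per virtual machine, how much CPU and memory each guest consumes, but nothing about the transactions running inside. Here is a minimal read-only sketch using the libvirt Python bindings (libvirt on a KVM/QEMU host is an illustrative assumption; the vendors named above each expose their own interfaces):

```python
import libvirt  # pip install libvirt-python

# Read-only connection to the local hypervisor.
conn = libvirt.openReadOnly("qemu:///system")
for dom in conn.listAllDomains():
    # info() -> (state, max memory KiB, memory KiB, vCPUs, CPU time ns)
    state, max_mem_kib, mem_kib, vcpus, cpu_time_ns = dom.info()
    print(f"{dom.name()}: {vcpus} vCPUs, {mem_kib // 1024} MiB, "
          f"{cpu_time_ns / 1e9:.0f} s CPU time")
conn.close()
```

Everything this loop prints treats the VM as a black box; which application, and which request, is burning that CPU time is precisely what it cannot say.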
That lack of visibility leads to another problem, one that virtualization was designed to solve — the increasing need for hardware and storage resources.
One of the key benefits of server virtualization has been that it lets a datacenter consolidate a large number of servers down to just a few, requiring fewer resources. Virtualization also promised to let enterprises spin up a new virtual machine during peak periods and shut it down afterward, easing the twin problems of overprovisioning and underutilization. Businesses have always had to stock spare servers to meet peak demand, only to leave those servers idle during off-peak hours.
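The economics rest on simple arithmetic: lightly loaded machines waste most of their capacity, and consolidation reclaims it. A back-of-the-envelope sketch with purely illustrative figures (none of them drawn from the Aberdeen study):

```python
import math

# Illustrative numbers only, not from the study.
physical_servers = 100
avg_utilization = 0.10     # typical pre-consolidation CPU utilization
target_utilization = 0.65  # comfortable ceiling per virtualization host
headroom = 1.25            # spare capacity kept for failover and peaks

# Total work, expressed in "fully busy server" units.
workload = physical_servers * avg_utilization  # 10.0

# Hosts needed to carry that workload at the target utilization.
hosts = math.ceil(workload * headroom / target_utilization)  # 20

print(f"{physical_servers} servers -> {hosts} virtualization hosts")
```

The savings hold only if utilization can actually be pushed toward the target; without visibility into what each application needs, administrators pad the headroom instead, and the consolidation ratio quietly erodes.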
Because of the lack of visibility into applications in the virtualized environment, however, overprovisioning and underutilization are becoming a problem again.
“With the lack of visibility, businesses can’t see what the problem is in performance, but they know they have a performance issue in one aspect or another,” BlueStripe’s Lough said.
“So they put in another server or throw more resources at the problem until it goes away. That leads to overprovisioning, which is completely contrary to the promise of virtualization.”
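One concrete alternative to that reflex: before provisioning another server, check whether the virtual machine is actually being starved by the hypervisor at all. On Linux guests, the “steal” field of /proc/stat counts time the VM was runnable but the host scheduled something else. A minimal, Linux-only sketch (the metric choice here is an illustration, not BlueStripe’s method):

```python
import time

def cpu_jiffies():
    """Return (total, steal) jiffies from the aggregate cpu line."""
    with open("/proc/stat") as f:
        values = [int(v) for v in f.readline().split()[1:]]
    # Fields: user nice system idle iowait irq softirq steal guest ...
    return sum(values), values[7]

total1, steal1 = cpu_jiffies()
time.sleep(5)
total2, steal2 = cpu_jiffies()

steal_pct = 100 * (steal2 - steal1) / (total2 - total1)
print(f"CPU steal over sample: {steal_pct:.1f}%")
```

Sustained steal in the double digits points at an oversubscribed host, where more capacity genuinely helps; steal near zero suggests the slowdown lives in the application, and another server would only add to the overprovisioning Lough warns about.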