SAN JOSE, Calif. — Server virtualization, the consolidation of multiple physical servers onto one powerful machine, has been a galloping trend inside major enterprises. So it stands to reason that storage virtualization fits the same pattern, given the exploding universe of user-generated digital content.
Storage virtualization was the topic of several session tracks here at the IDC Directions conference, the bi-coastal industry briefing hosted by the research firm. The technology is typically deployed in a storage area network (SAN).
Data appears in a single logical storage space, and the virtualization system handles the mapping to its actual physical location. That makes it very different from server virtualization, which is done primarily to consolidate a number of servers onto one machine to improve utilization.
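To make the mapping idea concrete, here is a minimal sketch of that logical-to-physical translation. All names and the table layout are illustrative assumptions, not any vendor's API; real SAN virtualization runs at the block level in firmware or a dedicated appliance, not in Python.

```python
# Minimal sketch of storage virtualization's logical-to-physical mapping.
# Names and structure are illustrative assumptions, not a real product's API.

class VirtualVolume:
    """Presents one contiguous logical address space; each block may live
    on any physical device the mapping table points to."""

    def __init__(self):
        # logical block number -> (physical device id, physical block number)
        self.mapping = {}

    def write(self, logical_block, device, physical_block):
        # The application sees only logical_block; the virtualization
        # layer records where the data actually landed.
        self.mapping[logical_block] = (device, physical_block)

    def resolve(self, logical_block):
        # On a read, translate the logical address to the physical one.
        return self.mapping[logical_block]

    def migrate(self, logical_block, new_device, new_physical_block):
        # Moving data (e.g., for disaster recovery or planned downtime)
        # only updates the table; the logical address never changes.
        self.mapping[logical_block] = (new_device, new_physical_block)


vol = VirtualVolume()
vol.write(logical_block=0, device="array-A", physical_block=4096)
vol.migrate(logical_block=0, new_device="array-B", new_physical_block=512)
print(vol.resolve(0))  # ('array-B', 512) -- same logical address, new home
```

Because applications only ever see the logical address, data can be moved between arrays without touching the application, which is what makes the disaster-recovery and planned-downtime scenarios discussed below possible.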
Storage and server consolidation are often done together at a company. IDC’s research found that 85 percent of firms deploying storage virtualization also adopt server virtualization, according to Rick Villars, vice president of storage system research for the firm.
“But they are done in silos, so no one can use the two in combination due to limitations of the hardware and the APIs,” he added.
Storage virtualization is often geared toward dynamic storage, disaster recovery and planned downtime, all designed to improve uptime and data availability. But, he added, storage virtualization isn’t just for large enterprises anymore. Increasingly, it’s becoming a tool for individuals and for plenty of consumer-facing applications.
“If you’re Flickr, people expect you to protect their photos forever. If they log in two years from now and they aren’t there, they’re going to be upset. Who’s protecting that?” said Villars.
This is bringing about the rise of what he called Content Depots, such as Flickr, YouTube and Google. Villars predicted that sites like those will consume 25 times as much storage space by 2010 as they do now. Great news for EMC. “We have some companies installing 100 terabytes a day, while others are tearing out 100 terabytes a week. A petabyte is now the entry point,” he said.
A second session reinforced this. Matt Eastwood, program vice president in the enterprise platforms group, said 22 percent of servers today are being virtualized, and that will grow to 45 percent in 12 months. Even more staggering: memory consumption is doubling every 12 months.
Right now, Eastwood projects there are 30 million physical servers, a number that will rise to 41 million by 2010. Most of those servers are greatly underutilized, running at less than 10 percent of capacity. He estimates there is a three-year oversupply of capacity worth $140 million.
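The consolidation math implied by those utilization figures is straightforward. A rough sketch, where the 80 percent target utilization is an assumed headroom figure for a virtualized host, not an IDC number:

```python
# Rough consolidation arithmetic from the utilization figures above.
# The 0.80 target utilization is an assumption for headroom, not IDC data.

physical_servers = 30_000_000    # Eastwood's current estimate
avg_utilization = 0.10           # "less than 10 percent of capacity"
target_utilization = 0.80        # assumed safe ceiling per virtualized host

consolidation_ratio = target_utilization / avg_utilization
hosts_needed = physical_servers * avg_utilization / target_utilization

print(f"{consolidation_ratio:.0f}:1 consolidation ratio")       # 8:1
print(f"{hosts_needed:,.0f} hosts could carry the same load")   # 3,750,000
```

Under those assumptions, roughly eight lightly loaded servers fit on one virtualized host, which is the economic pull behind the adoption pattern described next.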
But those servers are going to be put to work in virtualized environments, one way or another. John Humphries, program director for enterprise virtualization, said virtualization as a strategy goes through three stages. The first stage is pilot programs that are largely driven by resellers.
Stage two involves a greater expansion of the program as the company grows familiar with the concept, along with a decreased role for the VAR. Finally, at stage three, integrators take over the process and it becomes a company-wide effort.