Storage Change Leads to Big Headaches

Continued strong growth in data and the emergence of new ways to handle it will lead to major change and disruption in the storage market in coming years, industry analysts and experts predict.

The explosion of data — and the rise of laws and regulations governing its retention — have spurred new technologies such as information lifecycle management (ILM) for managing it. But with those new technologies have come new problems.

David Scott, president and CEO of 3PAR, says the challenge for users will be in managing the multiple distinct tiers of storage required to handle all the different categories of static and dynamic data.

“If every tier is stored on different platform architectures, each with unique data management tools, training requirements and separate, uncoordinated development paths, the storage administration and change management toll on customers will be extremely heavy,” he says. “This will drive up the total cost of ownership just as the economic imperative is to drive it down.”

Scott says the problem is not solved by storage resource management (SRM) tools, which simply add another layer and another component to manage. “I believe that a new generation of simple, efficient, and massively scalable tiered-storage arrays called utility storage can help customers escape this trap,” he says.

Scott also believes that with utility storage, tiering can occur cost-effectively within a single platform, with low administrative overhead and common data management facilities across all tiers. And, he says, costs associated with complexity and change management can be almost eliminated. “With utility storage tools such as dynamic optimization, data can be moved between tiers simply, online, and without disruption to running applications,” says Scott.
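The idea behind such a tiering policy can be sketched in a few lines of Python. This is an illustrative sketch only, not a description of 3PAR's dynamic optimization feature; the Volume class, tier names and thresholds below are hypothetical stand-ins for a policy engine that picks the cheapest tier still matching a volume's activity.

    # Illustrative sketch: a toy tier-placement policy. Volume, the tier
    # names and the thresholds are hypothetical, not any vendor's tool.
    from dataclasses import dataclass

    @dataclass
    class Volume:
        name: str
        reads_per_day: int        # recent access frequency
        days_since_modified: int  # staleness of the data
        tier: str = "fc"          # current placement

    def target_tier(vol: Volume) -> str:
        """Cheapest tier that still satisfies the volume's activity profile."""
        if vol.reads_per_day > 10_000:
            return "ssd"   # hot, latency-sensitive data
        if vol.days_since_modified > 90:
            return "sata"  # static data retained for compliance
        return "fc"        # everything else on midrange disk

    def rebalance(volumes: list[Volume]) -> None:
        for vol in volumes:
            tier = target_tier(vol)
            if tier != vol.tier:
                # In a real array this would be an online, block-level
                # migration, transparent to running applications.
                print(f"migrating {vol.name}: {vol.tier} -> {tier}")
                vol.tier = tier

    rebalance([
        Volume("oltp_db", reads_per_day=50_000, days_since_modified=0),
        Volume("mail_archive", reads_per_day=12, days_since_modified=400),
    ])

The point of the sketch is that placement decisions are made by policy against a shared pool, so moving a volume is a metadata and data-copy operation rather than a platform migration.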

Robert Skeffington, solutions specialist at Dimension Data, believes that having more data doesn’t present new problems; it just compounds the ones you already have.

“The lack of a good, quick to use, comprehensive management tool is making each day a little more painful,” says Skeffington. “The pressures on the IT department from both the internal and external financial communities, and more importantly, the bell on Wall Street, will remain the same: keep your costs down and improve your efficiency ratios.”

Skeffington says that more data generally requires more people to manage it, and more people managing it drives up costs.

“A good management tool moves the routine and/or repetitive tasks from the human to the tool, and this is true at any level of storage management,” he says.

Virtualization, Partitioning Loom Large

Virtualization, partitioning and wide-area access to storage are technologies that are gaining favor as solutions for easing storage management burdens.

Paul Schoenau, a senior product marketing manager at Ciena, believes it is critical to have ubiquitous access to storage regardless of where it is located or what type of storage it is. To create such enterprise-wide pools of resources, it is essential to consider the impact of the MAN/WAN network on the overall application.

“The MAN/WAN network must be designed to minimize the latency and interact with the storage protocols to account for delay distance between these pools of data,” said Schoenau.
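A back-of-the-envelope calculation shows why distance matters. Light in fiber travels at roughly 200,000 km/s, or about 5 microseconds per kilometer, so a write acknowledged synchronously across a 200 km metro link waits out roughly 2 ms of round-trip propagation delay before any protocol or equipment overhead. The distances in the short sketch below are illustrative assumptions.

    # Back-of-the-envelope propagation delay over fiber. Distances are
    # illustrative; ~5 microseconds/km is the usual rule of thumb for
    # light in fiber (refractive index ~1.5).
    US_PER_KM = 5  # one-way propagation, microseconds per kilometer

    def round_trip_ms(distance_km: float) -> float:
        return 2 * distance_km * US_PER_KM / 1000.0

    for km in (10, 100, 200, 1000):
        print(f"{km:>5} km -> {round_trip_ms(km):.1f} ms round trip")
    # A synchronous replication write pays this delay on every I/O, which
    # is why protocol choice and pool placement matter at MAN/WAN scale.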

Scott believes that fine-grain virtualization implemented within a storage array allows dramatic improvements in ease-of-use, utilization efficiency, internal tiering, performance and availability. “Heterogeneous virtualization, implemented in network-based appliances, allows inter-vendor replication and the data migration activities associated with storage asset retirement,” he says.

Scott believes that combining those approaches in a next-generation storage deployment architecture can provide a comprehensive solution. “However, attempts to extend either platform to deliver the benefits best associated with the other can actually weaken data availability or unnecessarily increase the overall cost of ownership,” he says.

According to Scott, partitioning is useful for both internal and external IT service providers because it allows different quality-of-service levels to be offered to specific customers or applications. “I believe that using fine-grain virtualization for partitioning, as opposed to the weaker alternative of hard physical partitioning, offers both higher utilization rates and a lower total cost of ownership,” he says.
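One way to picture the difference is a logical carve-out from a shared, virtualized pool, as in the hypothetical sketch below. The ServiceLevel and Partition classes, tenant names and quotas are invented for illustration and do not describe any specific product; the contrast with hard partitioning is that capacity is assigned by quota rather than by dedicating physical disks to each tenant.

    # Illustrative sketch of per-tenant quality-of-service partitions
    # carved from one shared pool. All names and figures are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class ServiceLevel:
        name: str
        max_iops: int
        raid: str

    @dataclass
    class Partition:
        tenant: str
        quota_gb: int
        service_level: ServiceLevel

    @dataclass
    class VirtualPool:
        capacity_gb: int
        partitions: list[Partition] = field(default_factory=list)

        def allocated_gb(self) -> int:
            return sum(p.quota_gb for p in self.partitions)

        def carve(self, tenant: str, quota_gb: int, level: ServiceLevel) -> Partition:
            # Logical carve-out: no spindles are dedicated to the tenant,
            # so utilization of the underlying pool stays high.
            if self.allocated_gb() + quota_gb > self.capacity_gb:
                raise ValueError("pool over-committed")
            part = Partition(tenant, quota_gb, level)
            self.partitions.append(part)
            return part

    gold = ServiceLevel("gold", max_iops=20_000, raid="RAID 10")
    bronze = ServiceLevel("bronze", max_iops=2_000, raid="RAID 5")

    pool = VirtualPool(capacity_gb=10_000)
    pool.carve("billing_app", 2_000, gold)
    pool.carve("dev_test", 1_000, bronze)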

Skeffington believes that the mainframe folks have had a handle on the solution longer than the “open” side has been in existence. “Let’s take a page out of their book,” he says. “The lack of the correct components in the open world is driving up management costs, again, because there is no tool.”

Continuity, Availability Become Critical

Today’s client-facing applications make round-the-clock availability and comprehensive business continuity critical corporate practices.

Scott says the need to increase productivity and win in a competitive marketplace will drive organizations to maximize the return they can get from the IT resources they can afford. In the process, they will inexorably become more dependent on those assets.

“The pressure to deliver increased flexibility and appropriately matched service levels at the lowest possible cost will rise,” says Scott. “Many organizations will be willing to take carefully judged risk to try new solutions that deliver these capabilities at levels which existing platforms cannot match. Only in that way will they be able to maintain leadership against competition in their own fields.”

Schoenau says the pressure to deliver availability and performance, reduce costs, improve service and minimize risk is growing.

“In that regard, remote networked storage over a secure 99.999% available network becomes an essential component of the solution,” Schoenau says. “The BC/DR applications must be designed with the right high performance to meet these availability objectives, yet enterprises still view these solutions as insurance. Therefore, it is critical to ensure that the optimal solution is put together that minimizes the MAN/WAN networking cost while meeting the performance requirements of the applications.”
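For context, "99.999% available" translates to only a few minutes of downtime per year; the arithmetic below is simple and carries no product assumptions.

    # What the "nines" mean in minutes of downtime per year.
    MINUTES_PER_YEAR = 365 * 24 * 60

    for label, availability in (("three nines", 0.999),
                                ("four nines", 0.9999),
                                ("five nines", 0.99999)):
        downtime = MINUTES_PER_YEAR * (1 - availability)
        print(f"{label}: {downtime:.1f} minutes of downtime per year")
    # Five nines comes to about 5.3 minutes per year, which is why the
    # network itself becomes part of the BC/DR design, not an afterthought.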

Skeffington notes that the portion of a corporation’s data that needs to be client-facing is but a small percentage of the total. IT departments will need to classify their data and protect all of it from a disaster, while raising the level of recovery on only the data that requires it, he says.

“A customer at a bank can live without an image of a check from three weeks ago but will demand a statement and access to their money,” says Skeffington. “These are two functions that are available to most clients today, but they are from completely separate systems in the background.”
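That banking example amounts to classifying data sets and attaching different recovery objectives to each class. The sketch below is a hypothetical illustration of that idea; the class names, recovery targets and methods are assumptions for the example, not Dimension Data recommendations.

    # Illustrative sketch: per-class recovery objectives. All names and
    # figures are hypothetical examples.
    from dataclasses import dataclass

    @dataclass
    class RecoveryPolicy:
        rpo_minutes: int   # how much data loss is tolerable
        rto_minutes: int   # how quickly service must be restored
        method: str

    POLICIES = {
        "client_facing": RecoveryPolicy(0, 15, "synchronous replication"),
        "internal":      RecoveryPolicy(240, 480, "asynchronous replication"),
        "archive":       RecoveryPolicy(1440, 4320, "offsite tape restore"),
    }

    def policy_for(dataset: str) -> RecoveryPolicy:
        # Example classification: balances and statements are client-facing,
        # scanned check images are archival.
        if dataset in ("account_balances", "statements"):
            return POLICIES["client_facing"]
        if dataset == "check_images":
            return POLICIES["archive"]
        return POLICIES["internal"]

    print(policy_for("statements"))    # aggressive objectives
    print(policy_for("check_images"))  # relaxed objectives, lower cost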
