NEW YORK — Data de-duplication may not be widely adopted just yet, but if a few contented users are any indication, it may not be long until the technology becomes a staple in IT environments.
At a panel of dedupe users at the Storage Decisions conference here yesterday, three users — representing three different dedupe vendors — gave the technology high marks, and had few if any complaints.
While not many are using the technology, an informal poll at the start of the session by moderator Arun Taneja of the Taneja Group suggested that it could begin to catch on soon.
Only three of about 50 in the audience said they’re currently using the technology — but half the room raised their hands when asked if they were considering it. Including the panel, that made six users total out of a roomful of storage professionals.
Chances are those numbers will go up after yesterday’s panel.
The first speaker, David Dunkers of architectural firm Skidmore, Owings and Merrill, said Data Domain’s (NASDAQ: DDUP) dedupe technology has given the firm a 31:1 de-duplication ratio, faster and more reliable restores, longer data retention times, and a shorter backup window.
“I’m able to sleep a lot better at night,” said Dunkers.
Gregg Paulk of the Anderson Center for Autism, a NEC Hydrastor user, claimed a 44:1 dedupe ratio. He said no backups have failed since the center deployed the technology, storage costs have dropped to less than 60 cents a gigabyte, and the system is easier to manage. Backups are now 90 percent faster, he said.
Chris Watkis of the Grey Healthcare Group, a FalconStor (NASDAQ: FALC) customer, said he had no confidence in data integrity or recoverability before turning to a FalconStor VTL.
“The results were overwhelming,” he said: a dedupe ratio of about 72:1 and big savings on storage costs.
“There were no records telling me what was being used,” he said. Now he uses FalconStor’s reporting function to determine when files are ready to go offline.
Taneja cautioned not to read too much into the numbers, since results vary with file type, and Dunkers and Paulk said it took a while to achieve theirs. Panel members also said the right solution depends on the use case: Dunkers needed a NAS device, while Paulk and Watkis were looking at backup and archiving.
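For context on what those ratios mean in capacity terms, here is a purely illustrative sketch; the 31 TB figure and the $3-per-gigabyte disk price are hypothetical assumptions, not numbers from the panel.

def effective_cost_per_gb(logical_gb, dedupe_ratio, cost_per_physical_gb):
    # Physical capacity actually consumed, and disk cost spread over the logical data.
    physical_gb = logical_gb / dedupe_ratio
    return physical_gb, physical_gb * cost_per_physical_gb / logical_gb

# Hypothetical example: 31 TB of backups at a 31:1 ratio fit in about 1 TB.
physical, cost = effective_cost_per_gb(31_000, 31, 3.00)
print(f"{physical:,.0f} GB on disk, ${cost:.2f} per logical GB")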
Watkis also stressed the importance of a good vendor relationship when choosing a new technology.
Watkis was the only one of the three to go through an upgrade cycle, and he said the firmware and software upgrade caused only one minor problem, a change to retention cycles. “I was very happy at the end of it,” he said.
Making Disk Backup a Reality
Dedupe was the subject of — or came up in — other sessions at the conference.
Curtis Preston, vice president of GlassHouse Technologies’ data protection services, said dedupe is making disk backup a reality.
“Disk backup requires dedupe to make it affordable,” he said.
But Preston echoed Taneja’s assertion that users need to find the right solution for them. “You must test it with your data,” he said.
He went over a number of options users need to consider, such as speed, local versus global de-duplication, post-process versus inline dedupe, and hashing versus delta differentials. Some data types, such as seismic and medical imaging files, aren’t well suited to dedupe, and some products can’t dedupe compressed data or multiplexed backups, he said.
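Preston didn’t walk through an implementation, but a minimal sketch of the hash-based approach, written here in Python under assumed simplifications (fixed 4 KB chunks, an in-memory chunk store, made-up sample data; real products use content-defined chunking, persistent indexes, and collision safeguards), shows why repeated data collapses to a single stored copy.

import hashlib

CHUNK_SIZE = 4096        # assumed fixed chunk size, for illustration only
store = {}               # fingerprint -> chunk bytes (stand-in for a dedupe index)

def write(data):
    # Split data into chunks, store each unique chunk once, return the "recipe."
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)   # a repeated chunk costs no extra space
        recipe.append(fp)
    return recipe

def read(recipe):
    # Reassemble the original bytes from the stored chunks.
    return b"".join(store[fp] for fp in recipe)

# Two hypothetical "backups" that share most of their content.
data1 = b"A" * 40_000 + b"unique tail 1"
data2 = b"A" * 40_000 + b"unique tail 2"
recipe1, recipe2 = write(data1), write(data2)
assert read(recipe1) == data1 and read(recipe2) == data2
logical = len(data1) + len(data2)
physical = sum(len(c) for c in store.values())
print(f"dedupe ratio of roughly {logical / physical:.1f}:1")

Because compression and multiplexing rearrange the byte stream, otherwise-identical data produces different chunks and different fingerprints, which is one reason some products struggle with those backups.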
In a session on virtualization, Stephen Foskett, director of Contoural’s data practice, said he didn’t think primary data de-duplication technology was ready for prime time just yet.
Only one in an audience of about 100 people said he was using primary data dedupe from NetApp (NASDAQ: NTAP) — and he said, “I love it.”
This article is courtesy of Enterprise Storage Forum.