
Data Domain battles duplication

Jared Heng | July 18, 2008
While not yet widely adopted by enterprises, data de-duplication technologies may gain greater attention as user understanding increases.

SINGAPORE, 15 July 2008 - With enterprises facing huge data growth, compliance issues, economic pressures and tape technology challenges, data de-duplication (also commonly known as de-dupe) has become more essential than ever.

De-dupe storage solutions provider Data Domain has launched its latest solution in the Asia-Pacific. Called DD690, the appliance delivers an aggregate throughput of up to 1.4 terabytes per hour and single-stream throughput of up to 600 gigabytes per hour, according to company officials, who said the appliance is primed to take advantage of enterprise trends toward de-duplication in storage operations.

De-duplication is the elimination of multiple copies of redundant data, said Vincent Loh, Data Domain's regional sales manager, Southeast Asia. While de-dupe technology is not yet widely adopted in the Asia-Pacific, enterprises in the region are starting to understand its importance as a cost-saving tool. He added that Data Domain is consequently seeking to educate Asia-Pacific enterprises about how de-dupe technology may contribute to business value.
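The principle Loh describes can be illustrated with a minimal sketch of content-hash de-duplication. This is not Data Domain's actual implementation; it simply shows the core idea that identical blocks of data are stored only once, with duplicates replaced by references to the stored copy. All names here are illustrative.

```python
import hashlib

def dedupe(blocks):
    """Store each unique block once, keyed by its SHA-256 digest.

    Returns the de-duplicated store and the ordered list of digests
    needed to reconstruct the original stream.
    """
    store = {}   # digest -> block payload, stored exactly once
    recipe = []  # ordered digests used to rebuild the stream
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = block
        recipe.append(digest)
    return store, recipe

# Three blocks, two of them identical: only two are physically stored.
blocks = [b"payroll-2008", b"payroll-2008", b"invoices-q2"]
store, recipe = dedupe(blocks)
print(len(store))   # 2 unique blocks kept
print(len(recipe))  # 3 references, enough to rebuild all the data
rebuilt = [store[d] for d in recipe]
print(rebuilt == blocks)  # True: the original stream is recoverable
```

Because backup data is highly repetitive (successive full backups share most of their content), replacing repeated blocks with small references is where the capacity savings come from.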

According to Gartner, less than 5 per cent of backups today use de-dupe technology. The research firm, however, expects such technology to be applied to 75 per cent of backups by 2012.

Protecting data integrity

A replicator feature in DD690 can automate WAN vaulting for use in disaster recovery, remote office backup, or multi-site tape consolidation, according to sources at Data Domain. Vaulting is the process of sending data off-site so that it remains protected if the primary site is compromised or lost.

Data Domain representatives also claimed that the solution can de-duplicate globally across remote sites, minimising the required bandwidth for data transfer. A graphical user interface is available to monitor data replication. Additionally, the solution further ensures data integrity by providing continuous verification during storage and recovery, they said.
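The bandwidth saving claimed above follows from the same hashing idea applied to replication: a site only transfers blocks the remote side does not already hold. The sketch below, with illustrative names and data, assumes the remote site's store is visible as a dictionary; in practice the sites would exchange digests over the WAN first.

```python
import hashlib

def replicate(blocks, remote_store):
    """Send only the blocks the remote site does not already hold.

    remote_store maps SHA-256 digests to block payloads already vaulted.
    Returns the number of bytes that actually crossed the WAN.
    """
    bytes_sent = 0
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in remote_store:
            remote_store[digest] = block  # simulate the WAN transfer
            bytes_sent += len(block)
    return bytes_sent

remote = {}
monday = [b"full-backup-monday"] * 4
tuesday = [b"full-backup-monday"] * 3 + [b"tuesday-changes"]
print(replicate(monday, remote))   # first sync sends one unique block
print(replicate(tuesday, remote))  # next sync sends only the new block
```

Since most blocks in a repeat backup are already vaulted, only the changed data travels over the link, which is why de-duplicated replication needs far less WAN bandwidth than shipping full copies.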

According to Loh, a good de-duplication solution should integrate easily with existing infrastructure and enterprise applications. It should also allow more backups and archived data to be stored while consuming less power. The ability to lower WAN costs is essential as well, he said.

"We have seen an increasing number of enterprises opting for capacity-optimisation storage systems," said Chan Chee Keong, Data Domain's senior area director, Asia sales. "It is difficult to measure adoption rates at this point, but growth in this segment has been very encouraging."
