In his recent blog post, “Deduplication – The Power of Flexibility,” Gary Parker discusses the importance of data deduplication and the trade-offs among the various deduplication options available in the market.
One point stood out: “for the highest performance levels, a recommended best practice is to use flexible deduplication policies to leverage post-process deduplication for the initial backup (for speed), and then switch to inline deduplication for subsequent backups.” I would like to expand on that, because it is an important element of a good deduplication implementation.
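To make the trade-off concrete, here is a minimal sketch of that policy. All names, the fixed-size chunking, and the in-memory stores are illustrative assumptions for this example only; they are not FalconStor's implementation, which would use variable-size chunking, persistent indexes, and compression. The idea is simply that inline deduplication hashes every chunk on the write path (slower ingest, immediate space savings), while post-process deduplication lands raw data at full speed and folds it into the dedup pool later.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking, for illustration only


class DedupeStore:
    """Toy chunk store contrasting inline and post-process deduplication."""

    def __init__(self):
        self.chunks = {}   # sha256 digest -> chunk bytes (the dedup pool)
        self.staging = []  # raw chunks landed without dedup (post-process)
        self.backups = {}  # backup name -> list of digests, or staging ref

    @staticmethod
    def _split(data):
        return [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]

    def backup_inline(self, name, data):
        """Hash and dedupe each chunk on the write path (slower ingest)."""
        digests = []
        for chunk in self._split(data):
            d = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(d, chunk)  # store only unseen chunks
            digests.append(d)
        self.backups[name] = digests

    def backup_post_process(self, name, data):
        """Land raw chunks at full speed; dedupe later via post_process()."""
        start = len(self.staging)
        chunks = self._split(data)
        self.staging.extend(chunks)
        self.backups[name] = ("staged", start, len(chunks))

    def post_process(self):
        """Background pass: fold staged chunks into the dedup pool."""
        for name, ref in list(self.backups.items()):
            if isinstance(ref, tuple) and ref[0] == "staged":
                _, start, count = ref
                digests = []
                for chunk in self.staging[start:start + count]:
                    d = hashlib.sha256(chunk).hexdigest()
                    self.chunks.setdefault(d, chunk)
                    digests.append(d)
                self.backups[name] = digests
        self.staging.clear()

    def restore(self, name):
        ref = self.backups[name]
        if isinstance(ref, tuple):  # not yet post-processed
            _, start, count = ref
            return b"".join(self.staging[start:start + count])
        return b"".join(self.chunks[d] for d in ref)
```

The quoted best practice then maps to: use `backup_post_process` for the large initial backup so ingest is not throttled by hashing, run `post_process` in the background, and use `backup_inline` for subsequent (mostly redundant) backups, where most chunks are already in the pool and dedup-at-ingest is cheap.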
Data volumes are doubling every 18 months. This rapid increase should come as no surprise when nearly everything we do has a digital element associated with it. When we text, we contribute one of 173 billion text messages sent every year. When we buy something, there is a digital transaction; Walmart alone posts 1 million customer transactions per hour. When we check up on our friends, we are one of 600 million Facebook users browsing through 40 billion photos. Apple iTunes recently delivered its 10 billionth download. Amazon now sells 180 Kindle books for every 100 hardcovers.
All of this data must live somewhere, and the challenges of storing, managing, and protecting it are spurring new approaches and architectures. Today, all of us at FalconStor find ourselves in the right time, in the right place, and among the right people to create those approaches and make the most of an unprecedented market opportunity.