Protecting enterprise data is more important than ever. Why?
For one, data is driving the digital transformation of modern enterprises. Second, it’s become clear that an organization’s second most critical asset, after its people, is its data. Third, enterprises, regardless of industry or vertical, are mining all of this data with analytics for new revenue streams.
So where does backup rank in all of this? It should rank pretty high, but often it doesn’t.
Let’s look at a few numbers. The amount of data stored by businesses increases by 60 percent each year. The digital universe more than doubles every two years and will balloon 10-fold, from 4.4 trillion gigabytes in 2013 to 44 trillion gigabytes in 2020.
This is an astounding amount of data that must be backed up and protected against hardware or software failure, malicious actors, human error, natural disasters, and the like. Some 45 percent of overall storage capacity is dedicated to backup and archive data, and 82 percent of companies keep at least 10 copies of their data backed up or archived.
Data protection is a top IT priority, but investments rarely align with intent
Given the critical importance of data in our hyper-connected age, it stands to reason that your data protection processes and technologies should be keeping pace. But in fact, even as the importance of data has grown exponentially for many companies, their approaches to backup and archive have not. It’s been a “set it and forget it” approach, which is fine as long as it works. The problem is that as the three V’s of data (volume, variety, and velocity) continue to expand in an enterprise, traditional backup and archive methods cease to work. And if they do work, they’ve become cripplingly expensive.
More often than not, a backup/archive technology reassessment is triggered by the high expense of maintaining and storing all of this backup data. These costs have, over time, grown into a massive line item in the budget. So there’s an inherent tension between the need to capture, store, and mine all of this data and the often-outdated processes and ever-growing expense of backing it up and archiving it.
In fact, from what Hedvig sees among our customers, traditional backup is still a top-three IT priority, yet the time, effort, and investment devoted to it lag behind all the other critical tools and processes that make up the modern data center. Part of this is because many modern storage technologies have data protections built in, and, while these aren’t strictly backup, they are pretty good methods of safeguarding data.
As such, there are now two distinct ways to think about backup. First, you can do it the way you’ve always done it, but employ a more modern, software-defined approach that yields many of the benefits mentioned above. Or, second, if you’re a progressive, bleeding-edge IT shop, you can reduce your reliance on backup by leaning on the data protection mechanisms built into many modern infrastructure tools and applications.
Yet even in this second case you wouldn’t want to eliminate backup entirely. Backup is like the mainframe: it’s never really going away. It is still the last resort and can be a business lifesaver in the event of a ransomware attack or a power outage. A number of companies have avoided expensive extortion attempts because they had backups and could restore to a previously known good state.
Given the importance and permanence of backup, let’s dig into some approaches that can help. In particular, we’ll discuss how software-defined storage can lower your traditional backup storage appliance costs by 60 to 80 percent.
Software-defined storage reinvents backup storage
New software-defined approaches, particularly as they relate to storage, can help. Put simply, a software-defined approach to backup storage provides the flexibility and scalability that legacy arrays can’t. It’s as easy as pointing existing backup software at a scale-out, software-defined storage target. Without any reengineering of the backup process, enterprises can realize significant reductions in capital and operating expenses, and they get an elastic cluster that grows (or shrinks) as data volumes change.
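As a concrete illustration, here’s a minimal sketch of what “pointing backup software at a new target” can look like when the scale-out cluster exposes an S3-compatible endpoint. The endpoint URL, bucket name, and credentials are assumptions made up for this example; in practice you would usually just change the storage target in your backup application’s configuration rather than script the upload yourself.

```python
import boto3

# Assumed S3-compatible endpoint exposed by the scale-out storage cluster.
s3 = boto3.client(
    "s3",
    endpoint_url="https://sds-cluster.example.local:9000",
    aws_access_key_id="BACKUP_USER",
    aws_secret_access_key="BACKUP_SECRET",
)

# Land a nightly backup archive on the cluster; the same bucket keeps
# growing as nodes are added, with no change to the backup job itself.
s3.upload_file(
    "/var/backups/db-nightly.tar.gz",
    "nightly-backups",
    "db/db-nightly.tar.gz",
)
```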
Because software-defined technologies are powered by commodity servers, you simply need to roll in new nodes or racks of nodes to power the backup cluster. You’ll never again need to face a forklift upgrade of your backup storage. Backup now becomes an area where organizations can blend existing investments with newer ways of managing systems and data.
With this modern approach, you could use a software-defined storage solution solely for backup and gain any number of advantages: customize storage to fit your service levels, protect data across sites and clouds, create point-in-time snapshots and clones, scale seamlessly with an elastic cluster, and integrate with pretty much any existing backup and archiving application. And all this with a solution that can cost 60 to 80 percent less than traditional backup storage appliances.
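To give a feel for the point-in-time snapshot piece, here is a hypothetical sketch of scripting a snapshot against a storage platform’s REST API. The endpoint path, payload, and response fields are invented for illustration and are not any particular vendor’s API; real platforms differ, but the pattern of a cheap, scriptable snapshot taken before risky changes is the same.

```python
import requests

# Hypothetical management endpoint; real storage platforms expose their own APIs.
STORAGE_API = "https://sds-cluster.example.local/api/v1"

def create_snapshot(volume: str, label: str) -> str:
    """Request a point-in-time snapshot of a volume and return its ID."""
    resp = requests.post(
        f"{STORAGE_API}/volumes/{volume}/snapshots",
        json={"label": label},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["snapshot_id"]

# Example: snapshot the backup repository before a maintenance window.
snap_id = create_snapshot("backup-repo-01", "pre-maintenance")
print(f"Created snapshot {snap_id}")
```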
Given the vast amount of data now captured, mined and analyzed for business insight, coupled with the relentless threat of cyber ransom, enterprises need a new approach to backup—whether that’s using software-defined storage only in backup or depending mostly on the data protection mechanisms in modern IT infrastructure.
Software-defined storage: Buy it for the scalability, love it for the deduplication savings
Secondary storage, including backup, can no longer afford to remain the siloed environment it has been for decades. Take, for example, typical deduplication backup appliances. Today you purchase such an appliance and, over time, you fill it up. You get great dedupe on that scale-up set of disk shelves. But when you buy and deploy a second appliance, you receive none of the deduplication benefits you saw in the first.
The siloed nature of traditional backup means you have to start over again. With sophisticated software-defined storage technologies, this isn’t the case. It’s possible to achieve inline, global deduplication that spans the entire distributed cluster, so your dedupe rates can actually improve as datasets grow.
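To make the global-deduplication point concrete, here’s a toy sketch of the underlying technique: chunk incoming data, hash each chunk, and consult one index shared across everything written to the cluster. It’s a simplification (real systems typically use variable-size chunking and a distributed index), not Hedvig’s implementation, but it shows why a second, largely identical backup costs almost nothing to store.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB fixed-size chunks (real systems often chunk variably)

class DedupStore:
    """Toy inline-deduplicating store with a single global chunk index."""

    def __init__(self):
        self.chunks = {}        # sha256 digest -> chunk bytes (the shared, cluster-wide index)
        self.logical_bytes = 0  # bytes clients have written
        self.stored_bytes = 0   # bytes actually kept after dedupe

    def write(self, data: bytes) -> list:
        """Store data inline, keeping each unique chunk only once; return chunk references."""
        refs = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in self.chunks:       # new content: store it once
                self.chunks[digest] = chunk
                self.stored_bytes += len(chunk)
            self.logical_bytes += len(chunk)    # a duplicate costs only a reference
            refs.append(digest)
        return refs

store = DedupStore()
backup = bytes(16 * 1024 * 1024)  # simulate a dataset that barely changes between backups
store.write(backup)               # first full backup
store.write(backup)               # next full backup: identical data, nearly free
print(f"dedupe ratio: {store.logical_bytes / store.stored_bytes:.1f}x")
```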
It’s time for software to eat backup storage
The bottom line, as we all know, is that things will always go wrong. That’s the guiding premise. But are you architecting your systems from the get-go to survive that failure? If you’re not architecting for failure, then you absolutely need backup, because it’s the first (or last, depending on how you look at it) line of data-protection defense.
Data in the modern enterprise continues to explode, software continues to eat the world, and backup is no exception. Thankfully, software technology that provides a flexible, scalable, efficient and cost-effective approach to data backup is now mature and robust enough to protect enterprises’ mushrooming amount of stored information.
Rob Whiteley, VP Marketing, Hedvig
Image Credit: Wright Studio / Shutterstock