With growing data come increased costs. Here are four recommendations from our Datacom cloud team to help you navigate and curb rising data costs.  

Know your data

The first step is to get visibility. Not all data is equal, so the better you know what you’ve stored and where, the more you can optimise and manage your costs. Without understanding what data you need to keep and why, it doesn’t matter how cost-effective your storage is – you’re likely storing (and paying for) data you don’t need.  

The next step is to determine what data is being accessed and analysed, so you can establish policies around archiving. There are tools you can use to help you understand how your data is accessed and consumed, allowing you to plan efficient storage, platform and archive policies that work long-term. 
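
As a minimal illustration of the kind of signal such tools provide (not any specific product), the sketch below walks a file share and buckets files by how recently they were accessed. The mount point and age thresholds are hypothetical assumptions for the example only.

```python
# Minimal sketch: bucket files by last-access age to inform archive policy.
# The path and age thresholds are illustrative assumptions, not recommendations.
import os
import time
from collections import Counter

def access_profile(root, buckets=((30, "hot"), (180, "warm"), (365, "cool"))):
    """Summarise bytes stored by last-access age bucket under a directory tree."""
    now = time.time()
    totals = Counter()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                info = os.stat(path)
            except OSError:
                continue  # skip files we can't read
            age_days = (now - info.st_atime) / 86400
            # First bucket whose threshold covers this file; otherwise "archive"
            tier = next((label for days, label in buckets if age_days <= days), "archive")
            totals[tier] += info.st_size
    return totals

# Example: profile a hypothetical file share mount point
for tier, size in access_profile("/mnt/shared").items():
    print(f"{tier}: {size / 1e9:.1f} GB")
```

Note that last-access times are only a rough proxy (some volumes are mounted with access-time updates disabled), so dedicated analytics tooling will give a more reliable picture.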

Protect your data – back it up

1. Get your backup and deduplication calculations right.

Even the smallest error here can blow out your costs – there are tools that can help you get calculations like rate of change and annual growth right (see the sketch after this list).  

2. Back up your servers for recovery, not retention.

For long-term retention and compliance needs, consider an archiving solution that ensures an optimal balance between storage costs and recovery of data.  

 

3. Ensure copies are optimised and tiered (no data rehydration required).

Backup storage uses deduplication, so you don’t end up storing multiple copies of similar data. We also recommend using technologies and platforms that integrate, so you only deduplicate once across the data’s lifecycle. This can keep bandwidth utilisation and costs down.  

4. Understand where encryption is being performed.

Encrypted data can’t be deduplicated, so every time data is encrypted you lose the ability to remove duplicates. In our experience, encrypting data before you back it up is the surest way to make your backup volumes balloon. Of course, regulatory requirements may mean you need to encrypt your data, so make sure you plan where encryption sits relative to deduplication to avoid storage blowouts.  

5. Choose a backup platform that allows you to change the underlying storage.

This will reduce costs as your data grows, without costly application migrations.  

6. Always have an offline copy (tape or vault).

Follow the best-practice 3-2-1 rule (three copies of your data, on two different types of media, with one copy offsite) for full data protection, and use storage that offers immutability to ensure your data can’t be compromised.  

7. Choose a backup platform where you pay for storage post-deduplication.

This means you benefit directly from the deduplication savings, and as your data grows, your storage becomes progressively more efficient.  
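
To make points 1 and 7 above concrete, here is a minimal sketch of the arithmetic involved. All figures (primary data size, rate of change, annual growth, retention window, deduplication ratio) are illustrative assumptions only; real sizing tools account for far more variables.

```python
# Minimal sketch of backup sizing arithmetic. All inputs are illustrative
# assumptions; real sizing exercises involve many more variables.

def backup_storage_tb(primary_tb, daily_change_rate, annual_growth_rate,
                      retention_days, dedupe_ratio):
    """Rough logical and physical (post-deduplication) backup footprint after one year."""
    # Primary data after one year of growth
    grown = primary_tb * (1 + annual_growth_rate)
    # One full copy plus daily incrementals kept for the retention window
    logical = grown + grown * daily_change_rate * retention_days
    # Deduplication reduces what is physically stored (and billed, on
    # platforms that charge post-deduplication)
    physical = logical / dedupe_ratio
    return logical, physical

# Example: 50 TB primary, 2% daily change, 20% annual growth,
# 30-day retention, 5:1 deduplication
logical, physical = backup_storage_tb(50, 0.02, 0.20, 30, 5)
print(f"logical: {logical:.1f} TB, physical (billed): {physical:.1f} TB")
```

Even in this simplified model, a small error in the rate-of-change or growth assumptions changes the result significantly, which is why getting these calculations right up front matters.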

Be smart about meeting compliance requirements

Compliance often requires you to keep data for longer, so technologies that can retain it cost-efficiently are crucial. The key is to deeply understand your compliance position.  

“Keeping everything forever is not only expensive, but time will also make that data redundant,” says Datacom Senior Solution Architect Stuart Machin.  

“Getting clarity on your real compliance requirements and what you need to meet these will enable you to optimise your backups for recovery and ensure you’re just keeping what you need.”  

Different types of data will have different compliance requirements, so solutions that integrate backup and archiving (for example, through tagged datasets) help ensure a consistent data management plan across all your data. 

Get additional value from your data

Many organisations are seeing backup data as a rich data-mining opportunity, rather than just another cost of doing business. If you’re considering taking advantage of this, we recommend choosing a backup platform that uses disk-based storage so that your insights tools can access data as needed.  

This article was created in partnership with Veritas.
