Making a full backup every day can consume a lot of time, and get very expensive pretty quickly. In order to minimize potential backup times while also saving money on storage media, backing up incremental changes is the only smart way to protect your data.
Incremental backup starts with the assumption that only a small portion of your data changes on any given day: typically, a company modifies less than 5% of its business data daily. If you can isolate these changes into a backup, you could theoretically cut storage costs and backup times by 95%.
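The 95% figure is simple arithmetic, as this back-of-envelope sketch shows. The sizes here are hypothetical, not drawn from any particular environment:

```python
# Back-of-envelope savings from backing up only changed data.
# All figures below are illustrative assumptions.
full_backup_gb = 2000          # assume a 2 TB full backup
daily_change_rate = 0.05       # assume ~5% of data changes per day

incremental_gb = full_backup_gb * daily_change_rate
savings = 1 - incremental_gb / full_backup_gb

print(f"Daily incremental size: {incremental_gb:.0f} GB")    # 100 GB
print(f"Saved vs. a daily full backup: {savings:.0%}")       # 95%
```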
One of the longest-standing debates within the backup space has to do with deciding which incremental data capture methodology is best: Differential Incremental or Cumulative Incremental?
What exactly does this mean? Well, incremental data changes can be captured in one of two ways.
The first – and most efficient – technique would be to perform a single full backup, and then only copy the data which has changed on a daily basis. This is often called the Differential Incremental approach. The main advantage of this approach is that it keeps the amount of backup storage to an absolute minimum.
Unfortunately, differential incremental can also be a logistical nightmare. If you do a full backup on day one and your server crashes on day 100, you'll need to load that first full backup along with the 99 daily incremental backups that followed it.
If any of those incremental backups is corrupted, the recovery could fail partway through. This added complexity also greatly increases the chance of human error during the recovery process.
Differential incremental backups also make recovery slow, since you have to load a lot of redundant data during the restore. For example, you might load several hundred gigabytes of temporary files, only to delete them immediately afterward. Or you might load dozens of versions of a frequently modified file when you only need the most recent one.
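To make the chain problem concrete, here is a minimal sketch of a differential-incremental restore. Backups are modeled as simple `{path: contents}` dictionaries (a hypothetical structure, not any real product's format), with `None` marking a deletion:

```python
# Sketch of a differential-incremental restore: load the full backup,
# then replay every daily incremental in order, oldest first.
def restore_differential(full_backup: dict, incrementals: list) -> dict:
    """Backups are modeled as {path: contents}; None marks a deletion."""
    state = dict(full_backup)
    for daily in incrementals:            # every copy in the chain is required
        for path, contents in daily.items():
            if contents is None:
                state.pop(path, None)     # file was deleted that day
            else:
                state[path] = contents    # file was added or changed
    return state

full = {"a.txt": "v1", "b.txt": "v1"}
days = [{"a.txt": "v2"}, {"b.txt": None, "c.txt": "v1"}]
print(restore_differential(full, days))   # {'a.txt': 'v2', 'c.txt': 'v1'}
```

Note that the loop must visit every incremental in the chain: a single corrupt or missing copy breaks the restore, which is exactly the fragility described above.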
There are two common ways to get around this problem.
One way is to perform full backups on a regular basis in order to cut down on the number of incremental backups which must be loaded in the event of a data disaster. Today, it’s common for companies to perform a full backup on the first of every month, followed by – at most – 30 daily differential incremental backups.
Another approach would be to consolidate all of the previous daily incremental backups into a single backup storage unit for easy recovery.
This is what the Cumulative Incremental approach tries to do. Every day, a backup is performed which copies only the data that has changed since the original full backup. If you need to recover your systems, you only need to load two backup sets: the original full backup and the latest cumulative incremental backup.
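Using the same hypothetical `{path: contents}` model, a cumulative-incremental restore collapses to a single merge of two artifacts:

```python
# Sketch of a cumulative-incremental restore: only two backup sets are
# needed. The {path: contents} structure is a hypothetical illustration;
# None marks a file deleted since the full backup.
def restore_cumulative(full_backup: dict, latest_cumulative: dict) -> dict:
    state = dict(full_backup)
    for path, contents in latest_cumulative.items():
        if contents is None:
            state.pop(path, None)    # deleted since the full backup
        else:
            state[path] = contents   # added or changed since the full backup
    return state

full = {"a.txt": "v1", "b.txt": "v1"}
cumulative_day_99 = {"a.txt": "v42", "b.txt": None}   # all changes so far
print(restore_cumulative(full, cumulative_day_99))    # {'a.txt': 'v42'}
```

However long the cycle runs, the restore path stays the same two steps, which is where the simplicity and reduced room for error come from.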
By simplifying backups in this manner, you can greatly speed up recovery while also reducing the potential for human error.
The downside of the cumulative incremental approach is that the required backup storage grows quickly: each day's backup contains everything that has changed since the last full backup, so the copies get larger every day.
Even if your data changes by only 10 GB per day, the cumulative backups you retain can add up to terabytes within a single month. Because of this, the cumulative incremental approach requires short cycles with frequent full backups in order to keep storage costs low.
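A quick calculation shows how fast this adds up. The figures are hypothetical and ignore compression and deduplication:

```python
# Rough comparison of total retained backup storage over one 30-day cycle,
# assuming a constant 10 GB of changed data per day (illustrative only).
daily_change_gb = 10
days = 30

# Differential incremental: each day's copy holds only that day's changes.
differential_total_gb = daily_change_gb * days

# Cumulative incremental: day N's copy holds all N days of changes,
# so retaining every daily copy adds up much faster.
cumulative_total_gb = sum(daily_change_gb * n for n in range(1, days + 1))

print(differential_total_gb)   # 300
print(cumulative_total_gb)     # 4650
```

Under these assumptions, a month of retained cumulative copies consumes roughly 4.65 TB against 300 GB for the differential chain, which is why cumulative cycles are kept short.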
Backup administrators have to fight a constant battle to minimize storage costs, backup windows, recovery times, and operational complexity. As data growth continues to accelerate, it quickly becomes apparent that a new approach is needed.
This new approach must have the efficiency and speed of a differential incremental backup, the simplicity of a cumulative incremental backup, and it should completely do away with the need for periodic full backups.
Thankfully, such a solution exists.
In recent years, the Progressive backup paradigm has changed the way IT administrators protect their data. With the progressive approach, you perform a single full backup. From that point on, you perform only daily differential incremental backups, which are sent to the backup server.
Once at the server, each incremental backup is merged with the previously stored versions to synthesize a new, up-to-date full backup. No matter how many times you repeat this process, you'll never have to perform another full backup again.
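The server-side merge step can be sketched with the same hypothetical `{path: contents}` model used above. This is an illustration of the "synthetic full" idea, not any vendor's actual implementation:

```python
# Sketch of the server-side "synthetic full" step: the previously stored
# full image is merged with the newest incremental to produce a new full,
# so the client never uploads another full backup. The {path: contents}
# structure is hypothetical; None marks a deletion.
def synthesize_full(previous_full: dict, new_incremental: dict) -> dict:
    merged = dict(previous_full)
    for path, contents in new_incremental.items():
        if contents is None:
            merged.pop(path, None)    # deletion recorded in the incremental
        else:
            merged[path] = contents   # new or changed file
    return merged

# Repeating this daily keeps a ready-to-restore full image on the server.
full = synthesize_full({"a.txt": "v1"}, {"b.txt": "v1"})
full = synthesize_full(full, {"a.txt": None})
print(full)   # {'b.txt': 'v1'}
```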
This is why the progressive paradigm is often referred to as "Incremental Forever."
If you ever need to recover your data, you simply load the appropriate version from the backup server as a single full backup. Not only does this speed up the recovery process, but it also greatly reduces the potential for human error.
If you’ve been struggling to control the runaway growth of your differential or cumulative incremental daily backups, you may want to consider evaluating a managed backup service that offers a progressive, “incremental forever” backup capability.
About The Author: Storagepipe Solutions is a leader in online server backup solutions, and offers many time-saving options for protecting business data.