Backup Best Practices for MSPs
In this article, we offer tips and backup best practices to help MSPs plan a backup strategy, decide where to store backup data, and tailor backup plans to different types of systems.
Backup Strategy Planning
To plan a backup strategy, you first need to categorize the various kinds of data that you will back up and decide how to prioritize them.
In most cases, the data that you need to back up can be divided into three main categories:
- System data consists of operating system files. Typically, system data is not altered frequently once the OS is set up and configured.
The main concern with system data is to perform a full backup after system deployment, and then update it only with the data that has changed since the last backup.
- Application configuration data consists of the configuration files for applications. Like system data, application configuration data is rarely modified after installation, and backup considerations are thus the same.
In most cases, you can easily reinstall an application, so you need to worry about backing up only its configuration data, rather than the application itself.
- Operational data consists of user documents and files, databases, mailboxes, etc. This data is critical for business, so it should be highly available and treated separately from other data classes.
You will need to recover the OS and applications before operational data can be restored and made available to users, so you can’t back up operational data alone.
A second classification is also worth considering. Based on your client’s business requirements and restore scenarios, define how granular and frequent the restore points should be for different data. There are three main tiers, each reflecting a different restore-time need:
- Hot - Data for which you’ll need the most granular and frequent restore points. Examples include user data that changes daily or hourly, such as emails or database files.
- Cool - Data that changes on a semi-regular basis, such as documentation or website content.
- Cold/Archive - Historical data stored primarily for compliance reasons, such as historical payment transaction logs.
Further reading Data Archiving 101: Methods and Storage to Use
Create a separate backup plan with a different schedule for each type of data. For instance, for hot data, your clients may be able to afford to lose one day’s worth of users’ work. In that case, the data should be backed up every day. But you might not need to back up system files and application configurations as often, so you would create a separate backup plan for that data -- for example, once per week. This way, your backups of the hot data will take less time and consume less storage than they would if you attempted to back up all of your data every day.
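As a rough illustration, this tiered scheduling can be sketched in Python. The tier names follow the hot/cool/cold classification above; the intervals are example values for illustration, not recommendations:

```python
from datetime import timedelta

# Illustrative mapping of data tiers to backup frequency (example values).
BACKUP_INTERVALS = {
    "hot": timedelta(days=1),    # daily: user documents, mailboxes, databases
    "cool": timedelta(weeks=1),  # weekly: system files, app configurations
    "cold": timedelta(days=30),  # monthly: archival data kept for compliance
}

def is_backup_due(tier: str, hours_since_last_backup: float) -> bool:
    """Return True if the given data tier is overdue for its next backup."""
    return timedelta(hours=hours_since_last_backup) >= BACKUP_INTERVALS[tier]
```

For example, 25 hours after the last backup, hot data is due for another run, while cool data is not.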
If you use the cloud for backup storage, use the cloud storage tier that corresponds to the data classifications above. Following these backup recommendations will help you control costs while guaranteeing the appropriate level of access to different types of backup data.
Backup Types
A full backup is a straight copy of an entire dataset. Thus, it is the most reliable type of backup. However, running a full backup every time you do a backup is inefficient, since full backups require more storage space and time to complete than other backup types.
In many cases, it makes sense to back up only data that has been created or altered since the most recent backup. That’s what an incremental backup does, uploading only the changed files to backup storage. Block-level backup uploads only the changed parts of a file, which makes it effective on large and constantly changing files.
Block-level backup in MSP360 Backup works efficiently for files larger than 1 MB where new data is appended at the end of the file (log files, .pst files, etc.).
Differential backup uploads all files that have changed since the first full backup. Thus, this type of backup gets larger and larger with each backup, and can eventually outgrow the initial full backup.
The typical use case for differential backup is SQL Server backup; the MSP360 backup solution likewise uses differential backup only for SQL Server. For other datasets, differential backup is usually not used.
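The file-selection logic that distinguishes full, incremental, and differential backups can be sketched as follows. This is a simplified model based on file modification timestamps; real backup tools also track archive bits, hashes, or block-level changes:

```python
def select_files(files, backup_type, last_backup_time, last_full_time):
    """Pick which files to upload for a given backup type.

    `files` maps a file path to its last-modified timestamp.
    - full:         every file, every time
    - incremental:  files changed since the most recent backup of any type
    - differential: files changed since the last *full* backup
    """
    if backup_type == "full":
        return set(files)
    if backup_type == "incremental":
        return {f for f, mtime in files.items() if mtime > last_backup_time}
    if backup_type == "differential":
        return {f for f, mtime in files.items() if mtime > last_full_time}
    raise ValueError(f"unknown backup type: {backup_type}")
```

Note how the differential set only grows between full backups, which is why it can eventually outgrow the initial full backup.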
Another backup type used for SQL Server is transaction log (or T-log) backup. It backs up the logs of data transactions that have occurred within a database. It is the most granular type of SQL Server backup and allows point-in-time restores, thereby providing maximum data protection. T-log backup is fast because only log files, and not the entire database, need to be backed up. However, it works only with databases (namely, Microsoft SQL Server) that support T-logs.
The synthetic full backup is an advanced, efficient way to perform a full backup without actually copying every piece of data. The data in the backup storage is compared to the data on the machine, and only the data that has changed since the last backup is copied from the machine to the backup storage. The rest of the information is replicated inside the storage to create a new full backup that reflects the latest state of the system you are backing up.
The synthetic full backup is most efficient for backing up large files, where a traditional full backup would take too long. MSP360 provides synthetic full backup for image-based backup.
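A minimal sketch of the idea, assuming the machine and the storage can both report per-block content hashes (an idealized model; real implementations track changed blocks differently):

```python
def plan_synthetic_full(machine_blocks, storage_blocks):
    """Split a new full backup into blocks to upload vs. blocks to reuse.

    Both arguments map a block offset to a content hash. Only blocks whose
    hash is missing or different in storage cross the network; identical
    blocks are replicated server-side inside the backup storage.
    """
    upload = {off for off, h in machine_blocks.items()
              if storage_blocks.get(off) != h}
    reuse = set(machine_blocks) - upload
    return upload, reuse
```

Because only the changed blocks travel over the network, the new full backup is assembled mostly from data already sitting in storage.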
For more detailed explanations and a simple chart to help you choose the backup type(s) that are best for your clients, refer to our comparison of backup types.
Although full backups require more time and space than the other backup techniques described above, remember to schedule full backups regularly, no matter how often you perform other types of backups. Not only are full backups the most reliable, but having an up-to-date full backup on hand leads to a faster restore process because full backups can be restored without having to perform the data reassembly required for restoring from other types of backups.
Backup Approaches
Just as there are multiple types of backups, there are multiple approaches to copying backup data.
File-level backup means making backup copies of individual files. You simply choose the files to back up, and then you copy them to a backup location. When you need to recover the files, you can also simply choose which ones to recover.
In most cases, it makes sense to use file-level backup for application configuration and operational data.
System state backup is a copy of crucial OS files that are necessary for the successful operation of a server, computer, or other systems. If you have a system state backup, you can recover your operating system in case it is damaged.
System state backup is best used to create a “snapshot” of a system prior to performing configuration changes. That way, you can roll back to the snapshot if the system needs to be returned to its previous state.
System image backup means creating an exact copy of all of your hard drive’s data. Some backup vendors, including MSP360, call this technology image-based backup.
Use a backup agent to create a bootable USB or ISO file for the bare-metal recovery of your system and data. It can be helpful in cases of hardware or software failure, disaster recovery, or restoring after a malware attack.
For more detailed explanations of ways to back up Windows computers, including both desktops and servers, refer to our guide to full system backup and recovery.
Backup Destinations: Local, Offsite, and Mixed Approach
Local storage is any type of storage that exists in the same physical location as production servers. It could take the form of internal or external hard disks that are used to store copies of backup data. Or, it could be a NAS device that is set up in your client’s office or a server dedicated to backup storage.
Further reading Local Storage as a Backup Destination
Offsite storage is any storage that exists on a different site from production systems. An external hard drive that is brought home counts as offsite storage. But cloud storage is a more realistic and reliable solution for serious offsite storage. Cloud storage provides benefits such as easy scaling and no hardware purchase expenses, but it also has drawbacks like bandwidth limitations and dependency on the Internet.
If you are considering moving large amounts of data to the cloud, but are worried about bandwidth speeds, check out cloud storage providers with initial upload options like AWS Snowball Edge for Amazon S3 or Backblaze Fireball for Backblaze B2.
Hybrid Storage Approach
You can get the best of both worlds by mixing local and offsite storage. This is an easy way to meet the requirements of the 3-2-1 backup rule, which says that you should:
- Create at least three copies of data.
- Keep these copies on two or more different media types.
- Keep one of the media in offsite data storage.
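The three conditions can be expressed as a small checklist function. This is a sketch; `copies` and its tuple layout are illustrative, not part of any real API:

```python
def satisfies_3_2_1(copies):
    """Check a list of backup copies against the 3-2-1 rule.

    Each copy is a (media_type, is_offsite) tuple, e.g. ("NAS", False)
    or ("cloud", True).
    """
    enough_copies = len(copies) >= 3          # at least three copies of data
    enough_media = len({m for m, _ in copies}) >= 2  # two or more media types
    has_offsite = any(off for _, off in copies)      # one copy kept offsite
    return enough_copies and enough_media and has_offsite
```

A local disk, a NAS, and a cloud copy together satisfy the rule; two copies, or three copies on the same media in the same building, do not.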
Running local and cloud backup routines at the same time places a heavy load on network bandwidth and system resources. To decrease the workload on a server or a user machine, look for a hybrid backup solution.
Backup Best Practices for Different IT Systems
Now, let’s review backup best practices for different components of an IT infrastructure.
File Server Backup
File servers contain a lot of crucial data and are typically the biggest source of backup upload volume. But you can achieve optimal data transfer by following a few backup recommendations:
- Divide the server disk into different partitions: Assign 30 – 50 GB to the operating system, and then create a different partition for storing the data. Since system files mostly remain unaltered after the initial configuration, it’s a good idea to separate them from working data.
- Back up the system partition using image-level backup, which is convenient for recovery. The use of the Synthetic Full Backup feature will diminish the time required to perform this type of backup.
- For the data partition, use the file-level plan with the Block-Level Backup option enabled. This will help to skip unmodified files and accelerate the upload process. Make block-level updates every night.
Microsoft Exchange and SQL Server
- Develop a backup schedule based on the RTO and RPO required by your client. The intervals between backups depend on how much data loss the business can tolerate. You can perform a transaction log backup as often as every few minutes, which allows you to perform a point-in-time restore and minimize data loss. For example, if losing more than 10 minutes of database changes is a problem, schedule your transaction log backups at least every 10 minutes.
- Use the proper edition of MSP360 Backup (Exchange, SQL Server or Ultimate for both Exchange and SQL Server) to separate and manage database files. That will make the backup process easier.
- Make sure that you store the database backup in a different location from the production database.
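The RPO-driven scheduling in the first point above amounts to simple arithmetic: a failure just before the next transaction-log backup loses at most one interval of changes, so running every RPO-worth of minutes bounds the worst-case data loss. A hypothetical sketch:

```python
def tlog_schedule(rpo_minutes):
    """Times (in minutes past midnight) at which to run T-log backups.

    Worst case, a failure happens just before the next backup, so the
    interval itself bounds the data loss; running every `rpo_minutes`
    keeps that loss within the client's RPO.
    """
    if rpo_minutes <= 0:
        raise ValueError("RPO must be positive")
    return list(range(0, 24 * 60, rpo_minutes))
```

With a 10-minute RPO this yields 144 transaction-log backups per day.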
Further reading Database Backup Best Practices for MSPs
Virtual Machine Backup
When the system you are backing up is a virtual machine, there are some special backup recommendations to keep in mind:
- Remember that VM snapshots are not the same thing as a backup. Don’t rely on snapshots alone to protect your clients’ data.
- Use Changed Block Tracking to back up VMware VMs more efficiently.
- Store application data on a separate virtual disk. That way, you can back up system files and application files separately.
Further reading VM Backup Best Practices for MSPs
User Data and General Tips
Usually, users’ work data is concentrated on desktop PCs and in app-related document folders. That’s why many IT specialists think that it is enough to back up these directories regularly and instruct users where to save the files. In practice, however, users don’t always stick to these rules, so you should follow additional backup best practices to protect user data.
- Some documents remain unchanged for a long time and then are suddenly updated with business-crucial information. If you use hybrid cloud backup and have a central backup server, enable the real-time backup feature in MSP360 Backup to update files right after modification. This reduces the RPO to nearly zero. To prevent access errors, configure MSP360 Backup not to copy files that are open in other apps.
- Specify appropriate retention settings depending on the type of data being backed up, the type of backup storage being used, and compliance regulations that might require data to be archived for a certain length of time. (Learn more in this article: Backup Retention and Scheduling Best Practices.)
- Use descriptive backup plan names so that you understand what exactly you are backing up. This makes it easy to recover data later.
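A retention policy like the one described above boils down to comparing restore-point ages against a cutoff. A simplified sketch (real tools also keep a minimum number of restore points and never break a full-plus-incremental backup chain):

```python
from datetime import datetime, timedelta

def select_expired(backups, retention_days, now):
    """Return restore points older than the retention window.

    `backups` is a list of datetime objects, one per restore point.
    The most recent restore point is always kept, even if it has
    technically expired.
    """
    if not backups:
        return []
    newest = max(backups)
    cutoff = now - timedelta(days=retention_days)
    return [b for b in backups if b < cutoff and b != newest]
```

For instance, with a 14-day retention window on January 31, restore points from January 1 and January 10 are flagged for deletion while the January 30 point survives.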
Find more backup recommendations in our article about file-level backup:
Further reading File Backup Best Practices for MSPs
Depending on which types of data you are backing up, which systems you are protecting, and what your clients’ RPO and RTO needs are, the strategies you follow to perform backup can vary widely. So can the types of backup storage you use.
However, no matter what your backup routines look like, there are a number of backup best practices that apply to virtually all backup scenarios. You’ll find a handy summary of them, along with more details on system-specific backup best practices, at the end of our whitepaper.