Preparing a proper database backup strategy is critical for every DBA: the plan must meet business requirements such as the Recovery Point Objective (RPO) and the Recovery Time Objective (RTO). I have consulted for many organizations on SQL Server issues, and one of the most common mistakes clients make is performing database backups (data or log) blindly, without checking how much data has actually changed. Most backup plans depend entirely on a schedule set by the DBA.
A recent client mentioned that their backup policy was as follows:
- Log Backup Every One Hour
- Differential Backup Once a Day
- Full Backup Once a Week
A fixed policy like the one above is not suitable for every type of database; some databases have transactional peak hours while others barely change.
Let’s get back to our main topic, the ‘Smart Backup Plan’. RTO and RPO do not depend solely on the backup schedule. Taking a log backup every hour does not guarantee that you can recover the database to within one hour of the failure time, and taking a full backup of a 1TB database that changed by less than 2GB in a week is not realistic: it simply wastes storage capacity. Have a look at the following figure:
As you can see, the full database backup is about 111GB, yet this client takes a full backup every day for only 30-100MB of daily data changes.
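To check how your own environment compares, you can review recent backup sizes from the backup history that SQL Server keeps in msdb (assuming the history has not been purged). A query along these lines lists the backups taken over the last week:

```sql
-- List recent backups with their sizes from msdb history.
Select bs.database_name,
       bs.backup_start_date,
       Case bs.type
            When 'D' Then 'Full'
            When 'I' Then 'Differential'
            When 'L' Then 'Log'
       End As backup_type,
       Cast(bs.backup_size / 1048576.0 As decimal(12, 2)) As [Backup Size MB]
From msdb.dbo.backupset As bs
Where bs.backup_start_date >= Dateadd(day, -7, Sysdatetime())
Order By bs.backup_start_date Desc;
```

Comparing the daily full backup sizes against the actual data-change volume quickly reveals whether the schedule is wasting storage.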
The following script takes a log backup only when 200MB or more of the log has been used. This technique reduces the number of log backups and shortens the database restore time.
Declare @LogSize as float;
Declare @LogSpace Table (
    [Database Name] sysname,
    [Log Size (MB)] float,
    [Log Space Used (%)] float,
    [Status] int
);

Insert Into @LogSpace
Exec ('DBCC SQLPERF (LOGSPACE)');

Select @LogSize = [Log Size (MB)] * ([Log Space Used (%)] / 100.0)
From @LogSpace
Where [Database Name] = 'SmartBackup';

If (@LogSize >= 200)
    Backup Log SmartBackup To Disk = 'NUL'; -- Perform some modifications as per your business requirements.
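Note that `Disk = 'NUL'` discards the backup and is only suitable for testing. In production you would write to a real file; one possible sketch, assuming SQL Server 2012 or later (for `FORMAT`) and a placeholder folder `X:\Backups`, builds a timestamped file name:

```sql
-- Build a timestamped backup file name; the folder path is a placeholder assumption.
Declare @FileName nvarchar(260) =
    N'X:\Backups\SmartBackup_'
    + Format(Sysdatetime(), 'yyyyMMdd_HHmmss')
    + N'.trn';

Backup Log SmartBackup To Disk = @FileName;
```

Timestamped names keep each log backup in its own file, which makes building a restore chain straightforward.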
The following script keeps track of modified data pages on the server to feed the ‘Smart Backup Plan’ solution:
-- Create the following table in a specific database.
Create Table DB_Modified_Data (
    Database_id int,
    [Modified Data MB] float,
    LastUpdate datetime Default Sysdatetime()
);

-- Run the following script in a job with a schedule of 10-15 seconds.
With BufferData As
(
    Select database_id,
           (Count(is_modified) * 8.0) / 1024.0 As [Modified Data MB] -- each page is 8KB
    From sys.dm_os_buffer_descriptors
    Where is_modified = 1
    Group By database_id
)
Merge DB_Modified_Data As Target
Using (Select * From BufferData) As Source
    On (Target.Database_id = Source.database_id)
When Matched Then
    Update Set Target.[Modified Data MB] += Source.[Modified Data MB],
               Target.LastUpdate = Sysdatetime()
When Not Matched By Target Then
    Insert (Database_id, [Modified Data MB])
    Values (Source.database_id, Source.[Modified Data MB]);

-- Flush dirty pages so they are not counted again on the next run.
Exec sp_MSforeachdb 'Checkpoint;';
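To inspect the accumulated figures, you can join the tracking table to sys.databases so the output shows database names rather than IDs:

```sql
-- Show the accumulated modified-page volume per database, by name.
Select d.name As [Database],
       m.[Modified Data MB],
       m.LastUpdate
From DB_Modified_Data As m
Join sys.databases As d
    On d.database_id = m.Database_id
Order By m.[Modified Data MB] Desc;
```

This is a convenient way to verify that the tracking job is running and to pick sensible thresholds per database.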
The following script takes a full backup once the accumulated page modifications exceed a certain threshold (1GB in this example):
Declare @Size float;

Select @Size = [Modified Data MB]
From DB_Modified_Data
Where Database_id = DB_ID('SmartBackup');

If (@Size >= 1024.0)
Begin
    Backup Database SmartBackup To Disk = 'NUL'; -- Needs some modifications as per your business requirements.
    Update DB_Modified_Data
    Set [Modified Data MB] = 0,
        LastUpdate = Sysdatetime()
    Where Database_id = DB_ID('SmartBackup');
End
Else
Begin
    Print Cast(@Size as varchar(20)); -- Below threshold: report the current modified volume.
End
As mentioned earlier, RTO and RPO do not depend solely on the backup schedule. A smart backup plan achieves the following objectives:
- Takes a log backup only when more than 200MB of log space is used.
- Reduces I/O and CPU consumption (for both log and data backups).
- Reduces the storage space used (for both log and data backups).
- Reduces the number of log backup sets, which improves restore performance.