Getting Your SQL Server Backup Footprint

Recently, I needed to determine the total size of the most recent full backups for certain servers. Luckily for me, these particular servers were using Ola Hallengren’s Maintenance Solution, which simplified things since each backup had its own specific directory. PowerShell has the flexibility to easily provide what I need with little effort.

Let’s take a look!

First, I usually prefer to have production backups stored separately from non-production backups; however, in this case that type of structure wasn’t feasible. All of the servers backed up to the same location, including DR servers. So, I needed a way to specify exactly which directories I wanted to search for the FULL backup files, since the task was to provide this information only for production databases. To facilitate this, I used a simple array of server names. Remember that with Ola’s solution, each server has its own parent folder, effectively segregating the folders I want from the folders I don’t need.

#sum of most recent full backup files
$srvs = @("server1","server2","server3")

Now that I know which folders I’m going to look through, I can use a foreach loop to iterate through the array.  Essentially, I want PowerShell to do something for each server listed within the server array.
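The skeleton of the loop looks like this; the body gets filled in over the next few steps:

```powershell
foreach ($srv in $srvs){
    # build the path, find the .bak files, and tally their sizes here
}
```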

Next, I need to build out the path to go look for.

$path = "D:\backups\$($srv)"

Once the path is built, I can use Get-ChildItem (or its alias “gci”) to recurse through the path created above.  Get-ChildItem is then piped into the Where-Object cmdlet, where I can specify search filters; in this case, a file extension of “.bak” and a last write time greater than one day ago.  The resulting set of file objects is placed into the $x variable.

$x = Get-ChildItem -Path $path -Recurse | Where-Object {($_.Extension -eq ".bak") -and ($_.LastWriteTime -gt (Get-Date).AddDays(-1))}

Keep in mind that the $x variable only holds the list of files for whichever server the foreach loop is currently processing.  Remember that we are processing the full backup files for each server individually.  Since $x holds objects in PowerShell, we can use the Measure-Object cmdlet to sum up the Length property of all the files.  This gives us the total size of all of the backup files created within the past day for that given server.

[long]$amt += ($x | Measure-Object -Property Length -Sum).Sum

We also want to retain the total from each iteration, adding it to the previous amount.  We do this by declaring another variable, $amt, and using PowerShell’s addition assignment operator (+=), which adds the sum to whatever value $amt currently holds.  In SQL Server T-SQL terms, this is “SET @X = @X + @sum”.  Note that we also declare the $amt variable as a “long”, since the value from the sum is in bytes and will most likely be a large number.
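Since the running total is in bytes, you may want to convert it into something more readable before reporting it. A quick sketch, using an example value of 100 GB in bytes (PowerShell understands 1GB as a built-in numeric literal):

```powershell
[long]$amt = 107374182400            # example only: 100 GB expressed in bytes
$gb = [math]::Round($amt / 1GB, 2)   # divide by the 1GB literal and round to 2 places
"$gb GB"                             # -> 100 GB
```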

Here is the entire script.  You can also find modifications on my GitHub account.

#sum of most recent full backup files
$srvs = @("server1","server2","server3")
[long]$amt = 0
foreach ($srv in $srvs){
    $path = "D:\backups\$($srv)"
    $x = Get-ChildItem -Path $path -Recurse | Where-Object {($_.Extension -eq ".bak") -and ($_.LastWriteTime -gt (Get-Date).AddDays(-1))}
    $amt += ($x | Measure-Object -Property Length -Sum).Sum
}
$amt

Summary

There are many ways to skin the proverbial cat, but this is just a quick and easy way to determine how big all of your backup files might be.  You can find this information via T-SQL as well; however, this method allows you to quickly process multiple servers, assuming of course that all of the backup files are in the same location.  Of course, there very well could be a more efficient way to accomplish this.  I am not a PowerShell expert by any means, so if you have done something similar, please share your story with me!

Keep in mind that this solution was specifically geared around utilizing Ola Hallengren’s Maintenance Solution.  If you are using a different file structure for your backup files, you will need to adjust the script accordingly.

Got a better way?  Should I have done something differently?  Shout it out!


© 2019, John Morehouse. All rights reserved.
