
Restore transaction log backups on MS SQL Server using PowerShell

While recently working on a pipeline, I needed to restore multiple transaction log backups on an MS SQL server. The files were located under R:\files\incremental.

You can use the PowerShell script below to perform the restore automatically.

$folder_for_cleanup = "R:\files\incremental"
Get-ChildItem $folder_for_cleanup | Sort-Object -Property FullName | ForEach-Object {
    Write-Host "restoring $($_.FullName)"
    # restore the transaction log and write the sqlcmd output to a log file
    sqlcmd -Q "RESTORE LOG [Database_Name] FROM DISK=N'$($_.FullName)' WITH NORECOVERY" -o R:\files\results\incrementalresult.txt
    [string]$result = Get-Content R:\files\results\incrementalresult.txt
    if ($result.Contains('terminates')) {
        Write-Host "backup is already present in the database, skipping..."
        Remove-Item $_.FullName -Force -Confirm:$false
    }
    else {
        Write-Host "successfully restored $($_.FullName)"
        Remove-Item $_.FullName -Force -Confirm:$false
    }
}

The above PowerShell loops through the transaction log backups in the folder and tries to restore each one into the database. If a log has already been restored it is skipped; otherwise it is restored. In both cases the backup file is then deleted.
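Note that the logs are restored WITH NORECOVERY, which leaves the database in a restoring state so that subsequent logs can still be applied. Once the last log has been applied, the database can be brought online with a final recovery step; a minimal sketch, using the same [Database_Name] placeholder as above:

# bring the database online after the last transaction log has been applied
sqlcmd -Q "RESTORE DATABASE [Database_Name] WITH RECOVERY"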


Test your backup mechanism – Automated restore for MS SQL using Azure DevOps

Most organizations rely on their backup solutions for application faults or data corruption. However, backups are rarely tested to verify that a restore would actually succeed. In this post I implement a backup testing mechanism.

Let's examine a scenario for an MS SQL database server. The server outputs a backup file (.bak) to a storage account based on a retention policy. This backup is automatically restored on a SQL server through a pipeline, and a result is written as an output. The result can then be reported to the monitoring solution.

The flow is depicted below. An Azure DevOps agent should be installed on the server on which the database will be restored. The pipeline fetches the backup file from the storage account and stores it on a data disk (in my case R:\files). The sqlcmd command is then used to restore the .bak file and record the result. The backup file name is provided as a pipeline parameter. A service connection should also be created for the subscription in which the storage account is located.

Pipeline code:

trigger: none
pr: none
pool:
  name: Default # self-hosted agent pool containing the agent installed on the database server
parameters:
  - name: backupfile
    type: string
jobs:
  - job: download
    displayName: Download DB backup file
    steps:
      - task: AzureCLI@2
        displayName: az cli download backup file from storage account
        inputs:
          azureSubscription: 'Azure-Service-Connection'
          scriptType: 'ps'
          scriptLocation: 'inlineScript'
          inlineScript: |
            $container_name_input = "container_name"
            $saccount_name = "storage_account_name"
            #$json = az storage blob list --container-name $container_name_input --account-name $saccount_name
            az storage blob download --file "R:\files\${{parameters.backupfile}}" --name "${{parameters.backupfile}}" --container-name $container_name_input --account-name $saccount_name --auth-mode login
  - job: restore
    displayName: Restore SQL backup
    dependsOn: download
    steps:
      - task: PowerShell@2
        displayName: sqlcmd restore backup
        inputs:
          targetType: 'inline'
          script: |
            sqlcmd -Q "RESTORE DATABASE [Database_Name] FROM DISK=N'R:\files\${{parameters.backupfile}}' WITH REPLACE, RECOVERY" -o R:\files\result.txt
            [string]$result = Get-Content R:\files\result.txt
            if ($result.Contains('successfully')) {
              Write-Host "Restore was successful..."
            }
            elseif ($result.Contains('terminating')) {
              Write-Host "Terminating..."
            }
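Since the blob download uses --auth-mode login, the identity behind the service connection needs a data-plane role such as Storage Blob Data Reader on the storage account (or container). A minimal sketch of granting that role, with placeholder values for the principal and scope:

# assumption: placeholder object id and scope - grant blob read access to the service connection's principal
az role assignment create --assignee "<service-principal-object-id>" --role "Storage Blob Data Reader" --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/storage_account_name"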

Executing pipeline:
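The pipeline can be queued from the Azure DevOps portal, or, as a sketch, with the Azure DevOps CLI extension (pipeline and backup file names below are placeholders):

# assumption: placeholder pipeline and file names - queue the pipeline with the backupfile parameter
az pipelines run --name "Automated-Restore" --parameters backupfile=Database_Name_full.bak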

Result:

Important:

The Azure DevOps agent service is configured to run with a specific account (in my case NT AUTHORITY\SYSTEM). This account should have the appropriate permissions on the SQL server for the restore procedure. The easiest way is to make this account a sysadmin on the SQL instance.

Adding NT AUTHORITY\SYSTEM to the SQL Server sysadmin role
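The same can also be done with T-SQL; a minimal sketch run through sqlcmd:

# add the agent service account to the sysadmin server role
sqlcmd -Q "ALTER SERVER ROLE [sysadmin] ADD MEMBER [NT AUTHORITY\SYSTEM]"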