
Restore transaction log backups on MS SQL Server using PowerShell

While recently working on a pipeline, I needed to restore multiple transaction log backups on an MS SQL Server. The backup files were located under R:\files\incremental.

You can use the PowerShell script below to perform the restores automatically.

# Folder that contains the transaction log backup files
$folder_for_cleanup = "R:\files\incremental"

Get-ChildItem $folder_for_cleanup | Sort-Object -Property FullName | ForEach-Object {
    Write-Host "restoring $($_.FullName)"
    # Restore the log backup and write the sqlcmd output to a log file
    sqlcmd -Q "RESTORE LOG [Database_Name] FROM DISK=N'$($_.FullName)' WITH NORECOVERY" -o R:\files\results\incrementalresult.txt
    [string]$result = Get-Content R:\files\results\incrementalresult.txt
    if ($result.Contains('terminates')) {
        # The 'terminates at LSN ... too early' error means this log has already been applied
        Write-Host "backup is already present in the database, skipping ..."
        Remove-Item $_.FullName -Force -Confirm:$false
    }
    else {
        Write-Host "successfully restored $($_.FullName)"
        Remove-Item $_.FullName -Force -Confirm:$false
    }
}

The above PowerShell loops through the transaction log backups in order and tries to restore each one to the database. If a log has already been applied it is skipped; otherwise it is restored. In both cases the backup file is removed afterwards.
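
Because each log is restored WITH NORECOVERY, the database stays in the restoring state when the loop finishes. To bring it back online you can run a final recovery step; a minimal sketch, reusing the same [Database_Name] placeholder:

# Bring the database online once all transaction log backups have been applied
sqlcmd -Q "RESTORE DATABASE [Database_Name] WITH RECOVERY"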


Build Node.js applications using Azure DevOps and npm

In this article I will explain how you can build your Node.js application on Azure DevOps. For the purposes of this demo I have created a hello world JavaScript application and stored it under a folder called node inside my repository, which is named project.

We will use the folder structure later on for the build tasks.

The node folder contains two files, index.js and package.json. 

The package.json file was created automatically after initializing a new project with

npm init
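
If you prefer to skip the interactive prompts, npm can also generate a package.json with default values; an optional shortcut:

# Generate a package.json with default values, without any prompts
npm init -y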

In order to build this project using npm, you should define under the scripts section the commands that will be executed for each target.

For example, in order to build the application I will execute

npm run build

but the build instructions have to be configured under the package.json file. 

{
  "name": "nodetest",
  "version": "1.0.0",
  "description": "Test project for AzureDevops using nodeJS",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "dev": "node index.js",
    "build": "node index.js"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "express": "^4.18.1"
  }
}

As shown above, the build and dev targets have a simple node command that will start the application.

The index.js file contains a single console output command.

console.log("Hello from node js app");

In order to build this Node.js app using Azure DevOps I will use the built-in Npm@1 task. First I will execute the install command to install the dependencies. Then I will run npm run with the build target, which executes node index.js and starts the application.

trigger:
- none

pool:
  vmImage: ubuntu-latest

steps:
- task: Npm@1
  displayName: npm install
  inputs:
    command: 'install'
    workingDir: '$(Build.SourcesDirectory)/node/'
- task: Npm@1
  displayName: npm run build
  inputs:
    command: 'custom'
    workingDir: '$(Build.SourcesDirectory)/node/'
    customCommand: 'run build'
After the pipeline run, the hello world message is printed in the output.



Add a Log Analytics workspace to an Azure App Service – Terraform

Most of the time you will need to store logs for your Azure resources in order to troubleshoot when things do not work as expected. Diagnostic settings for an App Service can be enabled from its Diagnostic settings pane under Monitoring.

Then you should configure a diagnostic setting that specifies which logs should be forwarded to the Log Analytics workspace.

You can choose from the available categories shown below.

Let's now see how we can enable diagnostic settings for an App Service using Terraform.

Create a file, for example diagnostic_settings.tf, and apply it. The configuration below enables all the log categories.

resource "azurerm_monitor_diagnostic_setting" "diag_settings" {
  name               = "diag-settings"
  target_resource_id = azurerm_windows_web_app.app_service1.id
  log_analytics_workspace_id = local.log_analytics_workspace_id
  
  log {
    category = "AppServiceHTTPLogs"
    enabled  = true

    retention_policy {
      enabled = false
    }
  }

    log {
    category = "AppServiceConsoleLogs"
    enabled  = true

    retention_policy {
      enabled = false
    }
  }

    log {
    category = "AppServiceAppLogs"
    enabled  = true

    retention_policy {
      enabled = false
    }
  }

    log {
    category = "AppServiceAuditLogs"
    enabled  = true

    retention_policy {
      enabled = false
    }
  }

    log {
    category = "AppServiceIPSecAuditLogs"
    enabled  = true

    retention_policy {
      enabled = false
    }
  }

     log {
    category = "AppServicePlatformLogs"
    enabled  = true

    retention_policy {
      enabled = false
    }
  }

  metric {
    category = "AllMetrics"

    retention_policy {
      enabled = false
      days = 30
    }
  }

}

You can also achieve the same result using a loop and a local variable in order to minimize the code and make it more readable.

Assign a new variable inside your locals.tf file.

log_analytics_log_categories = [
  "AppServiceHTTPLogs",
  "AppServiceConsoleLogs",
  "AppServiceAppLogs",
  "AppServiceAuditLogs",
  "AppServiceIPSecAuditLogs",
  "AppServicePlatformLogs",
]

Then apply the configuration below, which loops over the categories with a dynamic block, and run terraform apply.

resource "azurerm_monitor_diagnostic_setting" "diag_settings" {
  name               = "diag-rule"
  target_resource_id = azurerm_windows_web_app.app_service1.id
  log_analytics_workspace_id = local.log_analytics_workspace_id
  
  dynamic "log" {
    iterator = entry
    for_each = local.log_analytics_log_categories
    content {
        category = entry.value
        enabled  = true

        retention_policy {
      enabled = false
        }
    }
   
  }

  metric {
    category = "AllMetrics"

    retention_policy {
      enabled = false
      days = 30
    }
  }

}

After applying Terraform, all the log categories will be enabled for the App Service.
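
For reference, local.log_analytics_workspace_id is not defined in the snippets above. A minimal sketch of how it could be populated, assuming a hypothetical Log Analytics workspace resource (names such as log_analytics1, law-demo and azurerm_resource_group.rg1 are placeholders):

# Hypothetical workspace and local referenced by the diagnostic setting above
resource "azurerm_log_analytics_workspace" "log_analytics1" {
  name                = "law-demo"
  location            = azurerm_resource_group.rg1.location
  resource_group_name = azurerm_resource_group.rg1.name
  sku                 = "PerGB2018"
  retention_in_days   = 30
}

locals {
  log_analytics_workspace_id = azurerm_log_analytics_workspace.log_analytics1.id
}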