
Pass parameters from Power Automate to an Azure DevOps pipeline using the REST API

Recently I had to implement the scenario that is depicted below.

In more detail, I had to collect user input (usernames), pass this information to an Azure DevOps pipeline, and through this pipeline perform some actions on Azure with the az CLI.

For the described solution I used the below services:

  • Azure DevOps
  • Power Automate
  • Azure DevOps REST API
  • Azure

The first thing I created was the form. In this form the user provides the usernames in a requested format so that this information can be passed on to the later components.

Then I created a new Power Automate flow that handles the input of this form and makes a POST request to the Azure DevOps REST API in order to trigger a build pipeline with the form parameters as input.

The flow and the tasks that were used are depicted below.

Select the response ID of the form.

In the POST request you should enter your own details for the pipeline ID, organization and project. The body of the request should be formed as shown so that the parameters are parsed correctly.
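For reference, a request of this shape targets the pipeline Runs endpoint, POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=6.0, and passes the form output through templateParameters. This is a sketch: the placeholders, the api-version and the example usernames are assumptions you should replace with your own values.

```json
{
  "templateParameters": {
    "users": "[user1@contoso.com, user2@contoso.com]"
  }
}
```

The users value maps to the users object parameter defined in the pipeline.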

The Azure DevOps pipeline will have an empty object as its input parameter.

trigger: none
pr: none 
pool:
  vmImage: windows-latest

parameters:
  - name: users
    type: object
    default: []

jobs:
  - job: vdi
    displayName: rest api pipeline
    steps:
      - ${{ each user in parameters.users }}:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "${{ user }}"

When the user submits the form,

the Power Automate flow will run,

and as a result the Azure DevOps pipeline will be triggered through the REST API.

Finally, the pipeline will parse the parameters provided by the form.


Test your backup mechanism – Automated restore for MS SQL using Azure DevOps

Most organizations rely on their backup solutions for application faults or data corruption. However, the backup is rarely tested to verify that a restore would actually succeed. In this post I implement a backup-testing mechanism.

Let's examine a scenario for an MS SQL database server. The server outputs a backup file (.bak) to a storage account based on a retention policy. This backup is automatically restored on a SQL server through a pipeline, and the result is written as output. The result can then be reported to the monitoring solution.

The flow is depicted below. An Azure DevOps agent should be installed on the server on which the database will be restored. The pipeline fetches the backup file from the storage account and stores it on a data disk (in my case R:\files). Then the sqlcmd command is used to restore the .bak file and record the result. The backup file name is provided by a pipeline parameter. Also, a service connection should be created for the subscription in which the storage account is located.

Pipeline code:

trigger: none
pr: none
pool:
  vmImage: windows-latest
parameters:
  - name: backupfile
    type: string
jobs:
  - job: download
    displayName: Download DB backup file
    steps:
      - task: AzureCLI@2
        displayName: az cli download backup file from storage account
        inputs:
          azureSubscription: 'Azure-Service-Connection'
          scriptType: 'ps'
          scriptLocation: 'inlineScript'
          inlineScript: |
            $container_name_input = "container_name"
            $saccount_name = "storage_account_name"
            #$json = az storage blob list --container-name $container_name_input --account-name $saccount_name
            az storage blob download --file "R:\files\${{ parameters.backupfile }}" --name "${{ parameters.backupfile }}" --container-name $container_name_input --account-name $saccount_name --auth-mode login
  - job: restore
    displayName: Restore SQL backup
    dependsOn: download
    steps:
      - task: PowerShell@2
        displayName: sqlcmd restore backup
        inputs:
          targetType: 'inline'
          script: |
            sqlcmd -q "RESTORE DATABASE [Database_Name] FROM DISK=N'R:\files\${{ parameters.backupfile }}' WITH REPLACE, RECOVERY" -o R:\files\result.txt
            [string]$result = Get-Content R:\files\result.txt
            if ($result.Contains('successfully')) {
              Write-Host "Restore was successful..."
            }
            elseif ($result.Contains('terminating')) {
              Write-Host "Terminating..."
            }

Executing pipeline:

Result:

Important:

The Azure DevOps agent service is configured to run with a specific account (in my case NT AUTHORITY\SYSTEM, i.e. Local System). This account should have the appropriate permissions on the SQL server for the restore procedure. The easiest way would be to make this account a database sysadmin.

Adding NT AUTHORITY\SYSTEM to the SQL Server sysadmin role
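On the SQL Server side this membership can also be granted with T-SQL; a minimal sketch, assuming the agent service runs as Local System:

```sql
-- Grant the Azure DevOps agent service account sysadmin rights
-- so the pipeline can execute RESTORE DATABASE.
ALTER SERVER ROLE [sysadmin] ADD MEMBER [NT AUTHORITY\SYSTEM];
```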


Pause a Dedicated SQL pool (formerly SQL DW) automatically – REST API

I was struggling to make a dedicated SQL pool pause automatically through a pipeline during non-working hours. Although there are some options you can use with the az CLI, they did not work for me and failed with an error indicating that the resource could not be found.

az synapse sql pool | Microsoft Docs

Searching further I found that this action can be performed through the Azure management REST API, an example of which is shown below.

https://management.azure.com/subscriptions/a23eef11-f200-4722-866c-248ca45142f6/resourceGroups/sql-pool/providers/Microsoft.Sql/servers/geralexgr-sql/databases/geralexgr-sql/pause?api-version=2020-11-01-preview

In order to stop a dedicated SQL pool (formerly SQL DW), you can follow the guide below, which uses the az CLI to get an access token.

az account get-access-token

Add this token on the Authorization tab with the Bearer type and make your request.
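Put together, the raw request looks roughly like the following (a sketch; the subscription, resource group, server and pool names are placeholders):

```http
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.Sql/servers/{serverName}/databases/{poolName}/pause?api-version=2020-11-01-preview
Authorization: Bearer <token from az account get-access-token>
Content-Type: application/json
```

A matching resume endpoint exists at the same path, with /resume in place of /pause.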

By checking the result you can see that the pause completed successfully.

You can also automate the procedure using an Azure DevOps pipeline.

- task: AzureCLI@2
  displayName: az cli stop command
  inputs:
    azureSubscription: 'YOURSUB'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $token = az account get-access-token | ConvertFrom-Json
      $mytoken = $token.accessToken
      $headers = @{
        Authorization = "Bearer $mytoken"
      }
      Invoke-RestMethod -Method Post -Uri "https://management.azure.com/subscriptions/xxxxx-xxxxx-xxxxx/resourceGroups/resource-group/providers/Microsoft.Sql/servers/sql-server/databases/database/pause?api-version=2020-11-01-preview" -Headers $headers -UseBasicParsing -ContentType 'application/json'
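Since the goal is to pause the pool outside working hours, the pipeline can be run on a schedule; a sketch, assuming a weekday 19:00 UTC schedule and a branch named main:

```yaml
schedules:
  - cron: "0 19 * * 1-5"    # 19:00 UTC, Monday through Friday
    displayName: Pause SQL pool after working hours
    branches:
      include:
        - main
    always: true            # run even when there are no code changes
```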

https://docs.microsoft.com/en-us/rest/api/sql/2021-02-01-preview/servers


Azure DevOps best practices – Organization security settings

Security is a vital component of every organization. As such, DevOps components and environments should also be protected and created based on best practices. Below are some important settings that you should consider enabling or disabling for your Azure DevOps organization. All of these can be found in the bottom left corner of your organization settings page.

  • Disallow external users from joining your organization. In order to protect your projects, pipelines and source code you should disable external user access. This setting, combined with Azure Active Directory integration, should allow only verified users of your company.
  • Disable public projects. Sometimes a project may be created as public by mistake and expose your source code to a threat. With the setting below you can disable public project creation.
  • Enable Log Audit Events. For security purposes you should log pipeline and user actions. Going into security policies you can enable Log Audit Events. A new pane named Auditing will then appear, through which you will have a view of what is happening in your projects.
  • Don't add users to the Project Collection Administrators group of your organization. This group provides global admin permissions for your organization and allows changing its settings. Make sure you have only the necessary personnel in this group.

  • Disable "Allow team and project administrators to invite new users". You can prevent project administrators from inviting new users. This prohibits unnecessary invitations to your projects, as only organization administrators will be able to invite new users.

  • Install only trusted extensions inside your organization. Extensions can help you implement more complex functionality in your projects using tasks that have been created by the community or by companies. However, one should be careful with the installed extensions and only allow verified publishers whose code will not harm your organization.
  • Enforce a maximum allowed lifespan for new PAT tokens and also restrict full-scoped PATs. When your organization is connected to your company AD you can restrict full-scoped PATs. As a full-access PAT can interact with many aspects of the organization, it is a best practice to enable only the scope that is needed. Also, a maximum lifespan for tokens can be enforced so that compromised tokens expire.

Although no set of practices can protect you 100%, the above settings are a good starting point for every Azure DevOps administrator.