
Azure DevOps Terraform Provider

If you follow an everything-as-code approach, you should check out the Azure DevOps Terraform provider. It is created and maintained by Microsoft, and you can use it to manage your DevOps tooling as code.

To get started, you will need to create a personal access token (PAT) and grant it scopes based on the actions you need to perform.

When the token is ready, you will need to set two environment variables on the machine you work from. The first one is AZDO_PERSONAL_ACCESS_TOKEN, which should contain your token. The second one, AZDO_ORG_SERVICE_URL, should be your organization URL.
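On Linux or macOS the two variables can be exported like this (the values below are placeholders, substitute your own PAT and organization name):

```shell
# Placeholder values - replace with your own PAT and organization URL
export AZDO_PERSONAL_ACCESS_TOKEN="xxxxxxxxxxxxxxxxxxxx"
export AZDO_ORG_SERVICE_URL="https://dev.azure.com/my-organization"
```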


Finally, you are ready to deploy your Azure DevOps configuration as infrastructure as code (IaC).

Let's see the example below.

# Make sure AZDO_PERSONAL_ACCESS_TOKEN and AZDO_ORG_SERVICE_URL
# are set in your environment before running this configuration.
terraform {
  required_providers {
    azuredevops = {
      source  = "microsoft/azuredevops"
      version = ">=0.1.0"
    }
  }
}

resource "azuredevops_project" "project" {
  name        = "My Awesome Project"
  description = "All of my awesome things"
}

resource "azuredevops_git_repository" "repository" {
  project_id = azuredevops_project.project.id
  name       = "My Awesome Repo"
  initialization {
    init_type = "Clean"
  }
}

resource "azuredevops_build_definition" "build_definition" {
  project_id = azuredevops_project.project.id
  name       = "My Awesome Build Pipeline"
  path       = "\\"

  repository {
    repo_type   = "TfsGit"
    repo_id     = azuredevops_git_repository.repository.id
    branch_name = azuredevops_git_repository.repository.default_branch
    yml_path    = "azure-pipelines.yml"
  }
}

When the above code runs, it will create a new project named My Awesome Project. Inside the project, a new Git repository will be initialized, and a new pipeline will be created inside this repository.
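The usual Terraform workflow applies here; as a quick sketch, assuming the configuration above is saved in the current directory:

```shell
terraform init    # downloads the microsoft/azuredevops provider
terraform plan    # preview the project, repository and pipeline
terraform apply   # create the resources in your Azure DevOps organization
```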



Optimise Azure Costs with Advisor Recommendations

Azure Advisor can be a powerful tool for your Azure subscription because it automatically suggests changes to your infrastructure based on specific pillars such as Cost, Security, Reliability, and Performance. This means you can easily reduce costs by acting on some of the automatically generated findings.

A sample scenario is a virtual machine I created some time ago with the E2as_v5 SKU. I selected that specific type because the application I had to install on the server requested such specs, but that turned out not to be the case at all. Looking at the recommendations, I saw a note to downscale my virtual server as it appeared underutilized.

Specifically, it suggested downscaling to the D2as_v5 SKU, which uses 8 GB of RAM instead of 16 GB, is cheaper, and is a general-purpose VM rather than memory optimized.

To downscale, go to the virtual machine, select the new SKU under Size, and press Resize.
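If you prefer the CLI over the portal, the same resize can be done with the Azure CLI; a sketch with a placeholder resource group and VM name:

```shell
# Placeholder resource group and VM name - substitute your own
az vm resize \
  --resource-group my-rg \
  --name my-vm \
  --size Standard_D2as_v5
```

Note that the VM is restarted as part of the resize, so plan for a short downtime.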

But how can I be sure that performance will be adequate and the virtual machine will perform as needed? Looking at the metrics, I noticed that CPU usage was steady and available memory was constantly around 9 GB. This means the machine was heavily underutilized; although the application stated minimum specs of 16 GB, in reality that was not accurate.
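Those platform metrics can also be pulled with the Azure CLI before deciding on a resize; a sketch with a placeholder resource ID:

```shell
# The resource ID below is a placeholder - substitute your subscription,
# resource group and VM name
az monitor metrics list \
  --resource "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Compute/virtualMachines/my-vm" \
  --metric "Percentage CPU" \
  --interval PT1H
```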

By monitoring the virtual machine after the change, I can conclude that it is performing well and that the original SKU was not sized correctly.



Deploy an azurerm function app with zip_deploy_file in Terraform

When you need to deploy code to a Function App using Terraform, you can use the zip_deploy_file argument. With it you specify a zip file that is created from your committed code, which lets you deploy code dynamically as part of your Terraform run.

The first thing you need to do is create a folder with the code you want to ship to the Function App. In my example, I have the Terraform files inside the aws and azure folders, and in the same directory a folder called resources where the code is located. This code needs to be deployed to the serverless Function App.
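Based on that description, the directory layout would look roughly like this (the file names inside each folder are illustrative):

```
.
├── aws/           # Terraform files for AWS
├── azure/         # Terraform files for Azure, including the function app
└── resources/     # the code to deploy to the Function App
    └── function_app.py
```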

Create an archive_file data source with Terraform and specify where your code is located. Make sure it points correctly to where the files are stored.

data "archive_file" "python_function_package" {
  type        = "zip"
  source_dir  = "../resources/"               # the folder that holds the function code
  output_path = "${path.module}/function.zip" # example output location; adjust as needed
}

Then use the above data source together with zip_deploy_file.

resource "azurerm_linux_function_app" "functionapp" {
  name                = var.serviceplan_name
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location

  # The storage account and service plan are assumed to be defined
  # elsewhere in the configuration under these names.
  storage_account_name       = azurerm_storage_account.storage.name
  storage_account_access_key = azurerm_storage_account.storage.primary_access_key
  service_plan_id            = azurerm_service_plan.plan.id

  zip_deploy_file = data.archive_file.python_function_package.output_path
  app_settings    = var.app_settings

  site_config {
    application_stack {
      python_version = "3.10"
    }
  }
}

When you apply your Terraform code, the Function App will have the code uploaded correctly, and you can verify that by navigating to the code in the Azure portal.

Reference: azurerm_linux_function_app | hashicorp/azurerm | Terraform Registry