
Download repos with ssh keys on Gitlab with MFA enabled

When you enable MFA (multi-factor authentication) in GitLab you may face issues when interacting with git repositories. Commands like git pull and git push over HTTPS can fail, because password authentication is no longer accepted once MFA is enabled.

You can resolve these issues by communicating over SSH with SSH keys instead. The procedure to create and upload your keys is described in the article below.

https://docs.gitlab.com/ee/user/profile/account/two_factor_authentication.html

However, sometimes this may still not work, as in my case. The issue was that, for some reason, the SSH key could not be found on the computer.

To work around this I used the ssh-add command and pointed it to the path of the key.

ssh-add ~/.ssh/gitlab_id_rsa
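Note that ssh-add registers the key with the running SSH agent, so an agent must already be active in your shell. If it is not, a minimal sketch for starting one in bash:

eval "$(ssh-agent -s)" # start ssh-agent and export its socket variables into this shell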

After this action your GitLab interaction should start working.
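You can verify the connection, and make the key selection persistent so you do not have to repeat ssh-add after every reboot, with an entry in ~/.ssh/config. A minimal sketch, assuming the key path from the step above:

ssh -T git@gitlab.com # on success GitLab replies with a welcome message for your username

# ~/.ssh/config
Host gitlab.com
  IdentityFile ~/.ssh/gitlab_id_rsa
  IdentitiesOnly yes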


Deploy resources on aws using terraform

Terraform, created by HashiCorp, is the most popular IaC (infrastructure as code) tool among developers and DevOps engineers. Anyone can use it freely to create multiple deployment environments and make the deployment procedure faster. In this article we will examine how we can use the terraform AWS provider to deploy resources on the AWS cloud.

The terraform AWS provider documentation can be found at the link below.

hashicorp/aws | Terraform Registry

The first thing we will need to do is create a user in AWS IAM in order to generate an access key for the deployment. By navigating to the IAM console you can create a new user for terraform. I gave this user the name terraform.

Then, by selecting the user, you can go to the Security credentials tab and create a new access key. These credentials will be needed in the terraform script later on.

When creating the user you must specify the permission policies to attach. These allow the necessary actions on the infrastructure. Since the terraform script below only creates a new VPC, I follow the principle of least privilege and grant only the permissions that are required. As a result I do not give administrator access but only AmazonVPCFullAccess to this user. This built-in policy allows full access to VPC resources: creating, updating, deleting and so on.
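If you prefer the command line over the console for this step, the same policy can be attached with the AWS CLI. A sketch, assuming the user is named terraform as above:

aws iam attach-user-policy --user-name terraform --policy-arn arn:aws:iam::aws:policy/AmazonVPCFullAccess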

After those steps I can run my terraform script to create the resources I need. First you will need to initialize terraform so that it downloads the providers declared in the configuration files:

terraform init

and the second step is to apply the configuration:

terraform apply

When you apply the configuration, terraform prints an execution plan showing what will be created, changed or destroyed, and asks for confirmation before making any changes.
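If you want to review the execution plan without being prompted to apply it, you can also run the plan step on its own first:

terraform plan # preview the changes without applying anything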

Code:

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }

  required_version = ">= 1.2.0"
}

provider "aws" {
  region     = "eu-west-1"
  access_key = "KEY" // access key you generated for the user
  secret_key = "SECRET" // secret of the key
}

resource "aws_vpc" "vpc-test" {
  cidr_block = "10.10.0.0/16"


  tags = {
    Name = "ExampleAppServerInstance"
  }
}
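Hardcoding the access key and secret in the provider block is fine for a quick test, but it risks leaking credentials if the file ends up in version control. The AWS provider also reads the standard AWS environment variables, so a safer sketch is to drop access_key and secret_key from the provider block and export the credentials in your shell instead:

export AWS_ACCESS_KEY_ID="KEY"        # access key you generated for the user
export AWS_SECRET_ACCESS_KEY="SECRET" # secret of the key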

When the deployment finishes you can find the new VPC in your AWS account.
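When you no longer need the resources, terraform can also tear down everything it created from the same configuration:

terraform destroy # removes all resources managed by this configuration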

Build infrastructure | Terraform | HashiCorp Developer



Publish coverage report on SonarQube for dotnet test

When you create coverage reports for your .NET projects you can use the native logger mechanism to export a trx report with your test results.

An example of this native functionality can be found below, where the logger parameter is used with dotnet test.

dotnet test keyvault.sln --logger "trx;logfilename=mytests.trx"

However, if you try to upload this trx file to SonarQube in order to get your coverage report, it will fail with the error message below.

Error message:

WARN: Could not import coverage report '.\Keyvault_MI_Pod\Tests\TestProject1\TestResults\mytests.trx' because 'Only dotCover HTML reports which start with "" are supported.'. Troubleshooting guide: https://community.sonarsource.com/t/37151

The SonarQube documentation page below lists all the coverage reporting tools that are supported. To upload your coverage results to SonarQube you will need to use a tool from that list and upload the file it generates.

.NET test coverage (sonarsource.com)

In my case I will use dotCover and create the relevant report file.

In the TeamCity task (begin analysis) you will need to add, in the additional parameters, the property that points to the report you export.
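For reference, a sketch of how that property looks in the begin step of the SonarScanner for .NET; the project key and the report path here are assumptions for illustration (the dotCover CLI typically writes dotCover.Output.html when the HTML report type is used):

dotnet sonarscanner begin /k:"my-project" /d:sonar.cs.dotcover.reportsPaths="dotCover.Output.html"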

Then in the pipeline you will need to add the dotnet test task as shown in the documentation.

dotnet dotcover test --dcReportType=HTML

In the finish task you will see that the coverage report type is now compatible and can be parsed by SonarQube.

Finally you will have your coverage report inside sonarqube.


Access Managed Identity from container inside VM – Azure

Managed identity is the security best practice for accessing resources on Azure, and there are many ways you can use it for service-to-service communication. Sometimes, though, you need it in more complex, nested scenarios like the one demonstrated below. In this guide we will enable managed identity on a virtual machine and then access that identity from within a container that runs on the virtual machine. This can be useful in complex deployment scenarios where you have multiple containers inside a virtual machine and you want to deploy using managed identity on Azure.

The first thing you will need is a system-assigned managed identity enabled on the virtual machine.

Then you can run your containers inside the virtual machine. In my case the containers are Windows-based, so I will use the route print command to show the routing table.

Run the following commands to expose the managed identity endpoint:

# find the default gateway used by the container
$gateway = (Get-NetRoute | Where { $_.DestinationPrefix -eq '0.0.0.0/0' } | Sort-Object RouteMetric | Select NextHop).NextHop
# find the interface index of the Hyper-V virtual network adapter
$ifIndex = (Get-NetAdapter -InterfaceDescription "Hyper-V Virtual Ethernet*" | Sort-Object | Select ifIndex).ifIndex
# route traffic for the Azure Instance Metadata Service (metadata API) through the gateway
New-NetRoute -DestinationPrefix 169.254.169.254/32 -InterfaceIndex $ifIndex -NextHop $gateway -PolicyStore ActiveStore
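You can quickly confirm that the route is in place before moving on:

Get-NetRoute -DestinationPrefix 169.254.169.254/32 # should list the new route pointing at the gateway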

After the route is added successfully, traffic to the managed identity endpoint is redirected to the gateway, and from there you will be able to authenticate.

We can verify the procedure by using the managed identity to retrieve a secret from a key vault.

# request an access token for Key Vault from the instance metadata endpoint
$token = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net' -Headers @{Metadata="true"} -UseBasicParsing
# extract the bearer token from the JSON response
$tokenvalue = ($token.Content | ConvertFrom-Json).access_token

Retrieve secret:

Invoke-WebRequest -Uri "https://test.vault.azure.net/secrets/testsecret/d9ce520dfdfdf4bdc9a41f5572069708c?api-version=7.3" -Headers @{Authorization = "Bearer $tokenvalue"} -UseBasicParsing
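The response body is JSON, so if you capture it in a variable you can extract the secret with ConvertFrom-Json. A sketch, reusing the request above:

$secret = Invoke-WebRequest -Uri "https://test.vault.azure.net/secrets/testsecret/d9ce520dfdfdf4bdc9a41f5572069708c?api-version=7.3" -Headers @{Authorization = "Bearer $tokenvalue"} -UseBasicParsing
($secret.Content | ConvertFrom-Json).value # the plain secret value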

Finally, you can log in using the managed identity from inside the container with the Az PowerShell module.
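A sketch of that login, assuming the Az PowerShell module is installed inside the container:

Connect-AzAccount -Identity # authenticate as the VM's system-assigned managed identity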


Co-authored with Giannis Anastasiou @ Vivawallet