
Install a Linux Azure DevOps agent in a Docker container

Having previously examined how to create a containerized Azure DevOps agent running on a Windows machine, we will now go through the same procedure for a Linux OS.

You can read the Windows container Azure DevOps agent article using the link below:

The first thing that you will need is a virtual machine that runs Docker. Once this requirement is fulfilled you can move on to building the image. In order to build your image you will need a Dockerfile and the installation instructions for the agent.
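As a quick sketch of what the end result looks like, the image can be built and started with commands along the lines of the following. The Dockerfile and its start.sh entrypoint follow the pattern from Microsoft's "run a self-hosted agent in Docker" documentation, and the image tag, organization URL, PAT and pool name below are placeholders:

# Build the agent image from the Dockerfile (based on Microsoft's documented Linux agent setup)
docker build -t azdevops-linux-agent .

# Run the agent container; AZP_URL, AZP_TOKEN, AZP_POOL and AZP_AGENT_NAME are the
# environment variables expected by the documented start.sh script
docker run -d \
  -e AZP_URL="https://dev.azure.com/<your-organization>" \
  -e AZP_TOKEN="<personal-access-token>" \
  -e AZP_POOL="<agent-pool-name>" \
  -e AZP_AGENT_NAME="linux-docker-agent-01" \
  azdevops-linux-agent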

You can read the rest of the article on Medium using the link below:

A detailed deployment video can be found in my Udemy course:

https://www.udemy.com/course/mastering-azure-devops-cicd-pipelines-with-yaml/


Containerize a .NET app with Docker and VS Code

When you build your application with cloud-native technologies you will build microservices in containers instead of monolithic applications. We will now examine how easy it is to build a .NET application in a container and run it on your local machine.

First we need to create the Visual Studio solution. I will go through that with the Visual Studio IDE and then I will use VS Code. For my microservice I am using an ASP.NET Core Web API with the default template code.

The target framework for the solution will be the latest .NET version, which is .NET 7. All other settings will be left at their defaults.
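If you prefer the command line over the Visual Studio wizard, the same project can be created with the dotnet CLI (assuming the .NET 7 SDK is installed; the project name matches the one used in the Dockerfile later on):

dotnet new webapi -n AspNetWebApi -f net7.0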

When you run the app locally with IIS Express you will be able to access the Swagger interface through the port defined in launchSettings.json.

https://localhost:7057/swagger/index.html

This file is located under Properties, and there you can configure which port the application runs on. In the profiles section, under the https profile, you can find the default application URL and port. This will be needed in later steps.
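For reference, the https profile in my launchSettings.json looks roughly like the following (the plain HTTP port is project-specific, so treat it as a placeholder):

{
  "profiles": {
    "https": {
      "commandName": "Project",
      "launchBrowser": true,
      "launchUrl": "swagger",
      "applicationUrl": "https://localhost:7057;http://localhost:5057",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    }
  }
}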

Microsoft provides the documentation below for creating a containerized application that runs on .NET:

Build and run an ASP.NET Core app in a container (code.visualstudio.com)

In order to create a microservice based on our Visual Studio solution we will need a Dockerfile. This can be generated automatically with VS Code.

In the VS Code Command Palette, search for "docker add" and select Docker Compose Files to Workspace.

Then select ASP.NET Core and after that your operating system. The next step is to select the exposed port, in other words the port under which your application will run. Here we should provide the port we found in launchSettings.json, or the one configured manually. In my case I will select the default one for the solution, which was 7057.

When a popup window appears on the screen, select Add Dockerfile and the build files will be generated automatically.

Dockerfile

Based on my setup I altered two things in the generated Dockerfile. The first is changing the configuration from Release to Debug; for production environments you should keep the Release configuration. The second is adding the environment variable ASPNETCORE_ENVIRONMENT inside the container with the value Development.

# Base runtime image; the app listens on port 7057
FROM mcr.microsoft.com/dotnet/aspnet:7.0 AS base
WORKDIR /app
EXPOSE 7057

# Bind Kestrel to port 7057 and run the container in the Development environment
ENV ASPNETCORE_URLS=http://*:7057
ENV ASPNETCORE_ENVIRONMENT=Development

# Build stage: restore dependencies and build the project in Debug configuration
FROM mcr.microsoft.com/dotnet/sdk:7.0 AS build
WORKDIR /src
COPY ["AspNetWebApi.csproj", "./"]
RUN dotnet restore "AspNetWebApi.csproj"
COPY . .
WORKDIR "/src/."
RUN dotnet build "AspNetWebApi.csproj" -c Debug -o /app/build

# Publish stage
FROM build AS publish
RUN dotnet publish "AspNetWebApi.csproj" -c Debug -o /app/publish /p:UseAppHost=false

# Final image: copy the published output into the runtime base image
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "AspNetWebApi.dll"]

docker build command
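The image is built from the folder that contains the Dockerfile; the aspnetwebapi tag below is just an example name:

docker build -t aspnetwebapi:latest .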

After the build is completed and the image is created, you can run a new container locally.

Keep in mind that in order to test your container you should create a port forwarding from the container to your host. I used the same port on the host, so I added -p 7057:7057 to the run command.
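For example, reusing the image name from the build step above:

docker run -d --name aspnetwebapi -p 7057:7057 aspnetwebapi:latest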

The logs of the container indicate a successful run of the application.

Our application now runs as a microservice container inside the host machine (my laptop). 

We can verify access to our application using the Swagger URL.
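For example, a quick check from the host (note that inside the container the app listens on plain HTTP, as configured through ASPNETCORE_URLS):

curl -i http://localhost:7057/swagger/index.html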

YouTube video:


Connect an Azure Web App container to Key Vault using Managed Identity

Following the article in which I described how you can connect to Azure resources through a Managed Identity, I will showcase how a container running on an App Service (Web App) can connect to a Key Vault in order to retrieve secrets from it.

The two main components required for this demo are an App Service and a Key Vault.

First things first, we will need some secrets to retrieve through the hosted application. The dbpassword secret shown below will be retrieved and used by the web app running in the container.

As examined in the article mentioned above, we should construct the appropriate URL in order to retrieve the access_token.

$kati = (Invoke-WebRequest -Uri "$($env:MSI_ENDPOINT)?resource=https://vault.azure.net&api-version=2017-09-01" -Headers @{Secret=$env:MSI_SECRET} -UseBasicParsing).Content | ConvertFrom-Json

Store the access_token in a separate variable, as it is sometimes not expanded correctly by PowerShell when used directly inside the header.
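For example, using the variable name that appears in the next request:

$metavliti = $kati.access_token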

Then perform an API call against your Key Vault, using the token we retrieved earlier as the Authorization header.

Invoke-WebRequest -Uri "https://spfykey.vault.azure.net/secrets/dbpassword/4f371b23cf244717a585e12af9846dec?api-version=7.3" -Headers @{Authorization = "Bearer $metavliti"} -UseBasicParsing
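To read the actual secret value you can parse the JSON body of the response, for example:

# Store the response and parse the JSON body to read the secret value
$response = Invoke-WebRequest -Uri "https://spfykey.vault.azure.net/secrets/dbpassword/4f371b23cf244717a585e12af9846dec?api-version=7.3" -Headers @{Authorization = "Bearer $metavliti"} -UseBasicParsing
($response.Content | ConvertFrom-Json).value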

As a result, we successfully retrieved the password of the secret, which is 123456, by performing a REST API call from the web app using the Managed Identity of the App Service.

References:

https://learn.microsoft.com/en-us/rest/api/keyvault/keyvault/vaults


Provision gitlab-ce on Docker with Portainer

Portainer is a fantastic tool that provides a GUI for managing your container workloads more easily than with the command line. It is free to use with the Community Edition, and the documentation describes an installation that takes approximately five minutes to complete.

In this article I will show you how to use Portainer and its GUI to deploy a GitLab container on your setup.

If you use the default setup instructions, your instance will be created on localhost under port 9000.

docker run -d -p 8000:8000 -p 9000:9000 --name=portainer --restart=always -v /var/run/docker.sock:/var/run/docker.sock -v portainer_data:/data portainer/portainer-ce

You can access it at http://127.0.0.1:9000/, where you will be prompted to log in with the credentials you specified during the initial setup.

Under Containers you can create a new container by clicking the Add container button.

Under Volumes you should create a new persistent volume, which will be consumed by GitLab for data-saving operations.

persistent volume creation for gitlab container

You can either create a new container and specify the Docker Hub image, or pull the image first and then use it to deploy your instance. I preferred the second approach, so I pulled the image locally.

docker pull gitlab/gitlab-ce

When completed you should see the below message

From Containers, press +Add new container. Set the requested name (gitlab) and specify the image.

According to the documentation, three volume mappings are needed in order to store GitLab data:

$GITLAB_HOME/data:/var/opt/gitlab
$GITLAB_HOME/logs:/var/log/gitlab
$GITLAB_HOME/config:/etc/gitlab

Under env variables, add the needed value as described in the GitLab documentation. Personally I used /Users/username/Documents/Gitlab on my computer.

Press Deploy the container and the creation procedure should start.

When you first launch your container, you can check the logs to verify that the installation steps are running. This operation should take around 5-10 minutes.

Then you will see your container running. Stop the container and also perform the configurations below:

From restart policies, select always:

From ports configuration add the below bindings (80:80, 443:443)

By mapping ports 80 and 443 to your host you will be able to access gitlab from your browser using localhost:80.

Also add your hostname and domain from the Network tab.

When the deployment is finished, if you access the localhost address you will see the setup screen.

You can find instructions on how to install GitLab through the CLI or a Dockerfile in the GitLab documentation.
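For reference, a plain docker CLI equivalent of the Portainer configuration above (volume mappings, port bindings, restart policy and hostname) would look roughly like this, assuming GITLAB_HOME points to the host directory chosen earlier and the hostname is a placeholder:

# Host directory that backs the GitLab volumes
export GITLAB_HOME=/Users/username/Documents/Gitlab

docker run -d \
  --name gitlab \
  --hostname gitlab.example.com \
  --restart always \
  -p 80:80 -p 443:443 \
  -v $GITLAB_HOME/config:/etc/gitlab \
  -v $GITLAB_HOME/logs:/var/log/gitlab \
  -v $GITLAB_HOME/data:/var/opt/gitlab \
  gitlab/gitlab-ce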

The default username used to log in is root.
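Recent gitlab-ce versions generate a random initial password for root; assuming the container is named gitlab, it can usually be read with:

docker exec -it gitlab grep 'Password:' /etc/gitlab/initial_root_password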

In order to verify that persistent storage works as expected, create a new test project, commit a file, stop the container and then start it again.

Stop container

Log in to GitLab again, and your test project should still be there.
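If you prefer the docker CLI over the Portainer UI for this test, the same stop/start cycle can be performed with:

docker stop gitlab
docker start gitlab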