You can use the Python SDK to retrieve blob files from a storage account on Azure. First you will need to get the connection string for the storage account from the Access keys section.
Then you can execute the Python code below.
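The following is a minimal sketch using the azure-storage-blob package; the connection string, container name and blob name are placeholders you need to replace with your own values.

from azure.storage.blob import BlobServiceClient

# Placeholders: replace with your own connection string, container and blob names
connection_string = "<your-connection-string>"  # from the Access keys section
container_name = "mycontainer"
blob_name = "myfile.txt"

# Create a client for the storage account and get a reference to the blob
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)

# Download the blob and write its content to a local file
with open(blob_name, "wb") as download_file:
    download_file.write(blob_client.download_blob().readall())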
For a data processing scenario I had to create an automation that downloads files from a storage account, performs actions on them (Python, custom tools) and finally uploads the processed files back to the storage account.
A high-level diagram is visible below:
To automate this scenario I used a custom DevOps agent on Azure DevOps and assigned the agent's (virtual machine's) managed identity access on the storage account, in order to interact with it without using credentials.
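Granting that access could look like the following az CLI role assignment; the principal ID and scope are placeholders, and Storage Blob Data Contributor is one reasonable role for reading and writing blobs.

# Assign the agent VM's managed identity a data role on the storage account
az role assignment create `
  --assignee "<principal-id-of-the-managed-identity>" `
  --role "Storage Blob Data Contributor" `
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"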
Then I only used PowerShell and the az CLI to download and upload the files on the storage account.
The three pipeline tasks required to perform the download, processing and upload actions can be found below.
The JSON output is used to download a specific file based on the requirements, for example the most recent entry in chronological order; this is why Sort-Object -Descending is used.
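The original task definitions are not reproduced here, but a minimal Azure Pipelines YAML sketch of the three tasks could look as follows. The storage account name, container name and processing script are placeholders, and the agent is assumed to be a VM whose managed identity has access to the storage account.

steps:
# Task 1: log in with the agent VM's managed identity and download the files
- powershell: |
    az login --identity
    az storage blob download-batch --account-name mystorage `
      --source backups --destination "$(Build.SourcesDirectory)\data" --auth-mode login
  displayName: Download files from the storage account

# Task 2: process the downloaded files with a custom tool (placeholder script)
- script: python process.py "$(Build.SourcesDirectory)\data"
  displayName: Process the files

# Task 3: upload the processed files back to the storage account
- powershell: |
    az storage blob upload-batch --account-name mystorage `
      --destination backups --source "$(Build.SourcesDirectory)\data" --auth-mode login
  displayName: Upload the processed files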
Given that you have a blob storage container with multiple files, you can easily download the latest one with the az CLI.
In my scenario I have a container named backups which includes multiple MS SQL backups. I wanted to download the latest one in order to restore it through a pipeline on SQL Server.
To accomplish that you should first log in with the az CLI.
az login
If this is not possible, for example in an automation scenario, you would have to create a managed identity for your resource and log in with it, as in the script below.
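The original gist is not reproduced here; the following PowerShell script is a minimal sketch of the approach, assuming a placeholder storage account name (mystorage) and the backups container described above.

# Log in with the resource's managed identity (non-interactive scenario)
az login --identity

# List the blobs in the container as JSON and pick the most recently modified one
$blobs = az storage blob list --account-name mystorage --container-name backups `
    --auth-mode login --output json | ConvertFrom-Json
$latest = $blobs | Sort-Object { $_.properties.lastModified } -Descending | Select-Object -First 1

# Download the latest backup locally, e.g. to restore it on SQL Server
az storage blob download --account-name mystorage --container-name backups `
    --name $latest.name --file $latest.name --auth-mode login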
In this article I will demonstrate how you can write a simple .NET console application that writes data inside a blob container on an Azure storage account.
Storage accounts support the below four data structures: Blobs, File Shares, Queues and Tables.
For the case of an application, I had to write data to a file every x seconds in order to track the coordinates of an app. For this purpose I selected a storage account container in which I will store the Latitude and Longitude in .json format.
Inside my storage account I created a container named application.
Inside the application container I created a file named locations.json. This is the file that will get updated by the mobile app.
My console app consists of two .cs files. The first is the Location class, which is my model for the serialization.
public class Location
{
    public double Latitude { get; set; }
    public double Longitude { get; set; }
}
The second is the main code, which consists of two functions. The RetrieveBlobContents function reads the stream content of a requested file from the blob storage. As I have specific names for my scenario, I hard-coded them in the application: my containerName is application and the file I want to read is called locations.json.
The WriteContentOnBlob function serializes a random Location object and updates the file every 5 seconds. This is accomplished by writing a file locally and pushing this local file to the blob storage. The content of the file is first written to a data folder under the current directory.
Program.cs
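The original gist is not reproduced here; the following is a minimal sketch of Program.cs under the assumptions above, using the Azure.Storage.Blobs package and a placeholder connection string.

using System;
using System.IO;
using System.Text.Json;
using System.Threading;
using Azure.Storage.Blobs;

public class Program
{
    // Placeholder: replace with your storage account connection string
    private const string ConnectionString = "<your-connection-string>";
    private const string ContainerName = "application";
    private const string BlobName = "locations.json";

    public static void Main()
    {
        var container = new BlobServiceClient(ConnectionString).GetBlobContainerClient(ContainerName);
        var random = new Random();

        while (true)
        {
            // Generate a random Location and push it to the blob
            var location = new Location
            {
                Latitude = random.NextDouble() * 180 - 90,
                Longitude = random.NextDouble() * 360 - 180
            };
            WriteContentOnBlob(container, location);

            // Read the blob back and print its content
            Console.WriteLine(RetrieveBlobContents(container));
            Thread.Sleep(5000); // update every 5 seconds
        }
    }

    // Reads the stream content of the requested file from the blob container
    private static string RetrieveBlobContents(BlobContainerClient container)
    {
        using var stream = container.GetBlobClient(BlobName).OpenRead();
        using var reader = new StreamReader(stream);
        return reader.ReadToEnd();
    }

    // Serializes the Location to a local file under ./data and pushes it to the blob container
    private static void WriteContentOnBlob(BlobContainerClient container, Location location)
    {
        var dataDirectory = Path.Combine(Directory.GetCurrentDirectory(), "data");
        Directory.CreateDirectory(dataDirectory);
        var localPath = Path.Combine(dataDirectory, BlobName);

        File.WriteAllText(localPath, JsonSerializer.Serialize(location));
        container.GetBlobClient(BlobName).Upload(localPath, overwrite: true);
    }
}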