There are times when you need to capture the standard output of a script or command in PowerShell and handle this information. An example that I use is the output of docker info, which I use to monitor the service state along with other things. The output of docker info will report an error during connection if the engine is not working. In recent versions of Docker, the way this message is displayed changed, and it is now printed on standard error instead of standard output.
docker info
As a result, when you try to find the connection state through a PowerShell command, you will fail to identify the correct state.
(docker info).Contains('ERROR: error during connect')
This happens because the error is printed on standard error.
In order to get the standard error merged into the output, you need to append 2>&1 to the command.
docker info 2>&1
After that you can handle the output message appropriately.
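For example, a small check along these lines could flag the engine as unreachable; the substring matched here is an assumption based on the error message mentioned above.
# capture both standard output and standard error
$output = docker info 2>&1
# -match converts each captured line to a string and returns the lines that match
if ($output -match 'error during connect') {
    Write-Output 'Docker engine is not running or not reachable'
}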
Sometimes you may end up with wrong results in PowerShell because of the type of the returned object. A detailed demonstration can be found below, where the returned object is not a string and the equality check is not evaluated correctly.
For example, let's assume that we need to check the Docker status from PowerShell and capture this result through the string that is returned. When Docker is not running, you can expect an error message similar to the 'ERROR: error during connect' message shown earlier.
By getting the result of the docker info command into a variable, we can see that the returned object is an array of objects (System.Object[]) in PowerShell rather than a single string.
When you try to use the Contains method on this object in order to evaluate the Docker status, you will end up with a false result, because Contains on an array checks for a matching element instead of performing a substring match.
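A rough demonstration, assuming the error text shown earlier (the exact docker info output varies between versions):
$result = docker info 2>&1                          # capture standard output and standard error
$result.GetType().FullName                          # System.Object[] - an array, not a string
$result.Contains('ERROR: error during connect')     # False - Contains on an array compares whole elements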
In order to resolve this issue, you should convert the result to a single string with the Out-String cmdlet.
Then, when you evaluate the expression with the Contains method, the substring check is performed as expected and the correct result is returned.
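A minimal sketch of the working check, assuming the 'ERROR: error during connect' text shown earlier:
$result = docker info 2>&1 | Out-String             # join all output lines into one string
$result.GetType().FullName                          # System.String
$result.Contains('ERROR: error during connect')     # True when the engine is unreachable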
Azure Batch can be a great tool for on-demand batch processing, as it creates and manages a pool of compute nodes (virtual machines), installs the applications you want to run, and schedules jobs to run on the nodes. An important point about this service is that there is no additional charge for using Batch itself; you only pay for the underlying resources consumed, such as virtual machines, storage, and networking.
In this post I will demonstrate how to create a new job and task for the Batch service from az cli. The trick in this implementation is the JSON that is provided as input for the task definition, as not all available options are exposed by az cli.
One important missing configuration is the container image, which can be provided for a task through the Azure portal but not directly with az cli.
In order to create a task using az cli and work around this limitation, you can use the --json-file parameter. This option performs the creation through the REST API and lets you provide the parameters for the container image.
When there is a batch service pool available, you will need to create a job.
az batch account login -g RESOURCE_GROUP -n NAME
az batch job create --id JOB_NAME --pool-id POOL_NAME
Then you can create a new task using a JSON file.
az batch task create --job-id JOB_NAME --json-file task.json
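As a sketch, the task definition file (task.json is just a placeholder name) could look like the one below; the task id, command line and image name are placeholders, and the fields follow the Batch REST API task schema:
{
  "id": "TASK_NAME",
  "commandLine": "cmd /c echo hello",
  "containerSettings": {
    "imageName": "myregistry.azurecr.io/myimage:latest"
  }
}
Note that the pool itself must have been created with a container configuration for container tasks to run on it.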
If you have a Docker container that runs a full operating system, you can install Docker inside it in order to use docker commands. Let's take for example the Dockerfile below, which uses the .NET Framework SDK image based on Windows Server Core and installs Docker on it.
FROM mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2022
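The remaining instructions of the Dockerfile could, as a sketch, download the static Docker binaries for Windows and add them to the PATH; the Docker version in the download URL is only an example and should be adjusted:
SHELL ["powershell", "-Command"]
# download and extract the static Docker client binaries (version is an example)
RUN Invoke-WebRequest -UseBasicParsing -Uri 'https://download.docker.com/win/static/stable/x86_64/docker-24.0.7.zip' -OutFile 'docker.zip' ; \
    Expand-Archive -Path 'docker.zip' -DestinationPath $Env:ProgramFiles ; \
    Remove-Item 'docker.zip'
# make docker.exe available on the PATH inside the container
RUN setx /M PATH $($Env:PATH + ';' + $Env:ProgramFiles + '\docker')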
As a result, an administrator can execute docker commands from a prompt inside the container. However, not all commands will work unless you also perform the volume binding shown below when you spin up a new container from the image built with this Dockerfile. With the binding in place you can use docker commands like docker build, docker push and so on.
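As a sketch, assuming the image built from the Dockerfile above is called IMAGE_NAME, sharing the named pipe of the host Docker engine with the container looks like this:
docker run -it -v \\.\pipe\docker_engine:\\.\pipe\docker_engine IMAGE_NAME powershell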