Saving a file to a staging folder in Azure DevOps - PowerShell

I'm executing a PowerShell script in an Azure DevOps release pipeline. In the script, some JSON files are saved to a local directory and later uploaded to Azure Blob Storage.
However, of course, the Azure DevOps agent has no access to my local directory to save to. Can I save the JSON files to, say, a staging folder on the Azure DevOps agent? Or if not, where can I save to when the script is run in the release pipeline? Thanks

You can save the JSON files to a folder on the Azure DevOps agent. When the agent runs your pipeline, it creates a workspace with the folders below on the agent machine.
You can point to these folders in your pipeline by referring to the predefined environment variables.
For example:
$(Build.ArtifactStagingDirectory) is mapped to folder '/_work/2/a'
$(Build.BinariesDirectory) is mapped to folder '/_work/2/b'
$(System.DefaultWorkingDirectory) is mapped to folder '/_work/2/s'
You can also save the JSON files to a new folder (e.g. a staging folder) within '/_work/2', for example $(Agent.BuildDirectory)/staging. A new folder named 'staging' will be created within '/_work/2'.
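To make this concrete, here is a minimal PowerShell sketch of the approach, assuming the Az.Storage module is available on the agent (the storage account, container, and file names are placeholders):
# Write the JSON into the artifact staging directory on the agent
$stagingDir = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY 'json-output'
New-Item -ItemType Directory -Path $stagingDir -Force | Out-Null
$jsonPath = Join-Path $stagingDir 'payload.json'
@{ mydata = 'some stuff' } | ConvertTo-Json | Set-Content -Path $jsonPath
# Later in the same script, upload the staged file to Blob Storage
# (assumes the signed-in identity has access to the storage account)
$ctx = New-AzStorageContext -StorageAccountName 'mystorageaccount' -UseConnectedAccount
Set-AzStorageBlobContent -File $jsonPath -Container 'mystoragecontainer' -Context $ctx -Force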
For more information about predefined variables, please check here. For release variables, please check here.

Related

Is there a way to name the Azure agent working folders

When you set up an Azure DevOps agent on a build machine, it has a working folder (by default _work) where it creates subfolders for each pipeline that it has to run.
These folders have integer names like "80" or "29". This makes it hard to troubleshoot issues on a given build machine when you have many pipelines, as you don't know which folder each pipeline relates to.
Is there a way to figure out the mapping from pipeline to folder number, or to name these folders more explicitly?
Renaming the folders is currently not supported in Azure DevOps.
Each pipeline maps to a numbered folder under the agent's _work directory.
1. You could check the pipeline log to figure out which folder is your pipeline's working folder (enable system diagnostics).
2. You could also add a command line task in your pipeline to echo this directory:
echo $(System.DefaultWorkingDirectory)
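If you want each run to report its own mapping, here is a small sketch for an inline PowerShell task (pipeline variables are exposed to scripts as environment variables):
# Print the pipeline name next to its numbered working folders
Write-Host "Pipeline:                       $env:BUILD_DEFINITIONNAME"
Write-Host "Agent.BuildDirectory:           $env:AGENT_BUILDDIRECTORY"
Write-Host "System.DefaultWorkingDirectory: $env:SYSTEM_DEFAULTWORKINGDIRECTORY"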

How to backup solution folder before releasing the new build to on-premise server in Azure CI/CD Pipeline

I would like to back up the solution to some folder before releasing the new build in my Azure CI/CD pipeline. To deploy the solution, I am using my on-premises server. I'm not sure where to make changes so that the existing artifacts are saved to some folder before getting the new release.
This way we can avoid failure if something goes wrong with the current release; we would have the backup to copy the files from.
(screenshot: Release Pipeline)
I would like to back up the solution to some folder before releasing the new build in my Azure CI/CD pipeline.
Based on your requirement, you can use the Copy files task to copy the existing files on the server to the target folder.
Here are the steps:
Step 1: you can check the IIS deployment physical path in the IIS Web App Manage task.
For example: %SystemDrive%\inetpub\wwwroot
Step 2: you can add the Copy Files task at the top of the release pipeline. Then you can set the physical path and target folder to copy the existing files.
In this case, when the release pipeline runs, it will back up the existing files first and then deploy the new files to the on-premises server.
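If you prefer to script the backup rather than use the Copy Files task, here is a minimal sketch for an inline PowerShell task (the backup destination path is a placeholder):
# Copy the current site content aside before the new deployment overwrites it
$source = "$env:SystemDrive\inetpub\wwwroot"
$backup = "D:\Backups\wwwroot_$(Get-Date -Format 'yyyyMMdd_HHmmss')"
Copy-Item -Path $source -Destination $backup -Recurse -Force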
On the other hand, since you are using a CI/CD pipeline, the related files are saved in the build artifacts. You just need to change the build artifact version in the release pipeline, and then you can deploy the previous artifacts again.

How to exclude files present in the mapped directory when Publishing the artifacts in Azure CI/CD?

I am new to Azure CI/CD pipelines and I am trying to export CRM solutions using a build pipeline in Azure DevOps with the Power Platform task. There is a requirement to keep the exported solution from the build pipeline in Azure Repos (which I am doing from the command line using tf vc).
I am able to export the solution successfully, but the issue is that when I publish the artifacts, every file present in the mapped folder is published. (I mapped a directory in Azure Repos where all the solution backups are kept.)
I see that the Azure agent copies all the files present in the mapped directory and stores them in the agent directory. The problem is that the mapped directory contains all the backup files of the CRM solutions. I found some articles that mentioned cloaking the directory so that the files are not included on the agent, but if I cloak the directory then I am not able to check in the exported solution from the command line.
So, I was wondering if there is any way to exclude all the files present in the mapped directory while still being able to check in the exported file to that directory using the command line.
You can use a .artifactignore file to filter out paths of files that you don't wish to be published as part of the process.
Documentation can be found here.
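For instance, a minimal .artifactignore sketch, using .gitignore pattern syntax and placed in the directory being published ('SolutionBackups' is a placeholder for your backup folder):
# Exclude the existing solution backups from the published artifact
SolutionBackups/**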

How to upload data to a storage account from CI/CD?

We need to save the "artifacts" from the build/release pipeline. How do we upload some data into a storage account from the deployment pipeline?
For example, here is my data:
{"mydata":"some stuff"}
I'd like to upload this into mystorageaccount/mystoragecontainer/fea8e047-0dc4-4f96-8499-aee5c3929be3.json.
How do we create blobs from the CICD pipeline?
I have tested this in my environment.
I created build and release pipelines to copy data to Azure Blob Storage.
I followed these steps:
I created a file in a folder in Azure Repos.
I created a build pipeline with two tasks: Copy Files and Publish Artifact.
I configured the Copy Files task to copy files from the folder I created in Azure Repos to the artifact staging directory.
I configured the Publish Artifact task to publish the artifact to Azure Pipelines.
I ran the pipeline; the files in the folder were copied into the artifact, and the artifact was published to Azure Pipelines.
Then I created a release pipeline with an Azure File Copy task to deploy the artifact data to Azure Blob Storage.
I configured the Azure File Copy task (version 3.0) with the artifact data as the source, Azure Blob as the Destination Type, and the Azure service connection, storage account, and container names.
I ran the release pipeline, and the file was created in Azure Blob Storage.
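Alternatively, if you would rather script the upload than use the Azure File Copy task, here is a minimal sketch for an Azure PowerShell task, assuming the Az.Storage module (the account and container names come from the question; the GUID file name mirrors the example):
$json = '{"mydata":"some stuff"}'
$blobName = "$([guid]::NewGuid()).json"
$tmpPath = Join-Path $env:AGENT_TEMPDIRECTORY $blobName
Set-Content -Path $tmpPath -Value $json
# -UseConnectedAccount assumes the task's identity has Blob data access
$ctx = New-AzStorageContext -StorageAccountName 'mystorageaccount' -UseConnectedAccount
Set-AzStorageBlobContent -File $tmpPath -Container 'mystoragecontainer' -Blob $blobName -Context $ctx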

How to deploy app service with pipeline variables as an artifact to be downloaded?

I have an app service which is deployed to an Azure Web App for testing (this works just fine), but since this will eventually be deployed to an on-premises solution, I need to create a deployment package that I can download from either the Azure Portal or from DevOps.
So far I have tried creating a release pipeline which picks up the build artifact and uses the Azure File Copy task to copy the artifact from DevOps to a storage account in Azure. The problem I have now is that the File Copy task does not set the variables I have in the variable groups into the appsettings.json file (such as DbConnection and port settings).
What would be the best way to create a deployment package (with updated appsettings.json values) to be available for download either from the Azure Portal or DevOps, without the need to create a dedicated app service in Azure for the deployment?
These are the steps I have at the moment, but as mentioned, the configuration property for setting the variables is not available for the Azure File Copy task:
(screenshot: Pipeline Tasks and Variables)
How to deploy app service with pipeline variables as an artifact to be downloaded?
You could try to use the File Transform (Preview) task to update the appsettings.json file with the values from the variable groups.
Then you could use the Azure File Copy task to copy the artifact from DevOps to a storage account in Azure.
Note: the Source of the Azure File Copy task should use $(System.DefaultWorkingDirectory) instead of selecting the repo file.
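To illustrate how the File Transform task substitutes variables into JSON (a sketch; these keys and values are hypothetical): with dot notation, a variable group entry named DbConnection.ConnectionString replaces the matching nested value in appsettings.json:
{
  "DbConnection": {
    "ConnectionString": "replaced by the DbConnection.ConnectionString variable",
    "Port": "replaced by the DbConnection.Port variable"
  }
}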