How to upload data to a storage account from CI/CD? - powershell

We need to save the "artifacts" from the build/release pipeline. How do we upload some data into a storage account from the deployment pipeline?
For example, here is my data:
{"mydata":"some stuff"}
I'd like to upload this into mystorageaccount/mystoragecontainer/fea8e047-0dc4-4f96-8499-aee5c3929be3.json.
How do we create blobs from the CI/CD pipeline?

I have tested this in my environment.
I created build and release pipelines to copy data to Azure Blob Storage.
I followed these steps:
I created a file in a folder in Azure Repos.
Then I created a build pipeline with two tasks: Copy Files and Publish Artifact.
I configured the Copy Files task to copy files from the folder I created in Azure Repos to the artifact staging directory.
I configured the Publish Artifact task to publish the artifact to Azure Pipelines.
I ran the pipeline; the files in the folder were copied into the artifact, and the artifact was published to Azure Pipelines.
Next, I created a release pipeline with an Azure File Copy task to deploy the artifact data to Azure Blob Storage.
I configured the Azure File Copy task (version 3.0) with the artifact as the source, Azure Blob as the Destination Type, plus the Azure service connection, storage account, and container names.
I ran the release pipeline, and the file was created in Azure Blob Storage.
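
If you'd rather skip the file-copy tasks entirely, a single Azure PowerShell step can also create the blob directly. This is only a minimal sketch, assuming the Az module is available on the agent and a placeholder resource group name; the account, container, and blob names come from the question above:

# Runs inside an Azure PowerShell task, so the service connection handles sign-in.
# 'my-resource-group' is a placeholder; replace with the storage account's resource group.
$json = '{"mydata":"some stuff"}'
$localPath = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY 'fea8e047-0dc4-4f96-8499-aee5c3929be3.json'
Set-Content -Path $localPath -Value $json

$account = Get-AzStorageAccount -ResourceGroupName 'my-resource-group' -Name 'mystorageaccount'
Set-AzStorageBlobContent -Context $account.Context `
    -Container 'mystoragecontainer' `
    -File $localPath `
    -Blob 'fea8e047-0dc4-4f96-8499-aee5c3929be3.json' `
    -Force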

Related

Azure DevOps: I want to add a task in my pipeline that can copy some files from my Azure Repo into an on-premises VM. Any leads?

I have a requirement to create an Azure DevOps pipeline that can copy files from my Azure Repo to a path on an on-premises VM (a SQL Server, to be precise). Could anyone advise on how to get started on this?
You would need to add a checkout task to the pipeline. You would define the repo as a source and then add a step to check out the repo. Here's some documentation concerning checking out multiple repos using YAML that should get you started.
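
For the copy itself, once the repo is checked out on a self-hosted agent that can reach the VM, a plain PowerShell step is enough. A rough sketch; the folder and share path are hypothetical:

# Assumes a self-hosted agent with network access to the on-premises VM.
# 'scripts' and '\\sqlserver01\deploy' are placeholders; adjust to your environment.
$source = Join-Path $env:BUILD_SOURCESDIRECTORY 'scripts'   # a folder inside the checked-out repo
Copy-Item -Path (Join-Path $source '*') -Destination '\\sqlserver01\deploy' -Recurse -Force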

Need to store the publish artifacts of each stage in a single directory in Azure Pipelines

Each stage stores its own publish artifacts, but can we store them in a common place where we keep placing the publish artifacts of every build and download them from there as and when required?
Also, that common storage should be neither an Azure repository nor Azure Blob Storage, but simply within Azure Pipelines.
Out of the box, the Publish Build Artifacts task supports publishing artifacts to a file share (which can live on an Azure Storage Account). However, if you want to put your artifacts outside of Azure, you need to consider an external provider like Artifactory; there are no other options out of the box. Keep in mind that your artifacts are mostly archives, so you could even upload them to FTP if you want. The real question, though, is why you need this and whether the benefit is greater than the effort.
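
If the FTP route is acceptable, a short PowerShell step would do. This is only a sketch, with a hypothetical server and credentials that should come from secret pipeline variables:

# 'ftp.example.com', 'drop.zip', and the FTP_* variables are placeholders.
$localFile = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY 'drop.zip'
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential($env:FTP_USER, $env:FTP_PASSWORD)
$client.UploadFile('ftp://ftp.example.com/artifacts/drop.zip', $localFile)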
Need to store the publish artifacts of each stage in a single directory in Azure Pipelines
I am afraid it is impossible to store the publish artifacts of each stage in a single directory in Azure Pipelines.
That is because Azure Pipelines is meant for building and publishing, not for storage. It only temporarily stores the artifacts generated by a pipeline, associated with that pipeline.
For your situation, you could create a network folder, then publish the artifact to that network folder from each stage.
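
As a minimal sketch of that per-stage copy (assuming a multi-stage YAML pipeline, so $env:SYSTEM_STAGENAME resolves, and a hypothetical UNC share), each stage could run a PowerShell step like:

# '\\fileserver\pipeline-artifacts' is a hypothetical share; adjust to your environment.
$target = Join-Path '\\fileserver\pipeline-artifacts' "$env:BUILD_BUILDNUMBER\$env:SYSTEM_STAGENAME"
New-Item -ItemType Directory -Path $target -Force | Out-Null
Copy-Item -Path (Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY '*') -Destination $target -Recurse -Force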

How to deploy app service with pipeline variables as an artifact to be downloaded?

I have an app service which is deployed to an Azure WebApp for testing (this works just fine), but since it will eventually be deployed to an on-premises solution, I need to create a deployment package that I can download from either the Azure Portal or DevOps.
So far I have tried creating a release pipeline which picks up the build artifact and uses the AzureBlob File Copy task to copy the artifact from DevOps to a storage account in Azure. The problem I have now is that the File Copy task does not inject the variables I have in the variable groups (such as DbConnection and port settings) into the appsettings.json file.
What would be the best way to create a deployment package (with updated appsettings.json values) to be available for download either from Azure Portal or DevOps, without the need to create a dedicated app service in Azure for the deployment?
These are the steps I have at the moment, but as mentioned, the configuration property for setting the variables is not available for the AzureBlob File Copy:
(screenshot: Pipeline Tasks and Variables)
How to deploy app service with pipeline variables as an artifact to be downloaded?
You could try using the File Transform (Preview) task to update the appsettings.json file with the values from the variable groups.
Then we could use the Azure File Copy task to copy the artifact from DevOps to a storage account in Azure.
Note: the Source of the Azure File Copy task should use $(System.DefaultWorkingDirectory) instead of selecting the repo file.
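
If the File Transform task doesn't fit, the same substitution can be scripted. A rough PowerShell sketch, assuming the variable group exposes a DbConnection value (pipeline variables surface as uppercase environment variables) and a placeholder path into the downloaded artifact:

# 'drop\appsettings.json' is a placeholder path; adjust to your drop layout.
$settingsPath = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY 'drop\appsettings.json'

$settings = Get-Content -Path $settingsPath -Raw | ConvertFrom-Json
$settings.DbConnection = $env:DBCONNECTION   # supplied by the linked variable group
$settings | ConvertTo-Json -Depth 10 | Set-Content -Path $settingsPath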

How do i get my Azure DevOps release pipeline to get artifacts from Azure Storage Account

I'm using Azure DevOps and have set up a "Release" pipeline, not a "Build" pipeline, and I want the release pipeline to get its artifacts from my Azure Storage Account.
The artifacts have already been built and are NuGet package (.nupkg) files. I have copied them into an Azure Storage Account as File Storage. All they need to do is be used by a release pipeline.
So my question is: how do I get my Azure release pipeline to fetch these files and use them in the release?
There isn't any native way to automatically download the binaries from a storage account at the beginning of the release; you will have to add your own tasks to download them in the release (and add the connection string as a variable).
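
For example, an Azure PowerShell step at the start of the release could pull a package down from the file share. A sketch only; the account, share, and package names below are placeholders, and the account key should live in a secret variable:

# 'mystorageaccount', 'packages', and the .nupkg name are placeholders; STORAGE_KEY is a secret pipeline variable.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey $env:STORAGE_KEY
Get-AzStorageFileContent -Context $ctx -ShareName 'packages' `
    -Path 'MyPackage.1.0.0.nupkg' `
    -Destination $env:SYSTEM_DEFAULTWORKINGDIRECTORY -Force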
The usual pattern to share generated files between a build and a release is to use an Azure DevOps artifact. You will need to add the "Publish Build Artifacts" task to your build, and then you will be able to link it to your release by clicking "+ Add" on the artifact panel.

Saving a file to a staging folder in Azure DevOps

I'm executing a PowerShell script in an Azure DevOps release pipeline. In the script, some JSONs are saved to a local directory, and later in the script they are uploaded to Azure Blob Storage.
However, of course, Azure DevOps doesn't see my local directory to save to. Can I save the JSON files to, say, a staging folder on the Azure DevOps agent? If not, where can I save them when the script is run in the release pipeline? Thanks
You can save the JSON file to a folder on the Azure DevOps agent. When the agent runs your pipeline, it creates a workspace with the folders below on the agent machine.
You can point to these folders in your pipeline by referring to the predefined environment variables.
For example:
$(Build.ArtifactStagingDirectory) is mapped to folder '/_work/2/a'
$(Build.BinariesDirectory) is mapped to folder '/_work/2/b'
$(System.DefaultWorkingDirectory) is mapped to folder '/_work/2/s'
You can also save the JSON file to a new folder (e.g. a 'staging' folder) within '/_work/2'.
For example, $(Agent.BuildDirectory)/staging: a new folder named 'staging' will be created within '/_work/2'.
For more information about predefined variables, please check here. For release variables, please check here.
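
Putting that together, a sketch of the release-time script; the 'staging' folder mirrors the $(Agent.BuildDirectory)/staging example above, and the file name is a placeholder:

# Create the 'staging' folder described above, under the agent's build directory (e.g. /_work/2).
$staging = Join-Path $env:AGENT_BUILDDIRECTORY 'staging'
New-Item -ItemType Directory -Path $staging -Force | Out-Null

# 'payload.json' stands in for one of the JSONs the script produces.
@{ mydata = 'some stuff' } | ConvertTo-Json | Set-Content -Path (Join-Path $staging 'payload.json')

# Later in the same script, upload from $staging to Blob Storage
# (see the Set-AzStorageBlobContent sketch near the top of this page).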