Azure Data Factory - Continuous deployment using VSTS

I need to know how I can build continuous deployment for Azure Data Factory using VSTS. I know there is an Azure Data Factory deployment task available in VSTS Release, but I'm looking for other options that use PowerShell for the deployment.
If anyone has already done something along these lines, please share the links.

This blog should get you started. I'm using a comparable method for deployment. Before deploying the JSON files with a PowerShell command, I edit them to insert environment-specific values into the Data Factory definitions. You can pass these values in as parameters from the TFS deployment pipeline.
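A minimal sketch of that token-replace-then-deploy step, assuming ADF V2 and the Az.DataFactory module; the file names, token name and resource names below are made up for illustration:

```powershell
# Replace environment-specific placeholder tokens in the exported JSON definitions,
# then push them to the target Data Factory. Token, path and resource names are
# illustrative only.
param(
    [string]$ResourceGroupName = 'rg-adf-test',
    [string]$DataFactoryName   = 'adf-test',
    [string]$StorageConnectionString
)

$definitionFolder = "$PSScriptRoot\datafactory"

Get-ChildItem -Path $definitionFolder -Filter '*.json' | ForEach-Object {
    $json = Get-Content -Path $_.FullName -Raw
    # Insert the environment-specific value passed in from the release pipeline
    $json = $json -replace '__StorageConnectionString__', $StorageConnectionString
    Set-Content -Path $_.FullName -Value $json
}

# Deploy a single pipeline definition from the patched JSON file
Set-AzDataFactoryV2Pipeline -ResourceGroupName $ResourceGroupName `
    -DataFactoryName $DataFactoryName `
    -Name 'CopySalesData' `
    -DefinitionFile "$definitionFolder\CopySalesData.json" `
    -Force
```

Linked services and datasets can be pushed the same way with Set-AzDataFactoryV2LinkedService and Set-AzDataFactoryV2Dataset.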

Related

Azure Data Factory CI/CD Improvements integrated with DevOps

I've been reading the following link.
https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery-improvements
It mentions using npm to build the Data Factory ARM templates and then using the resulting artefact to deploy to UAT/Prod etc., instead of using the adf_publish branch.
Has anyone got a sample Yaml file that does this?
Also, how would you handle overriding the ARM template parameters JSON file to switch parameterized values per environment, such as the environment's Key Vault, e.g. Dev-KV -> UAT-KV, Prod-KV?
This is what I did. I followed this article to get my YAML file set up; there is a GitHub repo in the article that has all of this person's code.
Azure Data Factory CI-CD made simple: Building and deploying ARM templates with Azure DevOps YAML Pipelines
Then I used Global parameters and referenced those everywhere in my pipelines. Here is a ref for that: Global parameters in Azure Data Factory
And finally, I used the overrideParameters option in my YAML pipeline to deploy the correct parameter values to each environment. Here is a ref for that: ADF Release - Set global params during deployment
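If you ever need to do the same thing outside the ARM deployment task, here is a rough PowerShell equivalent of what overrideParameters does. The parameter names (factoryName, the Key Vault global parameter) are illustrative, so check the ARMTemplateParametersForFactory.json generated by the npm build for the exact names your factory produces:

```powershell
# Take the parameter file generated by the npm build, swap in the environment-specific
# values, and deploy the factory template.
$paramFile = '.\ArmTemplate\ARMTemplateParametersForFactory.json'
$params = @{}
(Get-Content $paramFile -Raw | ConvertFrom-Json).parameters.PSObject.Properties |
    ForEach-Object { $params[$_.Name] = $_.Value.value }

# Override per environment (e.g. UAT); the global parameter name below is hypothetical
$params['factoryName'] = 'adf-uat'
$params['adf_properties_globalParameters_KeyVaultName_value'] = 'UAT-KV'

New-AzResourceGroupDeployment -ResourceGroupName 'rg-adf-uat' `
    -TemplateFile '.\ArmTemplate\ARMTemplateForFactory.json' `
    -TemplateParameterObject $params `
    -Mode Incremental
```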
Hope that helps!

Get all the datasets, linked services, and other resources related to a pipeline in Azure Data Factory

I am trying to migrate a pipeline from a data factory whose pipelines/datasets/linked services are related to other pipelines. To do this, I want to find all the datasets, linked services and other resources related to the pipeline that I want to migrate to a different data factory (different environment). What would be the way to do so? Secondly, how would you do it using an ARM template in release pipelines?
You can accomplish this task simply by importing/exporting ARM templates. This will export your dataset, linked service and pipeline settings. But if you are changing your data source and destination, you need to change them separately.
To export the pipeline configuration as a template, just go to that pipeline, click the three dots on the right side and choose the Export template option.
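If you want to enumerate a pipeline's dependencies yourself before exporting, here is a rough sketch using the Az.DataFactory cmdlets. It only looks at dataset references in activity inputs/outputs, and property names can differ slightly per activity type, so treat it as a starting point rather than a complete solution (resource names are illustrative):

```powershell
# List the datasets referenced by a pipeline's activities, and the linked services
# behind those datasets.
$rg  = 'rg-adf-dev'
$adf = 'adf-dev'
$pipeline = Get-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $adf -Name 'CopySalesData'

# Collect dataset references from the activities' inputs and outputs
$datasetNames = $pipeline.Activities |
    ForEach-Object { @($_.Inputs) + @($_.Outputs) } |
    Where-Object { $_ } |
    ForEach-Object { $_.ReferenceName } |
    Sort-Object -Unique

foreach ($dsName in $datasetNames) {
    $ds = Get-AzDataFactoryV2Dataset -ResourceGroupName $rg -DataFactoryName $adf -Name $dsName
    # Each dataset points at the linked service it uses
    [pscustomobject]@{
        Dataset       = $dsName
        LinkedService = $ds.Properties.LinkedServiceName.ReferenceName
    }
}
```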

Azure DevOps CI-CD pipeline for Azure Data Factory

I have created a CI-CD pipeline to deploy pipelines from my development data factory to my production data factory, and it works perfectly for the first run. However, when a change is made in dev, for example,
renamed pipeline1 to pipeline001
the production data factory ends up with a pipeline named pipeline001 and also pipeline1, both of which are the same. The issue persists for any kind of change and is not restricted to renaming.
I checked the JSON templates, which looked fine.
Any help?

Azure Data Factory - download JSON definitions

So I am facing the following problem: I have a bunch of Azure Data Factory V1 pipelines in one specific data factory, and these pipelines each have around 400 datasets.
I need to move all of them to a new resource group / environment and put their json definition in a git repo.
So my question is: how can I download all the pipeline definitions and all the dataset definitions for a data factory, in their JSON format, from Azure?
I don't want to click through each one and copy-paste from the Azure UI, as it would take ages.
Calling the REST API is a good way for both V1 and V2. See this doc.
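If you would rather script it than call the REST API directly, the classic AzureRM (V1) Data Factory cmdlets can dump something close to the JSON definitions. This is only a sketch: the resource names are placeholders and ConvertTo-Json reconstructs the definitions from the returned objects, so the output may not be byte-for-byte what the portal shows:

```powershell
# Dump all V1 pipeline, dataset and linked service definitions of a factory to JSON files.
$rg  = 'rg-adf-old'
$adf = 'adf-v1-factory'
$out = 'C:\adf-export'
New-Item -ItemType Directory -Path $out -Force | Out-Null

Get-AzureRmDataFactoryPipeline -ResourceGroupName $rg -DataFactoryName $adf | ForEach-Object {
    $_.Properties | ConvertTo-Json -Depth 20 |
        Set-Content -Path (Join-Path $out "pipeline_$($_.PipelineName).json")
}

Get-AzureRmDataFactoryDataset -ResourceGroupName $rg -DataFactoryName $adf | ForEach-Object {
    $_.Properties | ConvertTo-Json -Depth 20 |
        Set-Content -Path (Join-Path $out "dataset_$($_.DatasetName).json")
}

Get-AzureRmDataFactoryLinkedService -ResourceGroupName $rg -DataFactoryName $adf | ForEach-Object {
    $_.Properties | ConvertTo-Json -Depth 20 |
        Set-Content -Path (Join-Path $out "linkedservice_$($_.LinkedServiceName).json")
}
```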
For ADF V1, you can try using Visual Studio.
Connect via Cloud Explorer to your data factory.
Select the data factory and choose Export to New Data Factory Project
This is documented on SQL Server Central.
Another thing to try is to have Azure script out an ARM template of the Data Factory.
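A sketch of that route in PowerShell, assuming the Az modules and placeholder resource names; note that a resource-group export does not always capture every child resource of a factory, so check the result before relying on it:

```powershell
# Export an ARM template for the whole resource group, or just the Data Factory resource.
$rg = 'rg-adf-old'

# Whole resource group
Export-AzResourceGroup -ResourceGroupName $rg -Path 'C:\adf-export\rg-template.json'

# Only the Data Factory resource (V1 type shown; V2 factories use 'Microsoft.DataFactory/factories')
$df = Get-AzResource -ResourceGroupName $rg -ResourceType 'Microsoft.DataFactory/dataFactories' -Name 'adf-v1-factory'
Export-AzResourceGroup -ResourceGroupName $rg -Resource $df.ResourceId -Path 'C:\adf-export\adf-template.json'
```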

Best practice for scripting Azure resource creation

I'm creating a test environment in Azure. I want to have an accurate script of the configuration so it's easy to replicate for other test, pre-prod and prod environments later on. The subscription already exists, and I want the entire hierarchy of resources, from resource groups through to web apps, to be created by script.
I'm currently rolling my own script in PowerShell utilising AzureRm. This is working well, but I can't help feeling I'm reinventing the wheel. What is the existing method for creating an entire Azure environment by script?
Yes, that way is called Azure Resource Manager Templates. Quote:
With Resource Manager, you can create a template (in JSON format) that defines the infrastructure and configuration of your Azure solution. By using a template, you can repeatedly deploy your solution throughout its lifecycle and have confidence your resources are deployed in a consistent state.
When you create a solution from the portal, the solution automatically includes a deployment template. You do not have to create your template from scratch because you can start with the template for your solution and customize it to meet your specific needs.
You can retrieve a template for an existing resource group by either exporting the current state of the resource group, or viewing the template used for a particular deployment. Viewing the exported template is a helpful way to learn about the template syntax.
Reference: https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-group-overview#template-deployment
Edit: you can use PowerShell, the Azure CLI (1.0 or 2.0), or the Azure SDKs to deploy those templates (or simply use the Azure portal and search for "Template Deployment").
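For example, a minimal PowerShell deployment of such a template could look like this (the newer Az cmdlets are shown; New-AzureRmResourceGroupDeployment is the AzureRM-era equivalent, and the names and file paths are placeholders):

```powershell
# Create the resource group, then deploy an ARM template into it with an
# environment-specific parameter file.
New-AzResourceGroup -Name 'rg-myapp-test' -Location 'westeurope' -Force

New-AzResourceGroupDeployment -Name 'myapp-test-deployment' `
    -ResourceGroupName 'rg-myapp-test' `
    -TemplateFile '.\azuredeploy.json' `
    -TemplateParameterFile '.\azuredeploy.parameters.test.json' `
    -Verbose
```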