Azure Data Factory - LiveCodeBackup - PowerShell

I have 150 pipelines and many datasets defined in Azure Data Factory. I want to take a backup of the live code every day and save it to the code repository.
What is the best approach to back up the code/pipelines/datasets/linked services to a Git repository? Are there any APIs available for the task?
Can we achieve this with PowerShell? If yes, please share the PS code if you have it handy.
Appreciate your help.

Set up a code repository in Azure Data Factory; you can then save all your resources to Azure DevOps or GitHub and get the backup.
Please follow the references below; they have a detailed explanation of GitHub code repository setup and of backing up and restoring Azure Data Factory.
Alternatively, you can back up and restore Azure Data Factory using an ARM template.
Reference:
https://www.sqlshack.com/using-source-control-in-azure-data-factory/
How to Backup and Restore Azure Data Factory from ARM Template?
https://www.youtube.com/watch?v=X5uMYO06aMI
How to Backup and Restore Azure DevOps code repositories?
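Since the question asks for PowerShell: below is a minimal sketch of a daily export using the Az.DataFactory cmdlets. The resource group, factory name, and local repo path are placeholders, and the JSON written out is the raw cmdlet output rather than an exact ARM template, so treat it as a starting point rather than a finished backup tool.

```powershell
# Minimal backup sketch (assumes the Az.DataFactory module is installed and
# Connect-AzAccount has been run; all names below are placeholders).
$rg        = 'my-resource-group'
$df        = 'my-data-factory'
$backupDir = 'C:\adf-backup'          # a local clone of your git repo

$exports = @{
    pipelines      = Get-AzDataFactoryV2Pipeline      -ResourceGroupName $rg -DataFactoryName $df
    datasets       = Get-AzDataFactoryV2Dataset       -ResourceGroupName $rg -DataFactoryName $df
    linkedServices = Get-AzDataFactoryV2LinkedService -ResourceGroupName $rg -DataFactoryName $df
}

foreach ($kind in $exports.Keys) {
    $dir = Join-Path $backupDir $kind
    New-Item -ItemType Directory -Path $dir -Force | Out-Null
    foreach ($item in $exports[$kind]) {
        $item | ConvertTo-Json -Depth 50 |
            Set-Content -Path (Join-Path $dir "$($item.Name).json")
    }
}

# Commit today's snapshot; schedule this script (e.g. in Azure Automation)
# to run once a day.
git -C $backupDir add .
git -C $backupDir commit -m "ADF backup $(Get-Date -Format yyyy-MM-dd)"
git -C $backupDir push
```

Note that if you connect the factory to a git repository as described above, the authoring UI keeps the repo in sync for you and a separate export script like this is only needed for the live (published) mode.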

Related

Azure Data Factory development with multiple users

Can anyone help me with how to lock a pipeline in ADF? Is there any option so that while one developer is working, others cannot, since multiple developers are working on the same pipeline without using source control?
Unfortunately, there is no feature in the Azure portal for Azure Data Factory to lock pipeline changes when two or more people are working on the same pipeline. You would have to create clones of the existing pipeline and work on those; otherwise, the best way is to use source control such as Git.
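As a stopgap without source control, the clone approach above can be scripted. A hedged sketch using the Az.DataFactory cmdlets; the pipeline names are placeholders, and the exported JSON may need minor cleanup before it re-registers successfully:

```powershell
# Clone 'SalesLoad' to 'SalesLoad_dev2' so a second developer has a private
# copy (placeholder names; assumes Az.DataFactory and an active Az session).
$src = Get-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $df -Name 'SalesLoad'

@{ name = 'SalesLoad_dev2'; properties = $src.Properties } |
    ConvertTo-Json -Depth 50 | Set-Content 'clone.json'

Set-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $df `
    -Name 'SalesLoad_dev2' -DefinitionFile 'clone.json'
```

The ADF authoring UI also has a Clone action on each pipeline that does the same thing interactively.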

Azure DevOps : I want to add a task in my pipeline that can copy some files from my Azure Repo into an On premise VM. Any leads?

I have a requirement to create an Azure DevOps pipeline that can copy files from my Azure Repo to a path on an on-premises VM (a SQL Server, to be precise). Could anyone advise on how to get started on this?
You would need to add a checkout task to the pipeline. You would define the repo as a source and then add a step to check out the repo. Here's some documentation about checking out multiple repos with YAML that should get you started.
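The usual pattern for reaching an on-premises machine is a self-hosted agent installed on the VM (or a deployment group); the pipeline then checks out the repo on that machine and copies the files locally. A sketch, assuming a pool named OnPremPool and illustrative paths:

```yaml
# azure-pipelines.yml (sketch; pool name and paths are placeholders)
pool:
  name: OnPremPool          # self-hosted agent pool whose agent runs on the VM

steps:
- checkout: self            # clones the Azure Repo onto the VM's agent
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/sql-scripts'
    Contents: '**'
    TargetFolder: 'D:\deploy\sql-scripts'
```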

How to deploy SQL Database in Bitbucket pipeline to Azure

Asking for an opinion or direction on the current problem.
We are using a Bitbucket pipeline to deploy CI/CD web applications to Azure. What remains is the database, which is also hosted on Azure.
From my research, everything on SQL Database Project deployments usually uses Azure DevOps pipelines (connects to a GitHub repo, allows multiple environments, and has a built-in SQL agent that deploys the SQL DB to the target server via a DACPAC file). It allows CI with every check-in, every time you push changes. Nice!
But what if we cannot (for some reason) use Azure DevOps and have to use Bitbucket Pipelines instead? Is that possible? How? Via scripting? A tool to call on the command line? Any help is highly appreciated.
It's true that in Azure DevOps it is easier to deploy an (Azure) SQL Database, as Azure DevOps offers many tasks (including third-party custom tasks you can find in the Microsoft Marketplace).
However, no matter what tool you use, you should be able to do the same thing once you know the deployment concept for the specific service.
I don't know Bitbucket very well, but I bet the product can execute commands, including PowerShell. If so, you need two steps in your pipeline to publish an Azure SQL database:
1) Create the server and an (empty) database. Perhaps Bitbucket offers a task for creating services in Azure (from an ARM template or otherwise). If not, you can always use the CLI or PowerShell to do so. More info: the `az sql server` commands.
2) Deploy the database, or changes to it. This step compares a DACPAC file (the compiled form of a SQL Server database project) to the target database on the server. The result is a differential T-SQL script that is then executed against the target database. The standard tool for this is sqlpackage.exe, provided by Microsoft. You can find the full documentation here and plenty of examples of how to use it on the Internet.
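In Bitbucket Pipelines, those two steps become plain script lines calling the az CLI and sqlpackage. A sketch, assuming a build image with both tools installed and repository variables holding the credentials; the server, database, and file names are placeholders:

```yaml
# bitbucket-pipelines.yml (sketch; names and variables are placeholders)
pipelines:
  branches:
    main:
      - step:
          name: Deploy Azure SQL database
          script:
            # Step 1: make sure the (empty) target database exists
            - az sql db create --resource-group my-rg --server my-server --name MyDb
            # Step 2: diff the dacpac against the target and apply the changes
            - sqlpackage /Action:Publish /SourceFile:output/MyDb.dacpac /TargetServerName:my-server.database.windows.net /TargetDatabaseName:MyDb /TargetUser:$DB_USER /TargetPassword:$DB_PASSWORD
```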
Let me know if that helps.

How to read code from Azure DevOps Git repository contents from within Azure Data Factory v2

I am not trying to set up a git repository for my data factory; rather, I want my data factory to read some script files from an already existing Azure DevOps git repository. I didn't find any information about this. Most of the available information is about how to set up a git repository for ADF v2, and there is also no connection type for Azure DevOps. Is there any workaround for this? My goal is to make my data factory more dynamic by fetching some scripts directly from the DevOps git repo.
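One possible workaround (not an official ADF connector): Azure DevOps exposes repository contents over REST via the Git "Items" endpoint, which an ADF Web activity or HTTP linked service can call with a personal access token. A sketch of the request; the organization, project, repo, file path, and PAT variable are all placeholders:

```powershell
# Fetch a raw file from an Azure DevOps git repo over REST - the same request
# an ADF Web activity can make (all identifiers are placeholders).
$pat  = $env:AZDO_PAT
$auth = 'Basic ' + [Convert]::ToBase64String(
            [Text.Encoding]::ASCII.GetBytes(":$pat"))

$url = 'https://dev.azure.com/myorg/myproject/_apis/git/repositories/myrepo/items' +
       '?path=/scripts/init.sql&api-version=6.0'

Invoke-RestMethod -Uri $url -Headers @{ Authorization = $auth }
```

In ADF itself, put the same URL in a Web activity and the Authorization header in its headers, ideally sourcing the PAT from Key Vault rather than hard-coding it.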

CI/CD components information needed

We have a requirement to implement CI/CD for the below PaaS components via VSTS build-release pipelines:
1) Azure Automation
2) Azure Data Warehouse
3) ADF v1
4) ADF v2
5) Key Vault
6) Azure Storage
Any document pertaining to any of the above components would be very helpful.
I am looking for documentation specific to the build & release pipelines.
Any help is much appreciated.
It's not really a good idea to post such a broad question, but have you tried the official documentation?
Azure DevOps
Azure Pipelines
Azure Key Vault
Azure Storage
Azure SQL Data Warehouse
Azure Active Directory V1
Azure Active Directory V2
Also try the chat service for Azure DevOps at aka.ms/devchat. They are usually helpful. :)