Azure Data Factory (ADF) unit testing on a feature branch without having to deploy changes to the data factory instance - nunit

Is there a way to unit test individual pipelines in Azure Data Factory on a particular branch without having to deploy my changes? Currently, the only way I am able to run unit tests on ADF pipelines is by publishing my changes to the data factory instance and kicking off a pipeline run. However, this approach requires me to merge my changes into the collaboration branch before I can execute any pipeline test cases.
Ideally I'd like to be able to kick off a pipeline on a particular feature branch without having to deploy to the default instance, so that I can validate my test cases and make adjustments before merging with the collaboration branch.
Any suggestions people can give or resources they can point to?

I think you can try going for automated unit testing for ADF; this will enable you to write code so that the tests run before deployment.
You can check a sample for the same here
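For illustration, a minimal sketch of such a test, assuming Pester and the Az.DataFactory module are installed and Connect-AzAccount has been run; the resource group, factory, and pipeline names are placeholders, and the run still executes against a published pipeline:

    # Pester sketch: trigger a pipeline run and assert it succeeds.
    Describe "CopyCustomers pipeline" {
        It "completes successfully" {
            $runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "rg-adf-dev" `
                -DataFactoryName "adf-dev" -PipelineName "CopyCustomers"
            # Poll until the run reaches a terminal state.
            do {
                Start-Sleep -Seconds 15
                $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName "rg-adf-dev" `
                    -DataFactoryName "adf-dev" -PipelineRunId $runId
            } while ($run.Status -in @("Queued", "InProgress"))
            $run.Status | Should -Be "Succeeded"
        }
    }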

At the time of writing it is not possible to command the Azure Data Factory API to run pipelines in debug mode. You can call the published pipelines through the REST API. If you do not want to modify the published configuration of Azure Data Factory, you would need multiple data factories.
We use one ADF per engineer, and one ADF per higher-level environment.
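If you take the REST API route, a rough sketch of starting a published pipeline run; the subscription, resource group, factory, and pipeline names are placeholders:

    # Start a published pipeline run through the ADF REST API (api-version 2018-06-01).
    $token = (Get-AzAccessToken -ResourceUrl "https://management.azure.com").Token
    $uri = "https://management.azure.com/subscriptions/<subscription-id>" +
           "/resourceGroups/rg-adf-dev/providers/Microsoft.DataFactory" +
           "/factories/adf-dev/pipelines/CopyCustomers/createRun?api-version=2018-06-01"
    # createRun returns the runId of the new pipeline run.
    $response = Invoke-RestMethod -Method Post -Uri $uri `
        -Headers @{ Authorization = "Bearer $token" } `
        -ContentType "application/json" -Body "{}"
    $response.runId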

Related

Azure Data Factory development with multiple users

Can anyone help me with how to lock a pipeline in ADF? Is there any option so that when one developer is working on a pipeline, others cannot work on it? Multiple developers are working on the same pipeline without using source control.
Unfortunately, there is no feature in the Azure portal for Azure Data Factory to lock pipeline changes when two or more people are working on the same pipeline. You would have to create clones of the existing pipeline and work on those; otherwise, the best way is to use source control such as Git.

How to automate Azure data factory pipeline deployments

I want to automate Azure data factory pipeline deployments.
I have self-hosted integration runtimes with a different name in each environment (i.e. SHIR-{environment}).
I have different data sources and destinations for each environment (i.e. different SQL Server names or hostnames).
How can I perform automatic weekly deployments to promote changes from the GitHub dev branch to stage, and from stage to production? I don't want to modify the database server names in linked services during the GitHub PR merge.
To set up automated deployment, start with an automation tool, such as Azure DevOps. Azure DevOps provides various interfaces and tools in order to automate the entire process.
A development data factory is created and configured with Azure Repos Git. All developers should have permission to author Data Factory resources like pipelines and datasets.
A developer creates a feature branch to make a change. They debug their pipeline runs with their most recent changes. For more information on how to debug a pipeline run, see Iterative development and debugging with Azure Data Factory.
After a developer is satisfied with their changes, they create a pull request from their feature branch to the main or collaboration branch to get their changes reviewed by peers.
After a pull request is approved and changes are merged into the main branch, the changes get published to the development factory.
When the team is ready to deploy the changes to a test or UAT (User Acceptance Testing) factory, the team goes to their Azure Pipelines release and deploys the desired version of the development factory to UAT. This deployment takes place as part of an Azure Pipelines task and uses Resource Manager template parameters to apply the appropriate configuration (see the sketch below).
After the changes have been verified in the test factory, deploy to the production factory by using the next task of the pipelines release.
For more information, follow this link.
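As a rough sketch of that deployment step: the template and parameter file names below match what ADF generates on publish, while the resource group, factory name, and the linked-service override parameter are illustrative and will differ per factory (check your generated ARMTemplateForFactory.json for the actual parameter names):

    # Deploy the ARM template exported by ADF, overriding environment-specific values
    # so linked services point at the right SQL server for this environment.
    New-AzResourceGroupDeployment `
        -ResourceGroupName "rg-adf-uat" `
        -TemplateFile "ARMTemplateForFactory.json" `
        -TemplateParameterFile "ARMTemplateParametersForFactory.json" `
        -factoryName "adf-uat" `
        -AzureSqlLinkedService_connectionString (ConvertTo-SecureString `
            "Server=sql-uat;Database=appdb;" -AsPlainText -Force) `
        -Mode Incremental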

Selective Deployment in Azure Data Factory (ADF)?

I am using the npm-package-based CI/CD approach for ADF. I want to selectively deploy some pipelines and datasets to prod instead of deploying everything in the repository.
Is there a PowerShell script to which I can pass a list of the ADF objects I want to deploy from my CI/CD pipeline?
If there is any way other than PowerShell, please let me know that as well.
As per the official documentation, Data Factory entities depend on each other.
For example, triggers depend on pipelines, and pipelines depend on datasets and other pipelines. Selective publishing of a subset of resources could lead to unexpected behaviors and errors.
On rare occasions when you need selective publishing, consider using a hotfix.
Steps to deploy a hotfix
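For that rare hotfix case, one possible sketch is to push a named subset of objects with the Az.DataFactory cmdlets; the resource names and JSON paths are placeholders, and dependencies (for example, the datasets a pipeline references) must be deployed first, which is exactly the risk the documentation warns about:

    # Hotfix sketch: deploy only the listed datasets and pipelines from their JSON definitions.
    $rg = "rg-adf-prod"
    $factory = "adf-prod"
    foreach ($name in @("DS_SourceSql", "DS_SinkBlob")) {
        Set-AzDataFactoryV2Dataset -ResourceGroupName $rg -DataFactoryName $factory `
            -Name $name -DefinitionFile "dataset/$name.json" -Force
    }
    foreach ($name in @("PL_CopyCustomers")) {
        Set-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $factory `
            -Name $name -DefinitionFile "pipeline/$name.json" -Force
    }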

How to configure Azure DevOps Pipeline decorators to run pre-tasks in classic pipelines?

We have a custom Azure DevOps extension that injects SonarQube pipeline tasks into every definition using the Pipeline Decorator feature. These tasks are a mixture of both pre- and post-tasks.
In YAML-defined pipelines the tasks run perfectly; however, in classic pipeline definitions only the post-tasks run, although the classic and YAML pipelines are defined identically (steps, agents, demands, variables, etc.).
As this is a relatively new feature of Azure DevOps, there is a lack of documentation, especially regarding classic pipelines.
Is there something that we could possibly be missing for this to happen?
This seems to be an issue on our side, and it only affects the SonarCloud/SonarQube prepare task when it is applied through a decorator.
As you know, a decorator uses a YAML template for the steps to be inserted at the specified location, and on our backend this template file is processed by the YAML template engine.
By design, after you enable pipeline decorators at the organization level, the Initialize job calls a backend class to get the JobContext, which adds decorator providers to the JobContext. The JobContext then uses these providers to fetch contributions and add pre/post tasks to the job while preparing it to run.
However, the Sonar prepare task cannot be detected by the engine and injected into the JobContext. The reason I point to this specific task is that, so far, this abnormality only exists in the prepare task of SonarCloud and SonarQube.
Our team will investigate and work on a fix together with the Sonar team.
For now, there are two workarounds you could consider.
Workaround 1:
As mentioned above, the prepare task cannot be detected and injected into the JobContext automatically. The first workaround is to add this task info into the JobContext by adding the prepare task to the agent job yourself.
The disadvantage is that the job will load two prepare tasks: one executes in the pre-job phase, and the other runs again afterwards.
Workaround 2:
Build your pipeline with YAML until we fix this abnormality, so that it will not fail due to the missing prepare task.
We will update the status here once we have any progress.

Deployment scenario of Git-integrated Azure Data Factory via ARM template

What happens if you have multiple features being tested in the test environment of an ADF v2 data factory and only one or a few of them are ready for production deployment? How do we handle this type of deployment scenario in the Microsoft-recommended CI/CD model of Git/VSTS-integrated ADF v2 through ARM templates?
Consider dev, test, and prod environments of ADF v2. The dev environment is Git-integrated. The developers have debugged their changes and merged them into the collaboration branch after a pull request. The changes are published and deployed to the test environment first. Here many features are being tested, but only a few are ready for prod; how do we move the ones that are ready, since the ARM template takes the entire factory?
This is somewhat of a strange question; you can apply the same logic to anything. How do you release a single feature for an application when the application is only deployed as a single entity? The answer would be: use Git flow or something akin to it. Use feature branches and promotions.