set up dependencies between 2 separate ADF environments - azure-data-factory

I want to know how to set up dependencies between 2 separate ADF environments.
I know this is possible through setting a trigger/web activity.
Are there any points of failure in scheduling the inter-ADF pipelines? We need to be 100% sure about this solution.

There are multiple ways to set up dependencies:
You can trigger an ADF pipeline via a Web activity, thereby linking the 2 factories (see the sketch after this list)
You can generate a file in blob storage at the end of the 1st pipeline and configure a storage event trigger for the other pipeline
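As a rough sketch of the Web activity approach (all subscription, resource group, factory, and pipeline names below are placeholders; it also assumes the upstream factory's managed identity has been granted rights, e.g. Data Factory Contributor, on the downstream factory):

{
  "name": "TriggerDownstreamPipeline",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DataFactory/factories/<downstream-factory>/pipelines/<pipeline-name>/createRun?api-version=2018-06-01",
    "method": "POST",
    "body": {},
    "authentication": {
      "type": "MSI",
      "resource": "https://management.azure.com/"
    }
  }
}

Note that createRun only queues the run and returns a run ID; it does not wait for the downstream pipeline to finish. So if you need end-to-end confirmation, you would have to poll the run status afterwards - that is one point of failure to design for.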

Related

Custom Release Pipeline Task with UI Control for Azure Devops

I am looking for a way to add a simple custom task with a button to the releases pipelines of Azure Devops (not the new yaml based pipelines).
What I basically need to do is pause the deployment and provide a button for a user to click that will bring them to a separate HTML page I am hosting in an S3 bucket. The button link would be dynamic (set at deployment time).
I create a small json file with Cloud Formation stack details and save it during the first agent job. From there I set a pipeline release variable.
# Pull the results URL out of the stack-info JSON saved earlier in this agent job
resultsUrl=$(jq -r .resultsUrl deploy-$(Release.ReleaseId)-$(Release.EnvironmentName)-stackInfo.json)
# Expose it to later tasks via a logging command
echo "##vso[task.setvariable variable=deploymentResultsUrl;]$resultsUrl"
Right now I have the Manual Intervention task in place - but since that is an agentless job - it does not pick up the environment variables I set in the previous agent job. The variable would need to be set at the time the release is triggered - but I won't know it then.
I know we can extend the UI for the new yaml based pipelines (I have already authored a few in house extensions). I need this to be in the "classic" release pipeline. That is where all (100s) of our deployment definitions live.

How to enable and control parallel execution of Azure DevOps Pipelines?

I am using a Windows self-hosted agent for my Azure DevOps pipelines. Currently the pipelines are executed sequentially: if more than one pipeline is triggered from different ADO projects, they have to wait in a queue for the agent. From some tutorials, I came to understand that pipelines can be executed in parallel if we increase the paid parallel jobs for the self-hosted agent under the Billing section of Organization settings. Is my understanding correct? If so, what precautionary steps do I need to take? Do we have any control over when the pipelines are executed in parallel?
Thanks.
In order to run self-hosted parallel jobs, you need to purchase parallel jobs and register several self-hosted agents.
For parallel jobs, you can register any number of self-hosted agents in your organization. If you want to run 3 jobs in parallel, then you must register at least 3 self-hosted agents in one agent pool. DevOps charges based on the number of jobs you want to run at a time, not the number of agents registered. There are no time limits on self-hosted jobs. For private projects, you can have one job and one additional job for each active Visual Studio Enterprise subscriber who is a member of your organization.
About how to purchase parallel jobs, please refer to Buy parallel jobs.
For how to control the use of parallel jobs, please refer to the following:
For a classic pipeline, you can specify when each job runs through the Dependencies and "Run this job" settings under Additional options of the agent job. The pipeline will then run the jobs in the order you configure.
For a YAML pipeline, you can specify the conditions under which a job should run with dependsOn and condition.
For example (a minimal sketch; the job names and script steps are placeholders, not from the original answer):
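jobs:
- job: A
  steps:
  - script: echo "runs first"
- job: B
  dependsOn: A              # B waits for A...
  condition: succeeded()    # ...and only runs if A succeeded
  steps:
  - script: echo "runs after A"
- job: C                    # no dependsOn, so C can start alongside A (given enough parallel jobs)
  steps:
  - script: echo "runs in parallel with A"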
For more info about conditions, please refer to Specify conditions
If you don't specify a specific order, the jobs will run in parallel based on the parallel jobs you purchased.
I don't know if my experience can help. I'll try. I started a new job and we use self-hosted TFS / Azure DevOps. I am changing our build process to create 3 product SKUs (it uses conditional compilation). Let's call them Good, Better & Best.
I edited the Build definition. First I switched to the Variables tab. I created a Process variable named SKUs and set it to Good,Better,Best. The commas are important.
Next I switched to the Tasks tab. I located the Agent Phase. Mine was called Phase 1. Select it. On the right, under Parallelism, I selected Multi-configuration. In the Multipliers text field I entered SKUs. I set Maximum number of agents to 3.
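If anyone needs the same thing in a YAML pipeline: the classic Multi-configuration option corresponds to a matrix strategy there. A rough sketch of the setup above (the echo step is a placeholder for the real build):

jobs:
- job: Build
  strategy:
    matrix:
      Good:
        SKU: Good
      Better:
        SKU: Better
      Best:
        SKU: Best
    maxParallel: 3                        # mirrors "Maximum number of agents"
  steps:
  - script: echo "building the $(SKU) SKU"   # placeholder for the real build step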
What I don't yet know is the TFS back-end administration and options that the company purchased beforehand.

How to run multiple Copy Files tasks in an Azure DevOps Release pipeline simultaneously with Custom Conditions?

I am using Azure DevOps Server 2020 and I have a release pipeline which has around 21 Copy Files tasks in it to copy the output of multiple microservices to different target paths, and this takes around 23 mins to complete the release pipeline.
I want to optimize the release pipeline and save some time, and thus I am thinking of running all the copy tasks simultaneously.
Under the copy tasks, in the Control Options section, I see the Run this task option is available, where we do have the option to define custom conditions, but I am not sure which custom conditions I need to define so that all my copy tasks get executed in parallel.
Could anyone please let me know what custom conditions will allow all the copy tasks to be executed in one go?
Currently it is not possible to have tasks run in parallel. It has been raised as a suggestion here, but the feature hasn't been implemented.
Just as TheWinterCoder pointed out, it is currently not possible to have tasks run in parallel.
But, as a workaround, you could divide the copy tasks into several different jobs and make the jobs run in parallel (sketched below).
This requires you to have multiple agents available in the local agent pool.
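Classic release jobs are configured in the UI rather than in code, but the shape of the workaround is the same as in this YAML sketch (the service names and target paths are invented for illustration). Jobs with no dependsOn between them can start at the same time, provided enough agents and parallel jobs are available:

jobs:
- job: CopyServiceA
  steps:
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(System.DefaultWorkingDirectory)/_drop/ServiceA'   # hypothetical path
      Contents: '**'
      TargetFolder: '\\fileserver\deploy\ServiceA'                       # hypothetical path
- job: CopyServiceB          # no dependsOn, so it can run alongside CopyServiceA
  steps:
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(System.DefaultWorkingDirectory)/_drop/ServiceB'
      Contents: '**'
      TargetFolder: '\\fileserver\deploy\ServiceB'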

Build agent metric in Azure Devops pipelines

We pay for a number of Microsoft hosted build agents in Azure pipelines. We have a lot of build pipelines, where many of them do jobs in parallel.
Are there any metrics I can use to see the utilization of the build agents and, even more interesting, how many jobs are queued waiting for a free build agent?
Since this would be for the whole Azure DevOps instance, the Dashboard feature doesn't seem appropriate, because it only seems to hold project-specific metrics.
Go to your Organization Settings - Parallel jobs blade. This will give you the ability to view the jobs in progress.
As for metrics, a public preview for this just came out; however, I do not have it available yet.
Agent pool usage data is sampled and aggregated by the Analytics service every 10 mins. The number of jobs is plotted based on the max number of running jobs for the specified interval of time.
This feature is enabled by default. To try it out, follow the guidance below.
- Within project settings, navigate to the pipelines "Agent pools" tab
- From the agent pool, select a pool (e.g., Azure Pipelines)
- Within the pool, select the "Analytics" tab

How to apply customer-specific configuration during VSTS release?

We would like to try building a release pipeline for our product in VSTS - however, our product requires a separate instance of the application per customer (there is some legacy in the picture here :)). What we THINK we want is a process like this:
For each customer:
Update DB schema
Configure a container with customer-specific configuration, etc.
Publish the container into Azure Container Registry
Deploy the container in Azure Container Service (OR on-prem if the customer runs on-prem)
The configuration can be multiple things: Extensions of the API in the application (new DLLs basically), connection strings, ...
I figure we can do this fairly easily using a custom PowerShell script, but I would like to not write anything custom (at least for the "looping" issue) if I don't have to. We could also create separate environments in VSTS for each customer, but that seems quite unmaintainable with well over 100 customers.
Some additional details:
- There's a separate DB per customer
- There's two separate web applications per customer
So what's the best practice here? Any advice? Thanks! :-)
You could think of doing it in two ways.
1 - By creating one environment for each customer. This way you could have the exact same tasks for each environment, or have the flexibility to change steps in a particular environment.
This approach would also give you the ability to use a flow pipeline, because your build will be released only after it passes your internal QA and other processes.
To do it easily, you could also create task groups to reuse them in each environment.
2 - The other way is to create separate releases for each customer or group of customers. This also gives you the same flexibility; you can reuse your builds, but you have to add some extra steps to make sure you are using the right build, since you can choose any build when you create a release, which you can do manually.
Updated
A third option could be to create one environment for all customers and then have one deployment agent installed per customer, using all of them in the same deployment group. Then have one variable file per customer, named after the agent, and a PowerShell script that uses the agent name variable to find which file to load (sketched below). This PowerShell script would then apply all your individual configurations.
In that case, I suspect that you would end up doing almost all your deployment in PowerShell, which could be more time consuming for you to maintain. You also have to keep in mind that in this particular scenario you would update all your customers at the same time, because all agents would be in the same deployment group.
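A minimal sketch of that per-agent lookup, assuming a layout where the release artifact carries one variable file per customer named after its agent (the file name and properties are illustrative, not from the original answer):

# Hypothetical: each deployment-group agent is named after its customer,
# and a matching variable file (e.g. customer42.json) ships in the artifact.
$configPath = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY "config\$($env:AGENT_NAME).json"
$config = Get-Content $configPath -Raw | ConvertFrom-Json

# Apply whatever customer-specific settings the file defines (assumed properties).
Write-Host "Deploying for customer '$($config.customerName)'"
Write-Host "##vso[task.setvariable variable=connectionString;issecret=true]$($config.connectionString)"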