Conditional component - google-cloud-data-fusion

Is there a way to get the status (failed / completed) of a particular pipeline, say P1, in a conditional component in pipeline P2?
Can we call a pipeline from a conditional component?
Use case:
I have functional pipelines F1, F2, F3, etc., and two audit pipelines, audit_success and audit_failure. If I could get F3's status in a single audit pipeline, I could have two branches in the same pipeline, thereby avoiding the creation of two pipelines.

There is no conditional component that checks for another pipeline's status. However, you can achieve this through pipeline triggers, but as you mentioned it does require two different pipelines.
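That said, if orchestration outside the pipeline (for example a script or a Cloud Function) is acceptable, the underlying CDAP REST API can report another pipeline's last run status. Below is a minimal sketch, not an official Data Fusion recipe: the endpoint, port, and lack of auth are assumptions (a real Data Fusion instance requires its API endpoint and an authorization token), and `F3` stands in for your pipeline name.

```python
import json

# Assumed CDAP router endpoint; for a Data Fusion instance you would use the
# instance's API endpoint and attach an authorization token (both omitted here).
CDAP_BASE = "http://localhost:11015/v3"

def runs_url(pipeline, namespace="default", limit=1):
    # Deployed batch pipelines run as the DataPipelineWorkflow program, so the
    # run history is exposed under that program's /runs resource.
    return (f"{CDAP_BASE}/namespaces/{namespace}/apps/{pipeline}"
            f"/workflows/DataPipelineWorkflow/runs?limit={limit}")

def latest_status(runs_json):
    # The runs endpoint returns runs newest-first; status values include
    # COMPLETED, FAILED and KILLED.
    runs = json.loads(runs_json)
    return runs[0]["status"] if runs else None
```

With something like this, a single audit pipeline (or a script in front of it) could branch on whether the latest F3 run reported FAILED or COMPLETED, instead of relying on two success/failure trigger targets.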

Related

Can we call the pipeline in parallel and have multiple instances running?

I have a scenario where I have to execute a pipeline from different pipelines to log the validations.
So I'm planning to have a single pipeline for all the validations (counts, duplicates, count drops, etc.), and this pipeline should be triggered when a particular table's execution completes.
For example: there are two pipelines, P1 and P2, which both invoke this validation pipeline upon completion, so this validation pipeline may be triggered twice at the same time.
Can we run a pipeline like this? Is any lock applied automatically?
You can reuse a generic pipeline from other pipelines and call it in parallel; there is no locking involved.
Just make sure the generic pipeline is allowed parallel executions, otherwise the runs will be queued.
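If the parallel runs need to be distinguished, the CDAP start endpoint accepts runtime arguments in the POST body, so each caller can tag its own run of the shared validation pipeline. A hedged sketch — the endpoint is assumed as in a plain CDAP setup, and the `source.pipeline` argument name is a made-up convention, not a built-in:

```python
import json

CDAP_BASE = "http://localhost:11015/v3"  # assumed CDAP/Data Fusion API endpoint

def start_request(pipeline, caller, namespace="default"):
    # POSTing to the workflow's /start resource launches a new run; the JSON
    # body becomes runtime arguments, so each caller (P1, P2, ...) can mark
    # which upstream pipeline invoked the shared validation run.
    url = (f"{CDAP_BASE}/namespaces/{namespace}/apps/{pipeline}"
           f"/workflows/DataPipelineWorkflow/start")
    body = json.dumps({"source.pipeline": caller})
    return url, body
```

P1 and P2 would each issue this POST on completion; whether the two runs execute concurrently or queue depends on the pipeline's parallel-execution setting, as noted above.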

Azure DevOps multi-stage pipeline, trigger specific stage by specific artifact change

The multi-stage pipeline looks like this:
A -> B -> C
Stage A consumes artifact a.
Stage B consumes artifact b.
Stage C consumes artifact c.
An artifact can be a repository, a pipeline, etc.
How can I trigger only Stage B when artifact b changes?
It looks like you need a per-stage trigger, which is not possible to achieve at the moment. Please check this topic on the developer community.
Sorry, I should have clarified a bit more. I don't think triggers will solve all use cases, but having triggers on stages would at least allow the following:
Stage A has trigger /src/appA
Stage B has trigger /src/appB
If you committed (script, code, etc.) to /src/appB, it should reuse the previous artifacts and only build appB and further stages if requested.
This is mentioned in the comment and was left without feedback from MS; however, it is not possible with existing Azure DevOps functionality.

Is it possible to use $(RELEASE.TRIGGERINGARTIFACT.ALIAS) as a condition to trigger different release stages?

My problem is almost same as this question. However, I don't really get the comments or the solutions proposed there.
Say I have two artifacts: A1 and A2.
A1---> Dev stage
A2---> UAT stage.
What I want is that when A1 is released, only the Dev stage is deployed. But at the moment, when A1 is released, both Dev and UAT are triggered.
Looking at the comments on the previous question:
How can $(RELEASE.TRIGGERINGARTIFACT.ALIAS) be used as a trigger condition? Looking at the filters, I can't see a place to put a custom condition.
If I create a third artifact as suggested, how can I set a different tag based on whether A1 or A2 is built?
Are there other ways to solve this?
Open the Dev stage -> click the Agent job -> expand Additional options and select "Custom condition using variable expressions" -> add the condition eq(variables['RELEASE.TRIGGERINGARTIFACT.ALIAS'], 'A1'). The Dev stage then only runs when the value of RELEASE.TRIGGERINGARTIFACT.ALIAS is A1; otherwise the stage is skipped.
Apply the same configuration in the UAT stage, with the condition eq(variables['RELEASE.TRIGGERINGARTIFACT.ALIAS'], 'A2').
In my release pipeline, I update the artifact _test to trigger the release pipeline; the variable RELEASE_TRIGGERINGARTIFACT_ALIAS is _test, so it runs Stage 1 and skips Stage 2.
This might not be the exact answer, but it can still work as a solution to your problem.
You can tag your builds depending on the produced artifact name. If the build produces artifact A1, add a tag A1 to it. That way you can set up release stage conditions based on artifact filters like this:
For the Dev stage trigger, filter build tags by A1 and for the UAT trigger, filter by A2.
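The tagging itself can be automated from the build via the REST API (Builds - Tags - Add Build Tag): an authenticated PUT with no body attaches a tag to a finished build. A small sketch of the URL construction — the organization, project, and build id are placeholders:

```python
def add_tag_url(organization, project, build_id, tag):
    # Builds - Tags - Add Build Tag: an authenticated PUT to this URL (empty
    # body) attaches `tag` (e.g. the artifact name A1) to the finished build,
    # which the release stage's artifact tag filter can then match.
    return (f"https://dev.azure.com/{organization}/{project}"
            f"/_apis/build/builds/{build_id}/tags/{tag}?api-version=6.0")
```

A build script would call this URL with the produced artifact's name as the tag, so the Dev/UAT filters pick up A1 or A2 automatically.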

Pass Azure devops release pipeline(Classic editor) output variable to multiple jobs in same stage or to multiple stages outside

I am using the release pipeline Classic editor and need to pass an output variable generated in a task to multiple jobs in the same stage, or to other stages. Currently, this output variable is available only inside the same job, so I have to repeat the same task in multiple jobs and stages, which feels redundant. Is there any way to avoid this?
In the Classic editor, I am afraid that consuming output variables in a different job is not feasible. Please refer to this document.
As a workaround, you can share variables across jobs and stages via variable groups.
First define the variable in the variable group, then update the variable group through the REST API or the Azure CLI, replacing the defined variable with the value generated by the task.
PUT https://dev.azure.com/{organization}/{project}/_apis/distributedtask/variablegroups/{groupId}?api-version=5.1-preview.1
Here is a case about updating a variable group with a PowerShell script.
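The same update can be sketched in Python against the PUT endpoint above. One caveat worth showing: the PUT replaces the whole group, so the body must resend the group's name, type, and every variable, not just the one that changed. Group name and ids below are hypothetical:

```python
import json

def variable_group_update(organization, project, group_id, group_name, variables):
    # The PUT replaces the variable group, so every variable must be included
    # in the body, not only the one whose value changed.
    url = (f"https://dev.azure.com/{organization}/{project}/_apis/distributedtask"
           f"/variablegroups/{group_id}?api-version=5.1-preview.1")
    body = json.dumps({
        "id": group_id,
        "type": "Vsts",
        "name": group_name,
        "variables": {name: {"value": value} for name, value in variables.items()},
    })
    return url, body
```

A task in the first job would send this PUT (with a PAT or the job's access token for auth); later jobs and stages that link the variable group then see the updated value.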
Another workaround: you can share values across all of the stages by using release pipeline variables. The solution is to update the release definition for the release pipeline variable in the stage where the variable is set.
1. Define a variable in the release definition's Variables.
2. Use the REST API Definitions - Update to update the value of the release definition variable in the agent job.
3. Use the updated value of the release definition variable in the next agent job.
For details on using the REST API to update the value of the release definition variable, you can follow this ticket.
For detailed steps and a guide, please refer to this blog.
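This second workaround can be sketched as a GET/modify/PUT cycle against the release Definitions API. The sketch below only builds the URLs and mutates the definition body; ids, names, and the api-version are placeholder assumptions:

```python
def definition_url(organization, project, definition_id=None, api="5.1"):
    # GET a single definition by id; Definitions - Update PUTs the modified
    # definition body back to the collection URL.
    base = (f"https://vsrm.dev.azure.com/{organization}/{project}"
            f"/_apis/release/definitions")
    if definition_id is not None:
        return f"{base}/{definition_id}?api-version={api}"
    return f"{base}?api-version={api}"

def set_release_variable(definition, name, value):
    # `definition` is the JSON body returned by the GET above; the whole
    # mutated body must be sent back, not a partial patch.
    definition.setdefault("variables", {})[name] = {"value": value}
    return definition
```

A task in the stage that produces the value would GET the definition, call set_release_variable, and PUT the result back; later stages read the refreshed definition variable.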

VSTS pass parameter in Pipeline between stages

I have two stages in my pipeline. The first one is the trigger for the second one. I want the parameter from the first stage to be an input to, or accessible in, the second stage.
Is this feasible?
EDIT
The issue is that when the parameter value is set in stage 1 as a result of ARM or script output, this value is not visible in the next stages.
Thanks
It is not possible to share a variable from stage 1 to stage 2 this way when you change the variable's value in stage 1. To achieve this, you need to persist the value in some storage, e.g. Key Vault, an Azure Function, or the VSTS API. Stage 1 and stage 2 can run on different agents. What you can do is edit the variable value for stage 2: in stage 1, add a task that persists this value on the release definition using the (VSTS/TFS/Azure DevOps) API. API for updating the release definition: https://learn.microsoft.com/en-us/rest/api/vsts/release/definitions/update?view=vsts-rest-4.1