Airflow exclude task from downstream dependency or reference job outside of subdag - triggers

I am currently trying to build a data pipeline in Airflow that has many sub-dependencies. I've created subdags (and subsubdags) to achieve the functionality that I want. One thing that I can't figure out, however, is whether I am able to reference a downstream task from within a subdag.
I've included a picture of the data pipeline: Data pipeline
Task e needs to be triggered by task c but has no downstream dependencies. I can't find a way to reference a task from a higher level within a subdag. Is there a way to do this?
As a workaround, for now I have just placed task e within the subdag with tasks a, b, c, and d. Task f should be triggered when task a, task b, task c, and task d are successful, but it should not matter whether task e has succeeded or failed. Can I set task f to have the trigger rule one_failed and specify the task_id of the accepted failure?
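For reference, here is a minimal sketch of the behaviour I'm after, flattened out of the subdags (Airflow 1.x style, since subdags are involved; the operator choice, DAG id, and task names are placeholders):

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

with DAG("pipeline_sketch", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    # Placeholder tasks; each would be a real operator (or subdag) in practice.
    task_a = DummyOperator(task_id="task_a")
    task_b = DummyOperator(task_id="task_b")
    task_c = DummyOperator(task_id="task_c")
    task_d = DummyOperator(task_id="task_d")
    task_e = DummyOperator(task_id="task_e")
    task_f = DummyOperator(task_id="task_f")

    # task_e is triggered by task_c but has no downstream dependencies.
    task_c >> task_e

    # task_f depends only on tasks a-d; trigger rules only consider a task's
    # direct upstream tasks, so task_e's outcome would not block task_f here.
    [task_a, task_b, task_c, task_d] >> task_f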
Any help is appreciated! Thanks!

Related

Can we call the pipeline in parallel and have multiple instances running?

I have a scenario where I have to execute a pipeline from different pipelines to log the validations.
So I'm planning to have a single pipeline for all the validations (like counts, duplicates, count drops, etc.), and this pipeline should be triggered when a particular table's execution completes.
For example: there are two pipelines, P1 and P2, which both invoke this validation pipeline upon completion, so there is a chance that this validation pipeline may be triggered twice at the same time.
Can we run a pipeline like this? Is any lock applied automatically?
You can reuse a pipeline that acts as a generic pipeline in other pipelines and call it in parallel; there is no locking involved.
Just make sure the generic pipeline is allowed parallel executions; otherwise runs will sit in a queue.
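If this is an Azure Data Factory pipeline (an assumption; the question doesn't name the product), the pipeline-level concurrency property is the closest thing to a lock: it caps how many runs may execute simultaneously, and runs beyond the cap are queued. A trimmed sketch of the pipeline JSON, with the activities list left empty here:

{
    "name": "ValidationPipeline",
    "properties": {
        "concurrency": 1,
        "activities": []
    }
}

With concurrency set to 1, the validation pipeline runs strictly one at a time, so simultaneous invocations from P1 and P2 would serialize rather than overlap.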

Can you control how the tasks are executed in Azure Pipelines?

I have built a pipeline with 4 tasks
Task 1 Builds a VM
Task 2 Add a Data Disk
Task 3 Add a Second Data Disk
Task 4 Add a Third Data Disk
However, if I only want Task 1 and Task 2 to execute, how can I skip Tasks 3 and 4? For example, if the user only wants one data disk. I know the tasks can be disabled manually, but is there a way to automate this based on a variable?
Every stage, job, and task has a condition property. You can use a condition expression to decide which tasks to run and when, and you can reference variables in such expressions. By promoting these variables to "settable at queue time" you can let a user control them.
Make sure you prepend each condition with succeeded() so that a task only runs when the previous steps have completed successfully.
condition: and(succeeded(), gt(variables.Disks, 2))
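For example, a minimal YAML sketch of the four tasks (the Disks variable and script bodies are illustrative; in the classic editor the same expressions go into each task's Control Options as a custom condition):

steps:
- script: echo "Build VM"
  displayName: Task 1 - Build a VM

- script: echo "Add first data disk"
  displayName: Task 2 - Add a Data Disk
  condition: and(succeeded(), ge(variables.Disks, 1))

- script: echo "Add second data disk"
  displayName: Task 3 - Add a Second Data Disk
  condition: and(succeeded(), ge(variables.Disks, 2))

- script: echo "Add third data disk"
  displayName: Task 4 - Add a Third Data Disk
  condition: and(succeeded(), ge(variables.Disks, 3))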
See:
Expressions
Specify Conditions
Define Variables - Allow at queue time

Azure DevOps - Passing Variables in release tasks

Basically I have two tasks, and I want the second task (Task B) to look at the status of the first task (Task A).
All the examples I see use YAML; within the Release section for setting up deployments, everything uses a user interface.
If I use Agent.JobStatus in Task A or Task B, it shows the job status of the job we are currently in. I figure I need to capture the value between Task A and Task B (not within either one); how does one capture that? I either can't find it, or I'm not understanding something.
I have put the following in the condition expression of Task B, hoping it captured what the last job status was, but it is null:
in(variables['Agent.JobStatus'], 'Failed', 'SucceededWithIssues')

Azure DevOps multi-stage pipeline, trigger specific stage by specific artifact change

The multi-stage pipeline looks like this:
A -> B -> C
Stage A consumes artifact a
Stage B consumes artifact b
Stage C consumes artifact c
An artifact can be a repository, a pipeline, ...
How can I trigger only Stage B when artifact b changes?
It looks like you need a kind of per-stage trigger, which is not possible to achieve at the moment. Please check this topic on the developer community.
Sorry, I should have clarified a bit more. I don't think that triggers will solve all use cases, but having triggers on stages would at least allow you to have the following:
Stage A has trigger /src/appA
Stage B has trigger /src/appB
If you committed (script, code, etc.) to /src/appB, it should use the previous artifacts and only build appB and further stages if requested.
This was mentioned in the comments and left without feedback from MS; it is not possible with existing Azure DevOps functionality.

Check for dependency between two jobs in Zeke Scheduler for Mainframe

This question is specific to Zeke Scheduler for z/OS.
Is it possible to find out if a job is indirectly dependent on another job?
I can check the direct predecessors and successors of a job. But if I want to find out whether job A is dependent on job B, I have to check all the successors of job B, and then their dependents as well, until I find job A, which is neither easy nor reliable.
I checked the manual but could not find any direct option. Please share if anyone knows of an option or a workaround.
You can use the command
PATH LEV *
to display all levels of predecessors and successors from the ZEKE online panel of the browsed event.