I have built a pipeline with 4 tasks:
Task 1 Builds a VM
Task 2 Adds a Data Disk
Task 3 Adds a Second Data Disk
Task 4 Adds a Third Data Disk
However, if the user only wants one data disk, how can I skip Tasks 3 and 4 so that only Tasks 1 and 2 execute? I know the tasks can be disabled manually, but is there a way to automate this based on a variable?
Every stage, job, and task has a condition property. You can use a condition expression to decide which tasks run and when, and you can reference variables in such expressions. By marking those variables as settable at queue time, you can let a user control them.
Make sure you prepend each condition with succeeded() so that a task only runs when the previous steps completed successfully. For example, Task 4 (the third data disk) should only run when more than two disks are requested:
condition: and(succeeded(), gt(variables.Disks, 2))
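A fuller sketch, with placeholder script steps standing in for the real VM and disk tasks, and Disks assumed to be defined as a queue-time variable in the pipeline settings:

steps:
  - script: echo "Build VM"               # Task 1: always runs
    displayName: Build VM
  - script: echo "Add first data disk"    # Task 2: always runs after Task 1
    displayName: Add data disk
  - script: echo "Add second data disk"   # Task 3: only when 2+ disks requested
    displayName: Add second data disk
    condition: and(succeeded(), gt(variables.Disks, 1))
  - script: echo "Add third data disk"    # Task 4: only when 3 disks requested
    displayName: Add third data disk
    condition: and(succeeded(), gt(variables.Disks, 2))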
See:
Expressions
Specify Conditions
Define Variables - Allow at queue time
I have a configuration file for the Azure pipeline that is scheduled through the UI to run Mon to Fri. The file has different stages and each stage calls a different template. What I want to do is run different stages/templates in different days of the week.
I tried to save different schedules through the triggers UI, but they need to be applied to the entire file.
I was also reading this https://learn.microsoft.com/en-us/azure/devops/pipelines/process/scheduled-triggers?view=azure-devops&tabs=yaml but again, the schedule would be applied to the entire file.
Is there a way to apply a different schedule to each stage?
No, there is no out-of-the-box way to do that. I think you may try to:
Divide your build into several pipelines and schedule them separately.
Or
Add a planning step that detects the day of the week and sets a variable for the later steps, like:
echo "##vso[task.setvariable variable=run.step1;]YES"
Set variables in scripts
Then use it in the conditions:
and(succeeded(), eq(variables['run.step1'], 'YES'))
Specify conditions
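Putting the two together, a minimal sketch (the Monday check, the step names, and a Linux agent for the date command are all assumptions):

steps:
  - bash: |
      # Planning step: decide which of the later steps should run today.
      if [ "`date +%u`" -eq 1 ]; then    # 1 = Monday
        echo "##vso[task.setvariable variable=run.step1]YES"
      fi
    displayName: Plan today's steps
  - script: echo "Doing Monday-only work"
    displayName: Step 1
    condition: and(succeeded(), eq(variables['run.step1'], 'YES'))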
I have a scenario where I have to execute a pipeline from several different pipelines to log the validations.
So I'm planning to have a single pipeline for all the validations (like counts, duplicates, count drops, etc.), and this pipeline should be triggered when a particular table's execution completes.
For example: there are two pipelines, P1 & P2, which both invoke this validation pipeline upon completion, so there is a chance that the validation pipeline may be triggered twice at the same time.
Can we run a pipeline like this? Will any lock be applied automatically?
You can reuse a pipeline that acts as a generic pipeline in other pipelines and call it from them in parallel; there is no locking.
Just make sure the generic pipeline is allowed to run parallel executions, otherwise later runs will wait in the queue.
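If you also want the validation pipeline to start automatically whenever P1 or P2 completes, a pipeline-completion trigger in the validation pipeline's YAML would look roughly like this (P1 and P2 are the pipeline names from the question; the aliases and the script step are assumptions):

# validation-pipeline.yml
resources:
  pipelines:
    - pipeline: upstreamP1   # alias for use within this pipeline
      source: P1             # name of the triggering pipeline
      trigger: true          # run whenever P1 completes
    - pipeline: upstreamP2
      source: P2
      trigger: true

steps:
  - script: echo "Running validations (counts, duplicates, count drops)"
    displayName: Run validations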
Basically I want two tasks, and I want the second task (Task B) to look at the status of the first task (Task A).
All the examples I see use YAML; within the Release section of setting up deployments, they all use a user interface.
If I use Agent.JobStatus in Task A or Task B, it shows the job status of the task we are currently in. I figure I need to capture the value between Task A and Task B (not within either one); how does one capture that? I either can't find it or am not understanding something.
I have put it in the agent job variable expression of Task B, hoping it gathered the last job status, but it is null.
in(variables['Agent.JobStatus'], 'Failed', 'SucceededWithIssues')
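In YAML, I believe that expression would sit as a custom condition on Task B itself, something like this (the script bodies are just placeholders):

steps:
  - script: exit 1                        # Task A: simulate a failure
    displayName: Task A
  - script: echo "Task A had problems"    # Task B: its custom condition replaces
    displayName: Task B                   # the default succeeded() check
    condition: in(variables['Agent.JobStatus'], 'Failed', 'SucceededWithIssues')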
Instead of manually redeploying a stage, I want an automatic way to redeploy it.
My stage includes some disk operations, which sometimes fail on the first attempt but usually succeed on the second attempt.
I am currently re-running the task group in another job in the same stage.
The second job executes only if the first one fails.
But this marks the stage as failed, because of the two jobs the first one has failed.
In my case, though, both jobs are the same, and I can't find a way to redeploy the same stage.
Your options are basically to keep doing what you have, or to replace the failing steps with a custom PowerShell/Bash script that knows how to retry.
Edit: You could improve your existing solution a little by putting your second attempt in the same job as your first attempt. That way you wouldn't get a failed stage. You can put conditions on steps, not just jobs.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml
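A rough sketch of that single-job layout (the disk-operation script path is a placeholder):

steps:
  - script: ./disk-operations.sh          # first attempt (placeholder script)
    displayName: Disk operations, attempt 1
    continueOnError: true                 # a failure here leaves the job status
                                          # at SucceededWithIssues, not Failed
  - script: ./disk-operations.sh          # second attempt, only after issues
    displayName: Disk operations, attempt 2
    condition: eq(variables['Agent.JobStatus'], 'SucceededWithIssues')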
Can we run the tasks of an Azure pipeline based on a dynamic condition?
I want to run only some of the listed tasks, and the choice of which tasks to run may be different each time, depending on user requirements.
For your issue, I think you can set conditions through the Control Options of each task to control whether it runs. If this does not meet your needs, you can give a specific example so that I can better understand your request.
Inside the Control Options of each task you can specify the conditions under which the task will run.
If the built-in conditions don't meet your needs, then you can specify custom conditions.
Conditions are written as expressions. The agent evaluates the expression beginning with the innermost function and works its way out. The final result is a boolean value that determines if the task should run or not. See the expressions topic for a full guide to the syntax.
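For example, a custom condition keyed to a hypothetical user-set variable named runOptionalTask could be:

and(succeeded(), eq(variables['runOptionalTask'], 'true'))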
Here is a document that provides some examples; you can refer to it.