Is there a built-in variable in Argo Workflows for previous/next scheduled time? - argo-workflows

According to the Argo Workflows docs, there is a workflow.scheduledTime built-in variable, but I found no variable for either the next or the previous scheduled time.
In Airflow for example, there are macros that can be used to get those values.
Is there any way to get those values in Argo rather than calculate them manually inside the step itself?
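As far as the documentation shows, {{workflow.scheduledTime}} (available in workflows spawned by a CronWorkflow) is the only scheduling-related built-in, so previous/next occurrences have to be derived inside the step. A minimal sketch, assuming an image with the croniter package installed and the cron expression duplicated by hand (there is no built-in variable exposing it either); names are illustrative:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: schedule-times           # hypothetical name
spec:
  schedule: "0 * * * *"
  workflowSpec:
    entrypoint: main
    templates:
      - name: main
        script:
          image: python:3.12     # assumes croniter is installed in the image
          command: [python]
          env:
            - name: SCHEDULED    # RFC3339 timestamp, e.g. 2024-01-01T10:00:00Z
              value: "{{workflow.scheduledTime}}"
            - name: CRON         # duplicated by hand; Argo has no variable for the cron spec
              value: "0 * * * *"
          source: |
            import os
            from datetime import datetime
            from croniter import croniter
            base = datetime.fromisoformat(os.environ["SCHEDULED"].replace("Z", "+00:00"))
            print("previous:", croniter(os.environ["CRON"], base).get_prev(datetime))
            print("next:",     croniter(os.environ["CRON"], base).get_next(datetime))
```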

Related

Trigger Date for reruns

My pipeline's activities need the date of the run as a parameter. Currently I get the current date in the pipeline from the utcnow() function. Ideally this would be something I could enter dynamically in the trigger, so that rerunning a failed day would set the parameter correctly; as it stands, a rerun executes my pipeline with today's date rather than the date of the failed run.
I am used to Airflow, where such things are pretty easy to do, including scheduling reruns. Perhaps I think too much in Airflow terms, but I can't wrap my head around a better solution.
In ADF, there is no direct support for passing the date on which a pipeline run failed back to the trigger.
You can get the trigger time using @pipeline().TriggerTime.
This system variable gives the time at which the trigger started the pipeline run.
You can store this trigger value for every pipeline run and use it as a parameter when rerunning the pipeline that failed.
Reference: Microsoft's documentation on system variables in ADF.
To resolve my problem I had to create a nested structure of pipelines, the top pipeline setting a variable for the date and then calling other pipelines passing that variable.
With this I still can't rerun the top pipeline, but rerunning Execute Pipeline1/2/3 reruns them with the right variable set. It is still not perfect, since the top pipeline run remains in an error state and it is difficult to keep track of what needs to be rerun; however, it is a partial solution.
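A sketch of that nested structure in ADF activity JSON (activity, pipeline, variable, and parameter names are hypothetical): the top pipeline sets a RunDate variable once, and each Execute Pipeline activity forwards it, so rerunning a child reuses the original date:

```json
[
  {
    "name": "Set RunDate",
    "type": "SetVariable",
    "typeProperties": {
      "variableName": "RunDate",
      "value": { "value": "@utcnow()", "type": "Expression" }
    }
  },
  {
    "name": "Execute Pipeline1",
    "type": "ExecutePipeline",
    "typeProperties": {
      "pipeline": { "referenceName": "ChildPipeline1", "type": "PipelineReference" },
      "parameters": {
        "runDate": { "value": "@variables('RunDate')", "type": "Expression" }
      }
    },
    "dependsOn": [ { "activity": "Set RunDate", "dependencyConditions": [ "Succeeded" ] } ]
  }
]
```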

How to provide dynamic values for approvals and checks in yaml pipelines?

I'm working on an integration between Azure Pipelines and ServiceNow's change management module. To achieve that the ServiceNow Change Management extension has been installed and configured according to this documentation page. In Azure DevOps we are using multistage yaml pipelines, which should create standard preapproved changes in ServiceNow.
The connection itself between the two applications works fine, I managed to put together a pipeline that creates change requests, waits until their status changes and then closes them. However, I'd like to pass some values set in the pipeline runs to the created change requests and I couldn't find a way to do it.
First I added a service connection to our Azure DevOps project, and created the ServiceNow check for it. I experimented a little with adding different expressions to it, like setting the short description to ${{ parameters.shortDescription }}, or defining a variable in the pipeline as ShortDescription: ${{ parameters.shortDescription }} and using that variable in the check as $(ShortDescription) or $[ variables.ShortDescription ]. Unfortunately, none of these expressions got resolved. I also realized it is possible to use the predefined variables, but the values I'd like to set cannot be expressed with predefined variables. For example, selecting an assignment group would be pretty straightforward from a parameter defined as a list, but impossible to select from predefined variables.
So as a next idea, I tried to link a variable group to the check and update the variables through logging commands. Even though the variables from the group got resolved, they only showed the static default values I had set through the UI; the dynamic values set via the logging commands were not visible. I played around for some time and verified that I can update the definition of the variable groups through Azure CLI or the REST API, so I can add new variables or update existing ones. Thus I tried to add a new variable to the linked group during the pipeline run, named ShortDescription_$(Build.BuildId). Even though it got added properly, I could not use it within the check, because that requires double variable resolution, like $(ShortDescription_$(Build.BuildId)), and this expression was not resolved, not even partly. It remained $(ShortDescription_$(Build.BuildId)).
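For reference, the per-run update described above can be done from a pipeline step with the Azure CLI azure-devops extension (the group id, variable name, and value below are placeholders); this sketch only performs the update and does not solve the double-resolution problem:

```yaml
steps:
  - script: |
      az pipelines variable-group variable update \
        --group-id 42 \
        --name ShortDescription \
        --value "Deploy of build $(Build.BuildId)" \
        --organization "$(System.CollectionUri)" \
        --project "$(System.TeamProject)"
    displayName: Update linked variable group
    env:
      # the build identity needs the Administrator role on the variable group
      AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
```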
Then I started thinking about using only one variable from the group with a static name (e.g. ShortDescription) for all pipeline runs. However, I feel it would create a race condition and could cause some inconsistencies.
So as a last resort, I tried to put together an extension with an Agent task and a ServerGate task, which are capable of storing the values I want to pass to the change request and reading the stored values in an agentless environment. The problem here is that the second task is not visible as a check for service connections. It's there as a release pipeline gate and looks good there, but I can't utilize it that way. Based on a question I found, this does not seem to be a problem specific to my task. To verify it, I copied the content of the same ServiceNow check I used before and added it to my extension as a contribution with a different task id, and it did not show up, just as the question stated.
Which means I can now either
create a change request through my custom server task (as the ServerGate task can be used properly in YAML if it is changed to a Server task), but that way I can't wait for the state change of the ServiceNow ticket, or
create the change request in a separate stage from where I want to use it, update it first in the same stage where I created it via the first-party check, and wait for the state change in the stage where I would normally create it.
The second can work, but it has its own problems, like having misleading values stored in the change request for the stage id field, or not having separate change requests created for multiple run attempts of the deployment stage. Also, I feel like it's not how the extension's task and check are meant to be used.
Unfortunately, I'm out of ideas how this dynamic value passing can be achieved, if it's possible to do so in the first place. Could you please help me by sharing ideas, or pointing out errors in my attempts?

Argo Workflow: create a conditional cyclic DAG

I want to create a conditional cyclic DAG in Argo Workflow.
My use case: I have a final human-approval step; if it is rejected, I want the workflow to go back to a specific node and rerun from there.
Why a rerun makes sense in my case: there are human inputs in the workflow, and if the final result is rejected, the users might want to change some parameters and rerun.
Why not rerun the whole workflow: some steps are expensive.
I know a cycle is, by definition, not valid in a DAG. But I haven't come up with a proper way to handle my use case in Argo Workflows. Any suggestions?
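One way to approximate this in Argo Workflows is template recursion: a steps template may invoke itself conditionally, which gives a loop even though each template body is acyclic. A sketch, assuming a suspend-based approval whose approved output parameter is supplied by the operator on resume (e.g. with argo node set); template and parameter names are illustrative:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: approval-loop-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: expensive-step
            template: expensive-step      # runs once, outside the loop
        - - name: review
            template: review-loop
    - name: review-loop                   # invokes itself while rejected
      steps:
        - - name: rerunnable-work
            template: rerunnable-work
        - - name: approval
            template: approval
        - - name: again
            template: review-loop
            when: "{{steps.approval.outputs.parameters.approved}} == false"
    - name: approval
      suspend: {}                         # paused until resumed by a human
      outputs:
        parameters:
          - name: approved
            valueFrom:
              supplied: {}                # value is set at resume time
    - name: rerunnable-work
      container:
        image: alpine:3.20
        command: [echo, "rerunnable work"]
    - name: expensive-step
      container:
        image: alpine:3.20
        command: [echo, "expensive work"]
```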

Pass Azure DevOps release pipeline (Classic editor) output variable to multiple jobs in the same stage or to multiple stages outside

I am using the release pipeline Classic editor and have a requirement to pass an output variable generated in a task to multiple jobs in the same stage, or to other stages. Currently, this output variable is available only inside the same job, so I have to repeat the same task in multiple jobs and stages, which feels redundant. Is there any way to implement this?
In the Classic editor, I am afraid that referencing output variables from a different job is not feasible. Please refer to this document.
As a workaround, you can share variables across jobs and stages via variable groups.
First define the variable in the variable group, then update the variable group through the REST API or Azure CLI, replacing the defined variable's value with the value generated by the task.
PUT https://dev.azure.com/{organization}/{project}/_apis/distributedtask/variablegroups/{groupId}?api-version=5.1-preview.1
Here is an example of updating a variable group with a PowerShell script.
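Note that this update endpoint replaces the whole group definition, so in practice the body should carry every variable in the group, not only the one being changed. A sketch of the request body (group and variable names are placeholders):

```json
{
  "name": "MyVariableGroup",
  "type": "Vsts",
  "variables": {
    "myOutputVar": { "value": "value produced by the task" },
    "otherVar":    { "value": "existing value, still required in the body" }
  }
}
```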
Another workaround: you can share values across all of the stages by using release pipeline variables. The solution is to update the release definition for the release pipeline variable in the stage where the variable is set.
Define a variable in the release definition.
Use the Definitions - Update REST API to update the value of the release definition variable in one agent job.
Use the updated value of the release definition variable in the next agent job.
For details on using the REST API to update the value of the release definition variable, you can follow this ticket.
For detailed steps and a guide, please refer to this blog.

Can I schedule a workflow in CQ5?

In CQ5, there is an option to schedule page activation on a particular date. I want to be able to do the same with a workflow: initiate/queue it today, but have it start executing its steps only on a specified date.
Is it possible to implement this via a custom workflow step, using the Workflow API? Or is there another way this could be done, e.g. using Sling Events/Scheduling?
There's a process step called AbsoluteTimeAutoAdvancer which reads a property named absoluteTime from the WorkflowData's MetaData. This is expected to be a numeric long value representing the activation time in milliseconds since the Epoch.
The trick is to set this value in the metadata. I would suggest reading the section entitled "Saving Property Values in Workflow Metadata" in the Extending Workflows documentation.