ADF - Define dynamic triggers

I created a scheduled trigger which executes an ADF pipeline in our development environment.
The trigger runs at 4AM on a daily basis.
I now want to release this process into our test ADF environment, but there the trigger should kick off at 6AM on a daily basis.
I can't find a method of dynamically changing these values based on the environment.
Has anyone come across a solution for this?

Changing trigger settings can be achieved by using the ADF utilities package and a custom parameters definition file; I included an example in this post: https://www.mutazag.com/blog/code/tutorial/ADF-custom-paramaters/#changing-trigger-settings
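As a minimal sketch of what that definitions file can contain (assuming a schedule trigger whose recurrence carries the start time; the wildcard parameterizes every recurrence property, keeping the current values as defaults):

```json
{
    "Microsoft.DataFactory/factories/triggers": {
        "properties": {
            "typeProperties": {
                "recurrence": {
                    "*": "="
                }
            }
        }
    }
}
```

Each parameterized property then surfaces in the generated ARM template parameters file, so the test environment can supply a 6AM start time while development keeps 4AM.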

Related

Azure Data Factory: event-based trigger file to execute pipelines in two different ADF environments

I would like to take your advice and approach: how do we implement this when we have two different ADFs in different subscriptions or environments? We decided to use a trigger-file approach to identify when the pipeline in the first environment (ADF-A) has completed, so that the pipeline in the ADF-B environment is started automatically. This is the goal of our project's solution. Could someone please advise on the best-practice approach, and the components needed, to implement it?
I would really appreciate your help; it will let me learn further on this platform.
Thanks in advance.
If your requirement is to trigger PipelineA when a new file arrives (using an event trigger) and then, once the PipelineA run has completed successfully, initiate a PipelineB run, you can use a Web activity to call the REST API and initiate the pipeline run.
Approach 1:
To do this, at the end of all activities in your PipelineA, add a Web activity that makes a call to your PipelineB using the REST API - Pipelines - Create Run.
Here is an article by a community volunteer on how to use the REST API to trigger a pipeline run - Execute Azure ADF Pipeline using REST API.
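As a rough sketch of the call that Web activity would make, expressed in PowerShell (all names are placeholders; inside ADF you would authenticate with the factory's managed identity instead of a hard-coded token):

```powershell
# POST to the ADF REST API "Pipelines - Create Run" endpoint.
$token = "<bearer-token>"
$uri = "https://management.azure.com/subscriptions/<subscription-id>" +
       "/resourceGroups/<resource-group>/providers/Microsoft.DataFactory" +
       "/factories/<adf-b-factory>/pipelines/<pipeline-b>/createRun?api-version=2018-06-01"

$run = Invoke-RestMethod -Uri $uri -Method Post -Body "{}" `
       -ContentType "application/json" `
       -Headers @{ Authorization = "Bearer $token" }
$run.runId    # ID of the pipeline run that was just started
```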
Approach 2:
The second approach: just before the end of your PipelineA, write a dummy file to a blob location, and create an event trigger for your PipelineB that watches for file creation in that same location. As soon as PipelineA writes the file, PipelineB starts executing.
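Within PipelineA the dummy file would typically be written by a Copy activity; purely as an illustration of the marker-file idea, here is the equivalent in PowerShell (storage account and container names are placeholders):

```powershell
# Drop an empty marker blob into the location PipelineB's event trigger watches.
# Assumes a prior Connect-AzAccount; -UseConnectedAccount reuses that login.
$ctx = New-AzStorageContext -StorageAccountName "<storage-account>" -UseConnectedAccount
New-Item -ItemType File -Path "pipelineA.done" -Force | Out-Null
Set-AzStorageBlobContent -Context $ctx -Container "triggers" `
    -File "pipelineA.done" -Blob "markers/pipelineA.done" -Force
```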

Access lastScheduledTime from cron workflow

I'm trying to implement automatic backfills in Argo workflows, and one of the last pieces of the puzzle I'm missing is how to access the lastScheduledTime field from my workflow template.
I see that it's part of the template, and I see it getting updated each time a workflow is scheduled, but I can't find a way to access it from my template to calculate how many executions I might have missed since the last time the scheduler was online.
Is this possible? Or maybe, is this the best way to implement this functionality on Argo?
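For what it's worth, the field lives on the CronWorkflow's status rather than being exposed as a template variable, so from outside the workflow it can at least be read directly (the resource name is a placeholder):

```shell
kubectl get cronworkflow my-cron-wf -o jsonpath='{.status.lastScheduledTime}'
```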

Run Azure Release Pipeline stage if one of many dependent stages runs

We have a rather large deployment surface, say 10 apps that get deployed. For patch releases we sometimes deploy only one app, and I'd like to have a stage run either after all 10 are deployed or if only one is deployed. In a simplified graph, three app stages all feed into a "Do Something" stage. That "Do Something" stage will only run if all three app stages run, and I don't want to have to duplicate it for each app, so I'm looking for a better way. I guess I could live with it if it just ran once on any successful dependent stage (it doesn't need to wait for all of them).
I think there are a couple of options for this. You will need to look at YAML multi-stage pipelines, specifically deployment jobs.
First, depending on the complexity of the "Do Something" stage, it could be a job template loaded into each of the app stages. I realize you mentioned you don't want it to run every time, so this is just an option.
The second option: if the pipeline is run ad hoc, you have the ability to select which stages run, so you could manually select the app stages to run along with the "Do Something" stage. The hard part will be working out the dependencies; a sketch follows below.
I'd assume you'd want "Do Something" to depend on the success of at least one of the previous stages.
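As a rough sketch of those dependencies (stage and job names are placeholders; this assumes the stage-level `dependencies.<stage>.result` syntax of multi-stage YAML, where a manually deselected stage reports the result 'Skipped'):

```yaml
stages:
- stage: App1
  jobs:
  - job: DeployApp1
    steps:
    - script: echo deploying app 1
- stage: App2
  jobs:
  - job: DeployApp2
    steps:
    - script: echo deploying app 2
- stage: DoSomething
  dependsOn: [App1, App2]
  # Run when every app stage succeeded or was skipped, and at least one ran.
  condition: and(in(dependencies.App1.result, 'Succeeded', 'Skipped'), in(dependencies.App2.result, 'Succeeded', 'Skipped'), or(eq(dependencies.App1.result, 'Succeeded'), eq(dependencies.App2.result, 'Succeeded')))
  jobs:
  - job: Finalize
    steps:
    - script: echo do something
```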
Run Azure Release Pipeline stage if one of many dependent stages runs
I am afraid there is no out-of-the-box way to achieve this at the moment.
That is because there is no "OR" syntax for dependsOn, and we cannot add a condition to dependsOn.
You could submit this suggestion (an "OR" condition for dependencies) on our Developer Community site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21 ), which is our main forum for product suggestions. Thank you for helping us build a better Azure DevOps.
As a workaround:
The main idea of the solution: set dependsOn for the stage Secure to [], then add an inline PowerShell task before the other tasks. This task calls the REST API Definitions - Get to check whether any other stage in the current release pipeline is in an in-progress or queued state. If so, it waits 30 seconds and loops again, until no other stage in the current release pipeline is in progress or queued; only then do the remaining tasks execute.
You can check my previous ticket for detailed info.
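A rough sketch of that polling task in PowerShell (this sketch uses the Releases - Get endpoint, which returns per-stage statuses, rather than Definitions - Get; "Secure" is the stage name from the description above, and the job needs "Allow scripts to access the OAuth token" enabled):

```powershell
# Wait until no other stage in this release is in progress or queued.
$headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
$url = "$(System.TeamFoundationServerUri)$(System.TeamProject)" +
       "/_apis/release/releases/$(Release.ReleaseId)?api-version=7.0"

do {
    $release = Invoke-RestMethod -Uri $url -Headers $headers -Method Get
    # Stages (environments) other than this one that are still active.
    $active = @($release.environments | Where-Object {
        $_.name -ne "Secure" -and $_.status -in @("inProgress", "queued")
    })
    if ($active.Count -gt 0) { Start-Sleep -Seconds 30 }
} while ($active.Count -gt 0)
```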

Can we prevent Azure LogicApp trigger from firing when deploying

I've created a script to create new Logic Apps using PowerShell and deploy them to my Azure resource group. Each uses a schedule recurrence trigger and calls an action from that trigger.
The script works and gets the job done (I can create and update my Logic App). My issue, however, is that the Logic App is triggered every time I do an update, in addition to firing on the recurrence trigger.
Among the things I've tried: disabling, updating, then re-enabling, but sadly it triggers as soon as it is re-enabled.
The reason behind this is that I want to run the action at a specific time only, which doesn't match the moment I'm deploying.
Is there a way to prevent the Logic App from triggering?
If it helps, I am using the New-AzResourceGroupDeployment cmdlet. I've browsed the documentation, but there doesn't seem to be any mention of how to control whether it runs immediately or not.
If you set the Start time parameter on the recurrence trigger, that should resolve your issue:
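For illustration, in the workflow definition that means giving the Recurrence trigger a startTime in the future (the timestamp below is a placeholder); the trigger then waits for that moment instead of firing as soon as the deployment lands:

```json
{
    "triggers": {
        "Recurrence": {
            "type": "Recurrence",
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2030-01-01T06:00:00Z"
            }
        }
    }
}
```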

Can a Self Hosted Agent be set as enabled at certain times?

In Azure DevOps, can an agent be set as enabled or disabled at certain times? This is with self-hosted agents. I know there is a toggle in the UI to turn agents on and off, but can enabling and disabling be scheduled? I'd like to add a user's machine at the weekend, for example, to help speed up the tests that are run, but obviously we don't want it running tests while the machine is in use Monday to Friday.
There's no option to enable/disable agents by schedule. However, you can try to achieve a similar effect with this approach:
Author a new pipeline with a single agentless job. It can invoke the Azure DevOps REST API to change the status of the agent (enabled/disabled). You may find this answer useful as a starting point.
Define the scheduled trigger for the times you'd like your agent to be enabled or disabled; a sketch of the whole pipeline follows below.
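A hedged YAML sketch of such a pipeline (the pool ID, agent ID, and the generic service connection name are placeholders; it assumes the distributedtask Agents - Update REST endpoint):

```yaml
# Enables the agent every Saturday morning; a mirror schedule or a second
# pipeline sending "enabled": false would disable it again on Monday.
schedules:
- cron: "0 6 * * 6"          # Saturdays, 06:00 UTC
  displayName: Enable weekend agent
  branches:
    include: [main]
  always: true

jobs:
- job: EnableAgent
  pool: server               # agentless (server) job, consumes no agent
  steps:
  - task: InvokeRESTAPI@1
    displayName: Enable the self-hosted agent
    inputs:
      connectionType: connectedServiceName
      serviceConnection: AzureDevOpsRest   # generic connection to https://dev.azure.com/<org>/
      method: PATCH
      urlSuffix: _apis/distributedtask/pools/<poolId>/agents/<agentId>?api-version=7.0
      headers: '{ "Content-Type": "application/json" }'
      body: '{ "id": <agentId>, "enabled": true }'
```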