Azure Data Factory: event-based trigger file to execute pipelines in two different ADF environments

I would like your advice on an approach. We have two Azure Data Factory instances in different subscriptions/environments. We have decided to use a trigger-file approach to detect that the pipeline in the first environment (ADF-A) has completed, so that we can automatically start/trigger the pipeline in the second environment (ADF-B). This is the goal for our project. Could someone please suggest the best-practice approach and the components needed to implement this solution?
I would really appreciate your help as I learn more about this platform.
Thanks in advance.

If your requirement is to trigger PipelineA when a new file arrives (using an event trigger), and then initiate a PipelineB run once the PipelineA run completes successfully, you can utilize the REST API from a Web activity to initiate the second pipeline run.
Approach 1:
To do this, at the end of all activities in your PipelineA, add a Web activity that makes a call to your PipelineB using the REST API - Pipelines - Create Run.
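The REST call the Web activity makes can be sketched as below. The subscription, resource group, factory, and pipeline names are placeholders, not values from the question; in the Web activity you would supply the real ones for ADF-B.

```python
# Sketch: building the "Pipelines - Create Run" management-plane URL that the
# Web activity at the end of PipelineA would POST to in order to start
# PipelineB in the other factory. All names below are placeholders.

def build_create_run_url(subscription_id, resource_group, factory_name,
                         pipeline_name, api_version="2018-06-01"):
    """Return the ADF REST API URL for triggering a pipeline run."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}"
        f"/pipelines/{pipeline_name}/createRun"
        f"?api-version={api_version}"
    )

url = build_create_run_url("sub-id", "rg-b", "adf-b", "PipelineB")
# In the Web activity: Method = POST, URL = the value above, and an access
# token (managed identity or service principal) that has rights on ADF-B.
```

Since the two factories are in different subscriptions, the identity used by ADF-A's Web activity needs permission (for example, Data Factory Contributor) on ADF-B.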
Here is an article by a community volunteer on how to use REST API to trigger a pipeline run - Execute Azure ADF Pipeline using REST API
Approach 2:
The second approach: just before the end of your PipelineA, write a dummy file to a blob location, and create an event trigger for your PipelineB that watches for file creation in that same location. As soon as the file is created, PipelineB starts executing.
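The event trigger on the ADF-B side can be sketched as the JSON body you would author (or PUT via the REST API). The container, path, and resource names are placeholders for wherever PipelineA drops its dummy "done" file.

```python
# Sketch of a BlobEventsTrigger definition for PipelineB, expressed as a dict
# mirroring the trigger JSON. Paths and resource IDs are placeholders.
trigger = {
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            # Fire only for the dummy file PipelineA writes on completion.
            "blobPathBeginsWith": "/handoff/blobs/pipelinea-done",
            "blobPathEndsWith": ".trigger",
            "ignoreEmptyBlobs": False,
            "events": ["Microsoft.Storage.BlobCreated"],
            "scope": ("/subscriptions/<sub-id>/resourceGroups/<rg>"
                      "/providers/Microsoft.Storage/storageAccounts/<account>"),
        },
        "pipelines": [
            {"pipelineReference": {"referenceName": "PipelineB",
                                   "type": "PipelineReference"}}
        ],
    }
}
```

The storage account in the trigger's scope must be reachable from ADF-B's subscription, so a shared storage account (or one that both environments have access to) is the usual choice for the handoff location.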

Related

How to Split Flow in Power Automate

I use Power Automate and Azure Boards, and I have created 7 cloud flows. Now, when I create an initiative in Azure Boards, all the flows fire at the same time and I get many child tasks. How do I set up my flows so that only the first one runs "on create", and the rest run on a transition between states (status change)?
This is my flow:
Based on your requirement, you need to modify the trigger of your Power Automate flow.
How do I set up my flows so that only the first should be "on create"
You can set the trigger: When a work item is created
For example:
In this case, the flow will be triggered when a work item is created.
the rest should be on transition between flow (status change) ?
You can set the trigger: When a work item is updated
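To keep the remaining flows from firing on every update, you can also add a trigger condition (trigger Settings → Trigger conditions) so a flow only runs when the work item has reached a particular state. A hedged sketch, assuming the default Azure DevOps connector output; the exact field path and state name may differ in your setup:

```
@equals(triggerOutputs()?['body/fields/System_State'], 'Active')
```

With a condition like this on each "When a work item is updated" flow, each one fires only for the state it cares about rather than on every change.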

ADF- Define dynamic triggers

I created a scheduled trigger which executes an ADF pipeline in our development environment.
The trigger runs at 4AM on a daily basis.
I now want to release this process into our test ADF environment but the trigger should kick of at 6AM on a daily basis.
I can't find a method of dynamically changing these values based on the environment.
Has anyone come across a solution for this?
Changing trigger settings can be achieved by using ADF utilities and a custom parameters definition file. I included an example in this post: https://www.mutazag.com/blog/code/tutorial/ADF-custom-paramaters/#changing-trigger-settings
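The mechanism behind this is the arm-template-parameters-definition.json file, which controls what ADF's ARM-template export parameterizes. A minimal sketch (assuming a ScheduleTrigger; the exact property paths may need adjusting for your trigger type) that surfaces the trigger's start time as a deployment parameter, so dev can deploy with 4AM and test with 6AM:

```json
{
    "Microsoft.DataFactory/factories/triggers": {
        "properties": {
            "typeProperties": {
                "recurrence": {
                    "startTime": "="
                }
            }
        }
    }
}
```

The `"="` tells the export to turn that property into an ARM template parameter (keeping the current value as the default), which your release pipeline can then override per environment.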

How can I create a Change Request or Incidents in service now using Java or Xquery or any other script?

I want to create change request using scripts and not from the GUI page. How can I achieve that ? Also, there is a single sign on check of my organization too.
A few options:
Create a Scripted REST API - this exposes a URL which can be called by an external system. Tricky if you are new to ServiceNow.
As #Rafay suggests, set up a scheduled event. Check out https://community.servicenow.com/community?id=community_question&sys_id=b32c0765db9cdbc01dcaf3231f961984
Scheduled jobs can run scripts, generate reports, or create records from templates.
Templates can create tasks without any coding - this may be easier if it meets your requirements. They can be created via the UI and then scheduled (e.g. a regular server-patching activity).
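Another scripted route is ServiceNow's Table API, which lets any HTTP client create a change request. The sketch below only builds the request; the instance name and field values are placeholders, and with your organization's SSO you would typically authenticate as a local integration user or with an OAuth token rather than your SSO login.

```python
# Sketch: creating a change_request record via the ServiceNow Table API
# instead of the GUI. Instance name and payload values are placeholders.
import json

def build_change_request(instance, short_description, description=""):
    """Return the (url, headers, body) for a Table API POST."""
    url = f"https://{instance}.service-now.com/api/now/table/change_request"
    headers = {"Content-Type": "application/json",
               "Accept": "application/json"}
    body = json.dumps({"short_description": short_description,
                       "description": description})
    return url, headers, body

url, headers, body = build_change_request("dev12345", "Patch app servers")
# Send with any HTTP client, e.g.:
#   requests.post(url, auth=("integration.user", "password"),
#                 headers=headers, data=body)
```

The same pattern works from Java (or anything that can POST JSON), which may be simpler than building a Scripted REST API if the out-of-the-box table is all you need.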

How do I retrieve my custom variables from a Bamboo Atlassian Build Plan via REST API

I have a bamboo plan that runs on every commit to a github pull request. In that bamboo plan there are a few custom variables on it such as Git Sha, Github Pull Request Number, etc.
I want to write a script that stops all previous builds (multiple concurrent builds) that have the same pull request number -- same custom variable value.
The reason for this is that if someone makes a quick change to their pull request (comments on the review, etc) that we don't have multiple builds running when only the last one is necessary.
I know it is possible to stop a build with a rest request, but I need a way to be able to get all running builds with custom variable value = 27 (pull request number). Once I know this, I can proceed.
At the time of writing, the REST API documentation doesn't list any method of querying the running builds for a particular build variable.
A solution would be to create your own plugin for Bamboo that exposes a REST service that does this query for you, but I don't know which of the Java APIs you would need to use in order to perform that query.
Here is how I solved this ...
You can call /rest/api/latest/result/<plankey>-latest?includeAllStates=true&expand=variables where plankey is the key for the specific Bamboo build plan.
You then loop through the results you get back, looking for a lifeCycleState value that is not Finished, and a custom variable with the desired name to see if it matches the PR number you have.
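The filtering step above can be sketched as follows, operating on the parsed JSON from the `includeAllStates=true&expand=variables` call. The variable name `pullRequestNumber` is a placeholder for whatever your plan actually calls it.

```python
# Sketch: find still-running builds whose custom variable matches a PR number,
# so they can then be stopped via the REST API. The response shape mirrors
# Bamboo's result list; the variable name is a placeholder.

def builds_to_stop(results, pr_number, var_name="pullRequestNumber"):
    """Return build result keys that are not Finished and match pr_number."""
    stale = []
    for result in results:
        if result.get("lifeCycleState") == "Finished":
            continue  # completed builds don't need stopping
        variables = result.get("variables", {}).get("variable", [])
        values = {v["name"]: v["value"] for v in variables}
        if values.get(var_name) == str(pr_number):
            stale.append(result["buildResultKey"])
    return stale

# Example shaped like the REST response's result entries:
sample = [
    {"buildResultKey": "PROJ-PLAN-41", "lifeCycleState": "InProgress",
     "variables": {"variable": [{"name": "pullRequestNumber", "value": "27"}]}},
    {"buildResultKey": "PROJ-PLAN-40", "lifeCycleState": "Finished",
     "variables": {"variable": [{"name": "pullRequestNumber", "value": "27"}]}},
]
```

Each key returned can then be fed to Bamboo's stop-build REST endpoint, leaving only the latest build for that pull request running.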

WCF WorkFlow Service

I am working on a WCF Workflow Service application. I have two ReceiveAndSendReply activities in a sequence, and I have set CanCreateInstance to true on both. Now I am not able to access the second ReceiveAndSendReply activity. I know this is because they need to execute sequentially, right? So how can I design the WCF service methods so that any of them can be called at any time? I think that can be achieved by creating a state machine workflow, but how do I create one in a WCF workflow service? Please suggest the best ways to do this.
You can put each ReceiveAndSendReply in a Pick branch. You can have several branches, each with its own ReceiveAndSendReply activity. But if you have many more service calls into your workflow, then the recommended way is to use a state machine.
Alternatively, create a Parallel activity, add one branch for every ReceiveAndSendReply activity, and set CanCreateInstance to true on each.