Pipeline run check from different ADF - azure-data-factory

We need to make sure that the scenarios below work:
same subscription, different Azure Data Factory
different subscription, different Azure Data Factory
Please provide the pros and cons of each.

As long as all the subscriptions are within the same tenant, it won't matter whether the ADF pipeline is in the same subscription or not. The process to get the status remains the same.
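Either way, the status check is the same ARM REST call; a minimal sketch (the subscription, resource group, and factory names are placeholders, and it assumes an AAD token whose identity has at least Reader access on the target factory):

```python
# Sketch: query a pipeline run's status in a factory that may live in a
# different subscription than the caller. All names below are placeholders.

def pipeline_run_url(subscription_id, resource_group, factory, run_id,
                     api_version="2018-06-01"):
    """Build the ARM URL for the Pipeline Runs - Get operation."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}"
        f"/pipelineruns/{run_id}"
        f"?api-version={api_version}"
    )

# To actually call it (left commented out; requires a bearer token for the
# same tenant, regardless of which subscription the factory is in):
# import json, urllib.request
# req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
# status = json.load(urllib.request.urlopen(req))["status"]
```

Because the URL carries its own subscription ID, nothing changes between the two scenarios except the values you plug in.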

Related

Understanding the job request returned information

I've created a function that returns the JobRequests of agents' jobs through the REST API.
I'm wondering about some jobs on one agent that are started at the same time but are mapped each time to different builds (and when I click on such a job in the Azure DevOps Web UI, I get "Build is not found").
I've noticed that all these job requests share the same parameters:
serviceOwner : 00025394-6065-48ca-87d9-7f5672854ef7
hostId : ffaea179-06ce-45ea-afcd-fde8cd8d156f
scopeId : 6a21ca14-654d-492e-8c3b-83fa3a910dc6
Can somebody explain what these parameters mean (or point to documentation where they are explained)?
And how can I get the real Azure DevOps objects using these IDs?
Thank you!
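For context, a job-request listing like the one described can be fetched from the agent pool's jobrequests endpoint; a minimal sketch (the organization name and pool ID are placeholders, and PAT-based basic auth is an assumption about how the function authenticates):

```python
import base64

def job_requests_url(organization, pool_id, api_version="6.0"):
    """URL for listing an agent pool's job requests; each returned entry
    carries fields such as serviceOwner, hostId and scopeId."""
    return (f"https://dev.azure.com/{organization}"
            f"/_apis/distributedtask/pools/{pool_id}/jobrequests"
            f"?api-version={api_version}")

def pat_header(pat):
    """Basic-auth header built from a personal access token
    (empty username, PAT as password)."""
    token = base64.b64encode(f":{pat}".encode()).decode()
    return {"Authorization": f"Basic {token}"}
```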

Prevent users from creating new work items in Azure DevOps

I've been looking at organisation and project settings but I can't see a setting that would prevent users from creating work items in an Azure DevOps project.
I have a number of users who refuse to follow the guidelines we set out for our projects, so I'd like to inconvenience them (and the wider project team) enough that they find it better to follow the guidelines than not. At the moment we've got one-word user stories and/or tasks with estimates of 60-70 hours, which isn't reflective of the way we should be planning.
I'd still want them to be able to edit the stories or tasks and move statuses, but that initial creation should be off-limits for them (for a time at least). Is there a way to do this?
The Azure DevOps Aggregator project allows you to write simple scripts that get triggered when a work item is created or updated. It uses a service hook to trigger when such an event occurs and abstracts most of the API specific stuff away, providing you with an instance of the work item to directly interact with.
You can't block the creation or update with such a policy; Azure DevOps informs the aggregator too late in the creation process to do so. But you can revert changes, close the work item, etc. There are also a few utility functions to send email.
You need to install the aggregator somewhere; it can be hosted in Azure Functions, and we provide a Docker container you can spin up anywhere you want. Then link it to Azure DevOps using a PAT with sufficient permissions and write your first policy.
A few sample rules can be found in the aggregator docs.
store.DeleteWorkItem(self);
should put the work item in the Recycle Bin in Azure DevOps. You can create a code snippet around it that checks the creator of the work item (self.CreatedBy.Id) against a list of known bad identities.
Be mindful that when Azure DevOps creates a new work item, the Created and Updated events may fire in rapid succession (this is caused by the mechanism that sets the backlog order on work items), so you may need to work out what metadata tells you a work item should be deleted. I generally check for a low revision number (say, < 5) and that the last few revisions didn't change any field other than Backlog Priority.
I'd still want them to be able to edit the stories or tasks and move statuses, but that initial creation should be off-limits for them (for a time at least). Is there a way to do this?
I'm afraid there is no such out-of-the-box setting to do this.
That's because the current permission settings for work items have not yet been subdivided finely enough to cover this scenario.
There is a related setting:
Project Settings -> Team configuration -> Area -> Security:
Set this value to Deny; it will prevent users from creating new work items, but it will also prevent them from modifying existing work items.
For your scenario, you could add a feature request on our Developer Community site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21 ), which is our main forum for product suggestions.

How to get sprint wise automated test cases in Azure devops system?

I need some automation metrics, such as which test cases were automated in a specific time period in ADS (Azure DevOps Services). I tried the query below.
The challenge with this query is that it also retrieves test cases that were automated before this period, because other attributes of theirs were changed (Title, some tags, etc.). I was looking for a field that stores the history of the automation information (Automated Test Id, Automated Test Name, Automated Test Storage, Automation status), but I didn't find one. I tried the "History" field, but that didn't work, as the automation information is not recorded there.
Any thoughts please?
At the moment, we cannot directly query the History for changes from one state to another in the Web UI.
As a workaround, we could add a custom field in our custom process.
Then add a rule that sets the value of the custom field to True when a work item's state changes from X to Y.
Finally, we can query work items by the custom field's value.
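Once the rule populates the custom field, it can be queried via WIQL, either in the UI or through the REST API; a minimal sketch (the field reference name Custom.StateChangedXToY is a hypothetical placeholder for whatever you name the field in your process):

```python
import json

def wiql_payload(field_ref="Custom.StateChangedXToY"):
    """Build the JSON body for a WIQL query, to be POSTed to
    https://dev.azure.com/{org}/{project}/_apis/wit/wiql?api-version=6.0.
    The field reference name is a placeholder for the custom field."""
    query = (
        "SELECT [System.Id], [System.Title] "
        "FROM WorkItems "
        f"WHERE [{field_ref}] = 'True' "
        "AND [System.WorkItemType] = 'Test Case'"
    )
    return json.dumps({"query": query})
```

Adding a date clause on System.ChangedDate would then narrow it to the sprint window you care about.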
You could check this thread for some more details.
Besides, you can also try doing it through Power BI: Calculate time-in-state for an existing Analytics view.

How to invoke iteration API query in Azure DevOps Services

I am trying to invoke the REST API below to get the list of work items mapped to a specific iteration.
https://learn.microsoft.com/en-us/rest/api/azure/devops/work/iterations/get%20iteration%20work%20items?view=azure-devops-rest-5.1
Ultimately I want to invoke this as part of the Approval step in my Release pipeline so that the approver can verify if all work items for a given iteration are in Completed state.
I have two questions:
I don't know how to get the values for {team} and {iterationid}.
How do I invoke this API as part of the approval gate? Should I use a Generic service connection? What username and password do I need to provide?
Any working example here will be really helpful.
To get teams, you can use Teams - Get Teams on Team Project.
Then you can find all team iterations and their IDs with Iterations - List.
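Chaining those two lookups with the original call, the three URLs look roughly like this (a sketch with placeholder names; authentication, e.g. a PAT, still has to be supplied on each request):

```python
# Sketch of the three lookups: teams -> team iterations -> iteration work
# items. Organization, project and team names below are placeholders.

def teams_url(org, project, api="5.1"):
    """Teams - Get Teams on Team Project."""
    return (f"https://dev.azure.com/{org}"
            f"/_apis/projects/{project}/teams?api-version={api}")

def iterations_url(org, project, team, api="5.1"):
    """Iterations - List; each entry's 'id' is the {iterationid}."""
    return (f"https://dev.azure.com/{org}/{project}/{team}"
            f"/_apis/work/teamsettings/iterations?api-version={api}")

def iteration_work_items_url(org, project, team, iteration_id, api="5.1"):
    """Iterations - Get Iteration Work Items, the call from the question."""
    return (f"https://dev.azure.com/{org}/{project}/{team}"
            f"/_apis/work/teamsettings/iterations/{iteration_id}"
            f"/workitems?api-version={api}")
```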
You can use gates with an Azure Function. Here is an example: Azure DevOps release gates with Azure Functions, PowerShell and VS Code
Additionally, you may consider using quality gates with a work item query: Use approvals and gates to control your deployment

How to copy files and folder from one ADLS to another one on different subscription?

I need to be able to copy files and folders from one Data Lake to another Data Lake in a different subscription. I'm in possession of both the auth token and the secret key.
I've tried different solution including:
https://medium.com/azure-data-lake/connecting-your-own-hadoop-or-spark-to-azure-data-lake-store-93d426d6a5f4
which involves Hadoop, but that didn't work across two different subscriptions, because core-site.xml only accepts one subscription.
AdlCopy didn't work either, nor did Data Factory.
Any ideas? Can anyone point me in the right direction?
Azure Data Factory supports this scenario. You need to create two AzureDataLakeStoreLinkedService entries, each with the correct corresponding subscription and resource group (they don't necessarily have to be the same as the data factory's subscription), and the credentials of a service principal with access to the ADLS.
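A sketch of one of the two linked service definitions, assuming a Data Factory v2 JSON layout; every name and ID here is a placeholder, and a second definition would point at the other subscription's store:

```json
{
  "name": "SourceDataLakeLinkedService",
  "properties": {
    "type": "AzureDataLakeStore",
    "typeProperties": {
      "dataLakeStoreUri": "https://sourceadls.azuredatalakestore.net/webhdfs/v1",
      "servicePrincipalId": "<source-sp-app-id>",
      "servicePrincipalKey": { "type": "SecureString", "value": "<source-sp-key>" },
      "tenant": "<tenant-id>",
      "subscriptionId": "<source-subscription-id>",
      "resourceGroupName": "<source-resource-group>"
    }
  }
}
```

The subscriptionId and resourceGroupName properties are what let each linked service target a store outside the factory's own subscription.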
If this doesn't answer your question, could you tell us more about your scenario? I don't understand this: "I'm trying to add both secret key and tokens of the two different subscriptions in the core-site.xml" — do you mean