Azure Pipeline Trigger - triggers

I've recently built a pipeline in ADF. I placed a trigger that launches it every day at a specific time.
But if my computer is off, the pipeline doesn't seem to trigger. Is it necessary to keep the computer on, given that everything happens in the cloud? If so, can you advise an alternative?

Is it necessary to keep the computer on
No, this is unnecessary. Once you have published the data factory, it has no dependency on your own machine; schedule triggers run entirely in the Azure cloud.
Refresh your page and check whether the trigger still exists in the data factory and is in the Started state.
If it still doesn't work, contact Microsoft technical support.
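If you want to double-check from any machine that the trigger lives in the service and is firing, you can query it with the Az.DataFactory PowerShell module. A minimal sketch; the resource group, factory, and trigger names below are placeholders:

# Confirm the trigger exists and that its runtime state is Started
Get-AzDataFactoryV2Trigger -ResourceGroupName "my-rg" -DataFactoryName "my-adf" -Name "DailyTrigger"

# List the trigger runs from the last two days; these happen server-side,
# whether or not your own computer was on
Get-AzDataFactoryV2TriggerRun -ResourceGroupName "my-rg" -DataFactoryName "my-adf" `
  -TriggerName "DailyTrigger" `
  -TriggerRunStartedAfter (Get-Date).AddDays(-2) `
  -TriggerRunStartedBefore (Get-Date)

Note that a trigger only fires after it has been published and started; a trigger that exists but is in the Stopped state will never launch the pipeline.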

Related

Can we prevent Azure LogicApp trigger from firing when deploying

I've created a script to create new LogicApps using PowerShell and deploy them to my Azure ResourceGroup. It uses a schedule recurrence and calls an action on these specific triggers.
The script works and gets the job done (I get to create and update my LogicApp). My issue, however, is that the LogicApp is triggered every time I do an update, in addition to the runs from the recurrence trigger.
Among things I've tried, I've attempted to disable, update then re-enable, but sadly it just triggers as soon as it is enabled back.
The reason this matters is that I want to run the action at a specific time only, and that doesn't match the moment I'm deploying it.
Is there a way to prevent the Logic App from triggering?
If it helps, I am using the New-AzResourceGroupDeployment cmdlet. I've browsed the documentation, but there doesn't seem to be any mention of whether the trigger runs immediately on deployment or not.
If you set the Start time parameter on the recurrence trigger, that should resolve your issue: with a start time in the future, the trigger waits for its first scheduled occurrence instead of firing as soon as it is enabled or redeployed.
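A minimal sketch of the idea in PowerShell, assuming (hypothetically) that your template exposes a triggerStartTime parameter wired to the Recurrence trigger's startTime property; the resource group and template path are placeholders:

# Compute a start time in the future (here: 02:00 UTC tomorrow) so the
# recurrence trigger waits for its first scheduled occurrence instead of
# firing as soon as the deployment enables it.
$startTime = (Get-Date).ToUniversalTime().Date.AddDays(1).AddHours(2).ToString("yyyy-MM-ddTHH:mm:ssZ")

New-AzResourceGroupDeployment `
  -ResourceGroupName "my-rg" `
  -TemplateFile ".\logicapp.template.json" `
  -TemplateParameterObject @{ triggerStartTime = $startTime }

With a future startTime, the deployment itself no longer causes a run; the first run happens at startTime and subsequent runs follow the recurrence.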

Prevent users from creating new work items in Azure DevOps

I've been looking at organisation and project settings but I can't see a setting that would prevent users from creating work items in an Azure DevOps project.
I have a number of users who refuse to follow the guidelines we set out for our projects, so I'd like to inconvenience them and the wider project team so that they find it better to follow the guidelines than not. At the moment we've got one-word user stories and/or tasks with estimates of 60-70 hours, which isn't reflective of the way we should be planning.
I'd still want them to be able to edit the stories or tasks and move statuses, but that initial creation should be off-limits for them (for a time at least). Is there a way to do this?
The Azure DevOps Aggregator project allows you to write simple scripts that get triggered when a work item is created or updated. It uses a service hook to trigger when such an event occurs and abstracts most of the API specific stuff away, providing you with an instance of the work item to directly interact with.
You can't block the creation or update with such a policy (Azure DevOps informs the aggregator too late in the creation process to do so), but you can revert changes, close the work item, etc. There are also a few utility functions to send email.
You need to install the aggregator somewhere; it can be hosted in Azure Functions, and we provide a Docker container you can spin up anywhere you want. Then link it to Azure DevOps using a PAT with sufficient permissions and write your first policy.
A few sample rules can be found in the aggregator docs.
// 'self' is the work item instance that triggered the rule
store.DeleteWorkItem(self);
should put the work item in the Recycle Bin in Azure DevOps. You can create a code snippet around it that checks the creator of the work item (self.CreatedBy.Id) against a list of known bad identities.
Be mindful that when Azure DevOps creates a new work item, the Created and Updated events may fire in rapid succession (this is caused by the mechanism that sets the backlog order on work items), so you may need to work out which metadata tells you a work item should be deleted. I generally check for a low Revision number (say, < 5) and that the last few revisions didn't change any field other than Backlog Priority.
I'd still want them to be able to edit the stories or tasks and move statuses, but that initial creation should be off-limits for them (for a time at least). Is there a way to do this?
I am afraid there is no such out-of-the-box setting.
That is because the current permission settings for work items have not yet been subdivided finely enough to cover this scenario.
There is a related setting here:
Project Settings -> Team configuration -> Area -> Security:
Setting the work item edit permission there ("Edit work items in this node") to Deny will prevent users from creating new work items, but it will also prevent them from modifying existing ones.
For your request, you could add your request for this feature on our UserVoice site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21 ), which is our main forum for product suggestions.

Feature State Updating Automatically

Feature cycle time is a very important metric, but in ADO there doesn't seem to be a way to get the State of a Feature to automatically update when the first story moves into Active (or the last child is closed). Does anyone know of a way to have this happen?
No, that will not happen out of the box. This is an area for custom applications and solutions. You can try the following:
TFS Aggregator
Write your own solution through the REST API: Automation of state changing for Azure DevOps work items based on states of child work items (a minimal sketch follows this list)
Use additional solutions: Automation of state changing with Azure Logic App
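To make the REST API option concrete, here is a hedged sketch. The organization name, PAT variable, and work item ID are placeholders, and the surrounding trigger logic (a service hook, scheduled job, or Logic App that decides when the Feature should move) is left out. It patches a parent Feature's State field to Active:

# Hypothetical sketch: set a parent Feature's State to Active through the
# Work Item Tracking REST API, authenticating with a personal access token.
$org       = "my-org"        # placeholder organization name
$pat       = $env:ADO_PAT    # PAT with work item read & write scope
$featureId = 1234            # placeholder id of the parent Feature

# Azure DevOps accepts a PAT as the password half of basic auth
$headers = @{
  Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
}

# JSON Patch document that updates the System.State field
$body = '[{ "op": "add", "path": "/fields/System.State", "value": "Active" }]'

Invoke-RestMethod -Method Patch `
  -Uri "https://dev.azure.com/$org/_apis/wit/workitems/${featureId}?api-version=7.0" `
  -Headers $headers `
  -ContentType "application/json-patch+json" `
  -Body $body

The hard part is not this call but deciding when to make it, e.g. querying the child stories and reacting when the first one becomes Active or the last one is Closed.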

ADF activity is in progress state for a longer period of time

I have a daily scheduled ADF pipeline that consists of a couple of activities. The first activity invokes a stored procedure that takes less than 2 seconds to run on the DB side, but the activity runs continuously (it stays in the In Progress state for a long period of time). Initially I thought it was due to blocking on the SQL Server side, but the problem is there is NO SQL Server hit from the ADF side. The strange thing is that it runs fine in the Dev environment but gets blocked in QA.
Here is the RunId for the same:
Thanks!
Sometimes you can invoke a stored procedure with a copy activity, changing the SqlReaderQuery property to something like "exec sp_name". I explained this here: Execute storedProcedure from azure datafactory
Your problem looks like the integration runtime cannot reach the database. Try opening the integration runtime configuration manager and testing the connection to the database from the Diagnostics tab. Once you make it work there, make sure the linked service is configured with those credentials.
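While you are checking connectivity, you can also pull the status of the stuck activity with the run ID you already have. A small sketch with placeholder names:

# Inspect the activity runs belonging to the stuck pipeline run
Get-AzDataFactoryV2ActivityRun -ResourceGroupName "my-rg" -DataFactoryName "my-adf" `
  -PipelineRunId "<your-run-id>" `
  -RunStartedAfter (Get-Date).AddDays(-1) `
  -RunStartedBefore (Get-Date)

Each returned activity run includes Status and Error details that can help narrow down where it is stuck.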
Hope this helped!

CRM 2011 RU13: The workflow cannot be published or unpublished by someone who is not its owner

I have created and added some workflows to CRM 2011 RU13 through the UI.
Through no fault of my own, my development environment is completely air-gapped from my production environment.
I added these workflows to my solution, exported the solution as managed, and gave it to the production admin.
When he deploys it, the import fails with this message:
The workflow cannot be published or unpublished by someone who is not its owner
How do I fix this? There is no way to not give workflows an owner, or to say that the owner is the solution.
The production admin gets that message because he is not the owner (inside the target CRM environment) of one or more active workflows included in your solution.
This happens in these situations:
The first time your solution is imported, USER_A performs the operation and all the workflows are automatically assigned to him. If USER_B later tries to import an updated version of the solution, he gets the error message because he is not the owner of the workflow(s).
The first time your solution is imported, USER_A performs the operation and all the workflows are automatically assigned to him. Meanwhile, one or more workflows are reassigned to USER_C. If USER_A later tries to import an updated version of the solution, he gets the error message because he is no longer the owner of the workflow(s).
Before a workflow can be updated it must first be deactivated, and only the owner can deactivate a workflow. This is by design.
In your case the production admin must be the owner of the processes (he can temporarily assign the workflows to himself, import the solution, and afterwards assign them back to the right user), or he needs to already own the workflows in order to import the solution (if he has the rights).
A couple of additional points for clarity for the OP:
The owner of the workflows in your dev environment is not relevant; the importing user in prod will become the owner (this does not contradict Guido, I'm just making sure you don't follow a red herring). It is quite right for there to be an "air gap" between dev and prod.
If you know which workflows are in your solution, assign those in prod to yourself, then import, then if and only if you need to, reassign them to the original owner(s).
You may not need to if that owner is just an equivalent system admin user, but if it is a special user (eg "Workflow daemon" so users can see why it updated their records) you will want to re-assign.
Note that after re-assigning them, that user has to activate the workflows. You cannot activate a workflow in someone else's name (or users could write workflows that run as admins and elevate their privileges).
If the workflows have not actually been changed in this version of your solution, take them out of the solution and ignore them. Often I find that a workflow has been written, carried across to production in the original "go live", and has been working perfectly fine since, but is left in a solution that is constantly updated and re-published (i.e. exported / imported).
Personally I often have a "go live" solution (or more than one, but that's a different thread...) and then start all over again with a new solution that contains only incremental changes thereafter. This means that all your working workflows, plugins, web resources, etc. do not appear in that solution, which avoids confusion over versions, reduces solution bloat, and avoids this problem of workflow ownership. If a workflow is actually updated, then you need to deal with the import, but don't make this a daily occurrence for unrelated changes.