Can I re-run a Power Automate flow instance from history?

Is there any way to find and re-run an earlier instance of a Power Automate workflow programmatically?
I can do this manually: download the .csv file containing the instances, search the Trigger output column for the one I want, get the ID, copy-paste the run URL, and click Resubmit.
I tried with Power Automate itself:
The built-in Flow Management connector only supports finding a specific flow by name; it does not expose the run history at all.
PowerShell:
I installed the PowerApps module, and I can list the run instances with
Get-FlowRun -FlowName {flow name}
But I don't see the same properties as in the exported .csv file, and there is also no Run-Flow cmdlet that would let me resubmit a run.
So, I am a little stuck here; could someone please help me out?

We cannot yet programmatically resubmit a flow run from the history with PowerShell or any other API method.
But we can avoid some of the manual work: using the workflow() function in a Compose step, we can automate the composition of the flow history run URL, which looks like this:
https://xxx.flow.microsoft.com/manage/environments/07aa1562-fea6-4583-8d76-9a8e67cbf298/flows/141e89fb-af2d-47ac-be25-f9176e64e9a0/runs/08586722084717816659969428791CU12?backUrl=%2Fflows%2F141e89fb-af2d-47ac-be25-f9176e64e9a0%2Fdetails&runStatus=Failed
There are three IDs that I need to find so that I can build up the flow history URL.
The first is my environmentName (07aa1562-fea6-4583-8d76-9a8e67cbf298), then the flow name (141e89fb-af2d-47ac-be25-f9176e64e9a0), and finally the run name (08586722084717816659969428791CU12).
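For example, a Compose action with an expression along these lines can assemble that URL from the flow's own workflow() metadata at run time. This is a sketch: the xxx host prefix is taken from the URL above, and it assumes the environmentName tag and run name appear in the workflow() output.
concat('https://xxx.flow.microsoft.com/manage/environments/',
       workflow()?['tags']?['environmentName'],
       '/flows/',
       workflow()?['name'],
       '/runs/',
       workflow()?['run']?['name'])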

The Microsoft 365 CLI has a command to resubmit a flow run:
m365 flow run resubmit --environment flowEnvironmentID --flow flowGUID --name flowRunID --confirm
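For example, combined with m365 flow run list, a short PowerShell loop can resubmit every failed run. This is a sketch: $envId and $flowId are placeholders, and the exact option names (--environment vs. --environmentName) vary between CLI versions.
# List the flow's runs as JSON, keep the failed ones, and resubmit each
$runs = m365 flow run list --environment $envId --flow $flowId --output json | ConvertFrom-Json
$runs | Where-Object { $_.status -eq 'Failed' } | ForEach-Object {
    m365 flow run resubmit --environment $envId --flow $flowId --name $_.name --confirm
}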
You can also resubmit a flow run using the Power Automate REST API:
https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/{FlowEnvironment}/flows/{FlowGUID}/triggers/manual/histories/{FlowRunID}/resubmit?api-version=2016-11-01
For the Power Automate REST API, you will have to pass an authorization token.
For more information, see the following post:
https://ashiqf.com/2021/05/09/resubmit-your-failed-power-automate-flow-runs-automatically-using-m365-cli-and-rest-api/
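For reference, here is a minimal PowerShell sketch of that REST call: $envId, $flowId, $runId, and $token are placeholders (the token acquisition is not shown), and the trigger segment (manual here) must match your flow's actual trigger name.
# Resubmit a single run via the Power Automate REST API
$uri = "https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/$envId/flows/$flowId/triggers/manual/histories/$runId/resubmit?api-version=2016-11-01"
Invoke-RestMethod -Method Post -Uri $uri -Headers @{ Authorization = "Bearer $token" }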

Related

Is there an easy way to run Azure DevOps PowerShell scripts on my local machine?

I tried to find anything on this but I didn't succeed. Maybe I am using the wrong words for the search.
What I am trying to achieve is a script that can run in an Azure DevOps environment as well as on my local machine for debugging purposes. As far as I can see, to execute it locally I would need some kind of wrapper for the script that behaves like the Azure DevOps task does. Does anything like that exist?
If you want to have more control over building your code and be able to see intermediate results, you need to install a self-hosted agent on your machine.
Most of the tasks are simply wrappers around console tools that add some sort of authorization or make the tools visually accessible. It may be useful for you to enable the System.Debug flag on the Microsoft-hosted agent to see more details about what a particular task does; you will then be able to better understand what is happening behind the scenes.
For instance, if you use variables in your script like $(someVariable), then with System.Debug enabled you will see your final script in the log with the values substituted.
Be aware also that secret variables are masked, so you may find *** in the logs instead of the real value.
However, there is no easy way to simply extract and wrap what a task does in order to repeat it on your machine without involving an Azure DevOps agent. A common workaround is sketched below.
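The workaround is to write the script itself so it reads the agent's predefined variables when they exist and falls back to local defaults otherwise. A minimal sketch, with illustrative variable choices:
param(
    # Set automatically on an Azure DevOps agent; empty on a local machine
    [string]$BuildNumber = $env:BUILD_BUILDNUMBER,
    [string]$SourcesDir  = $env:BUILD_SOURCESDIRECTORY
)
# Local-debug fallbacks
if (-not $BuildNumber) { $BuildNumber = 'local' }
if (-not $SourcesDir)  { $SourcesDir  = (Get-Location).Path }
Write-Host "Building $BuildNumber from $SourcesDir"
This way the same .ps1 file runs unchanged both in a pipeline PowerShell task and in a local debugger session.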

How to pass an email in Jenkins at run time through a variable

I am facing the problem below; I'd appreciate it if anyone could help.
I have a Jenkins job that triggers a Java JAR; the JAR contains code to read an email address from Excel, and that same email address needs to be passed back to Jenkins, in the username field, for sending an email.
Thanks
Did you take a look at parameterized jobs (if you want to trigger the job manually)?
If you want to read the value from Excel and pass it to another job, that is the approach to look at.
I'm trying to understand your problem:
1. Jenkins triggers emails to the DevOps team with the deployment task results
2. The application triggers emails saying that it was deployed successfully (or any other scenario you want to achieve; please indicate, and I will try to enhance this post)
If the above is the case, you should try to use Jenkins to complete the full process: build, test, deploy, and verify, then consolidate the results and send them out via email.
It is a clean separation: Jenkins handles deployment, and the app handles the business logic.
There are different ways to verify that your application deployed successfully, depending on how you define a successful deployment.
Jenkins can detect those signals, e.g. send a ping or curl request to the application and verify the response.
Now only Jenkins needs to know the list of email addresses for the deployment results; you can use parameterized jobs, as @Avneesh Srivastava mentioned, for example as sketched below.
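For instance, once the job has an EMAIL parameter, the JAR (or a wrapper script) can hand the address read from Excel back to Jenkins through the standard buildWithParameters endpoint. A sketch in PowerShell; the Jenkins URL, job name, credentials, and parameter name are all placeholders:
# Trigger a parameterized Jenkins job, passing the email address as a parameter
$jenkins = 'https://jenkins.example.com'
$auth    = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('user:apiToken'))
$email   = [uri]::EscapeDataString('someone@example.com')  # e.g. the value read from Excel
Invoke-RestMethod -Method Post `
    -Uri "$jenkins/job/send-mail/buildWithParameters?EMAIL=$email" `
    -Headers @{ Authorization = "Basic $auth" }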

Create QlikView task through command line

I want to be able to create a task for qvw files with a command (cmd, powershell, etc), just as you would through the QlikView Management Console. We would like to be able to automate some of the task creation remotely which would require this functionality.
I know that there are arguments that can be passed through the qv.exe to reload the document, but I want to actually create a task through a command line. Is this possible?
I don't think this is possible out of the box. You can control the QlikView server through the QlikView Management Service (QMS) API, but for that you need to build a .NET command-line app that does whatever you want.
The QMS API documentation has more info if you are interested.

Get a list of all Resources in my Azure Subscription (Powershell Preferably)

I have an Azure subscription and I'm trying to write a PowerShell script to automatically get a list of all the resources (VMs, Storage Accounts, Databases, etc.) that I currently have in my subscription. Is there a way to do this using the Azure management REST API or the Azure cmdlets?
If you are using the new Resource Manager model (introduced in 2014) you can use the following PowerShell script.
Login-AzureRmAccount
Get-AzureRmResource | Export-Csv "c:\Azure Resources.csv"
To use the Resource Manager PowerShell commands you will need the AzureRM PowerShell module (https://learn.microsoft.com/en-us/powershell/azure/install-azurerm-ps).
Install-Module AzureRM
For more information on the difference between Resource Manager and Classic models see, https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-deployment-model.
For users with multiple subscriptions:
If you want to output the contents of multiple subscriptions, you will need to call Select-AzureRmSubscription to switch to another subscription before calling Get-AzureRmResource, as in the sketch below.
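A minimal sketch of that loop, using the same AzureRM cmdlets (note that the subscription object's property names can differ slightly between module versions):
# Export the resources of every subscription the signed-in account can see
Get-AzureRmSubscription | ForEach-Object {
    Select-AzureRmSubscription -SubscriptionId $_.Id | Out-Null
    Get-AzureRmResource | Export-Csv ".\Resources-$($_.Name).csv" -NoTypeInformation
}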
I don't think there's just one function (or PS cmdlet) to fetch all this information. However, each of these can be fetched through both the Windows Azure Service Management REST API and the Windows Azure PowerShell Cmdlets.
Windows Azure Service Management REST API: http://msdn.microsoft.com/en-us/library/windowsazure/ee460799.aspx. For example, if you want to list storage accounts in your subscription, you would use this: http://msdn.microsoft.com/en-us/library/windowsazure/ee460787.aspx
Windows Azure PowerShell Cmdlets: http://msdn.microsoft.com/en-us/library/jj554330.aspx. Again, if you want to list storage accounts in your subscription, you would use this: http://msdn.microsoft.com/en-us/library/dn205168.aspx.
Well, you may update your Azure PowerShell version and execute this command:
Get-AzureResource
In the output, you can check the "ResourceType" column; it has information about the type of each resource created on Azure.
Since you said PowerShell "preferably", I'm going to assume other options are still maybe useful? You can go to http://portal.azure.com, and click on the Menu icon (three horizontal lines), then All Resources. Then at the top of the page you can click Export to CSV and open that in Excel.
You have to take 30 seconds to do a little cleanup in Excel, but for what I'm trying to do right now, this was definitely the best & fastest solution. I hope it's useful to you (or someone else) too.
Adding to @Gaurav's answer (and related to your comment about SQL database enumeration): you can enumerate all of your databases, on a per-server basis, in a few easy steps.
First, enumerate all of the SQL Database servers in your subscription.
Then, for each server, create a connection context and enumerate the databases. Note that, with the Get-Credential cmdlet, I was prompted to enter a username + password via a popup, which I don't show here. For demonstration purposes, I created a brand new server, with only a master database, to show what the output looks like. Both steps are sketched below.
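A rough sketch of those two steps, assuming the classic Azure SQL cmdlets of that era (Get-AzureSqlDatabaseServer, New-AzureSqlDatabaseServerContext, Get-AzureSqlDatabase):
# Step 1: enumerate all of the SQL Database servers in the subscription
$servers = Get-AzureSqlDatabaseServer

# Step 2: for each server, build a connection context and list its databases
foreach ($server in $servers) {
    $cred = Get-Credential   # popup prompting for the server login
    $ctx  = New-AzureSqlDatabaseServerContext -ServerName $server.ServerName -Credential $cred
    Get-AzureSqlDatabase $ctx   # on a brand new server this lists just 'master'
}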
This sample demonstrates how to automatically get a list of all the resources (VMs, Storage Accounts, Databases, App Services) and their status via PowerShell using certificate authentication.
https://gallery.technet.microsoft.com/Access-Azure-resource-data-ca9cc9f7
I know it's already been answered; however, I have found the Get-AzResource cmdlet easy to use, and it fetches all the resources from a particular subscription. Try piping it to ft (Format-Table) for clean text output:
Get-AzResource | ft

Deploying to SQL Azure with Powershell - Is there a way to generate the data-loss warning report?

I have a question regarding Data-Tier Application (DACPAC) upgrades when deploying to a SQL Azure database. When we upgrade the DACPAC manually through the wizard UI, there is a step where we review the data-loss warning report and can save the action report to an HTML file (see the documentation under "Review the Upgrade Plan Page"). The Action column displays the actions, such as Transact-SQL statements, that will be run to perform the upgrade. The Data Loss column will contain a warning if the associated action could delete data.
Right now, I'm automating the database upgrade process using PowerShell, which works beautifully so far. Unfortunately, I couldn't find a way for it to generate the same data-loss warning report.
An excerpt of my PowerShell upgrade script is below:
## Generate the database change list (database drift) and save it to file.
$dacStore.GetDatabaseChanges($dataTierAppNameToUpgrade) | Out-File -FilePath .\DatabaseChanges.txt
## Getting the DAC incremental upgrade script for data-tier application
$dacStore.GetIncrementalUpgradeScript($dataTierAppNameToUpgrade, $nextDacType) | Out-File -Filepath .\DatabaseUpgrade.sql
The DatabaseChanges.txt output file generated by GetDatabaseChanges() wasn't really informative, so we are wondering if there's a way to get the same report file as the one we would get if we were to go through the upgrade wizard manually. This report has been a great help to the deployment team when resolving data migration issues, and we would like to be able to inspect it manually when we deploy to a live production database.
We've searched through the MSDN documentations but didn't have any luck.
Does anyone know if this feature is supported for PowerShell deployments?
Is there a plan to support this in the near future?
Thanks for your help in advance.
According to http://technet.microsoft.com/en-us/library/microsoft.sqlserver.management.dac.dacupgradeoptions.aspx, IgnoreDataLoss is false by default, which means that if data loss is detected, the upgrade will fail. In .NET, you can handle the DacActionFinished event (http://technet.microsoft.com/en-us/library/microsoft.sqlserver.management.dac.dacstore.dacactionfinished.aspx), which will fire when the upgrade operation finishes. If something goes wrong, the event arg's Error property (http://technet.microsoft.com/en-us/library/microsoft.sqlserver.management.dac.dacactioneventargs.error.aspx) will contain the detailed exception. It may not be the HTML report you want, but it will still give you some information. You can handle CLR events from PowerShell; see http://blogs.msdn.com/b/powershell/archive/2008/05/24/wpf-powershell-part-3-handling-events.aspx for a sample.
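A small sketch of wiring that event up from PowerShell, assuming $dacStore is the DacStore instance from the upgrade script above:
# Subscribe to DacActionFinished and surface any error the upgrade reports
Register-ObjectEvent -InputObject $dacStore -EventName DacActionFinished -Action {
    if ($EventArgs.Error) {
        Write-Warning "DAC action failed: $($EventArgs.Error.Message)"
    }
} | Out-Null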
By the way, the issue is not actually specific to SQL Azure; you could also add the SQL Server tag to your question to get further suggestions.
Best Regards,
Ming Xu.
I just got in touch with the owner of DAC at Microsoft, and here's the solution he suggested: it is possible using their managed API or through SqlPackage.exe.
SqlPackage.exe has an action that produces a deployment report. If a possible data-loss issue is detected, it will be included in the report.
Reference: http://msdn.microsoft.com/en-us/library/hh550080(v=VS.103).aspx
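A sketch of that call using the DeployReport action (server, database, and credential values are placeholders); it writes an XML report, including any data-loss warnings, without touching the target database:
# Produce a deployment report instead of deploying
SqlPackage.exe /Action:DeployReport `
    /SourceFile:MyDatabase.dacpac `
    /TargetServerName:myserver.database.windows.net `
    /TargetDatabaseName:MyDatabase `
    /TargetUser:$user /TargetPassword:$password `
    /OutputPath:DeployReport.xml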