Is it possible to execute WF workflows in a REST API/Web API? I am planning to use the "WorkflowApplication" object to run a workflow; for each request that starts a WF, a separate instance of the "WorkflowApplication" object will be used. I haven't seen any articles that say I can use workflows in a REST API. Do you see any issues with doing this? If yes, what alternatives do I have?
I am using a Logic App for which I need to create a custom connector. The connector depends on a web service, which I am trying to add using its WSDL definition.
If I provide the URL, the service requires authentication, which I am not able to supply via this UI. I can see that the parameters can be provided when using the connector in the Logic App, but the wizard fails to pull the services and hence does not create the definition for the connector.
I tried downloading the WSDL and adding it here as a file, but the schema has xs:import tags, because of which it fails again. And as per this answer, I cannot replace the import with the actual schema.
<xs:import namespace="http://some.name/" schemaLocation="./path/to/it.xsd"/>
Is there a way to avoid providing the custom connector definition manually and make it work using the WSDL? The service contains a lot of endpoints, and it would be too much to add all the actions and triggers by hand. It would also serve as a reference for me if I need to handle such a scenario in the future.
If the services are accessible over the internet, you can call the service endpoint over HTTP or HTTPS from Azure Logic Apps. This article details the steps to follow: https://learn.microsoft.com/en-us/azure/connectors/connectors-native-http
If they are not accessible over the internet, this article walks through the process step by step: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-gateway-connection
Before you can access data sources on premises from your logic apps, you need to create an Azure resource after you install the on-premises data gateway on a local computer. Your logic apps then use this Azure gateway resource in the triggers and actions provided by the on-premises connectors that are available for Azure Logic Apps.
Also check this
I would like to be able to call PowerShell scripts using a REST API. (Please note that I am describing the _opposite_ of calling a REST API from PowerShell.) Are there any prebuilt API gateways that support this use case? I've looked at Ocelot, but it currently only acts as a gateway to other REST APIs. Ideally I would simply design my PowerShell script functions to follow a defined interface pattern, put the files into a defined directory, and the API gateway would make those functions available as REST API calls either immediately or with minimal configuration.
EDIT: To clarify, I am looking for something self-hosted, not cloud-based. I haven't found anything yet that is exactly what I need; I may end up creating something myself.
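Purely as an illustration of the pattern I have in mind (not a working gateway), something along these lines using System.Net.HttpListener, where every <name>.ps1 dropped into a conventional folder is served as GET /<name>; the folder path, port, and naming convention are just placeholders, and a real gateway would still need authentication, error handling, and non-blocking request processing:

$scriptRoot = 'C:\api-scripts'                      # each <name>.ps1 here becomes GET /<name> (illustrative convention)
$listener = [System.Net.HttpListener]::new()
$listener.Prefixes.Add('http://localhost:8080/')
$listener.Start()

while ($listener.IsListening) {
    $context = $listener.GetContext()
    $name    = $context.Request.Url.AbsolutePath.Trim('/')
    $script  = Join-Path $scriptRoot "$name.ps1"

    # Run the matching script (if any) and return its output as JSON.
    $result = if (Test-Path $script) { & $script } else { @{ error = 'not found' } }
    $bytes  = [Text.Encoding]::UTF8.GetBytes(($result | ConvertTo-Json))

    $context.Response.ContentType = 'application/json'
    $context.Response.OutputStream.Write($bytes, 0, $bytes.Length)
    $context.Response.Close()
}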
You can try the AWS Lambda and API Gateway integration.
Here is an example: https://aws.amazon.com/blogs/developer/creating-a-powershell-rest-api/
Amazon offers a 12-month free tier plan for this.
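As a rough sketch of the flow in that blog post (the script and function names here are placeholders, and the API Gateway wiring itself is configured separately), the AWSLambdaPSCore module is used roughly like this:

Install-Module AWSLambdaPSCore -Scope CurrentUser

# Generate a starter script from a template, then publish it as a Lambda function.
New-AWSPowerShellLambda -Template Basic -ScriptName MyRestHandler
Publish-AWSPowerShellLambda -ScriptPath .\MyRestHandler\MyRestHandler.ps1 -Name MyRestHandler -Region us-east-1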
A couple of options. If you are on Azure, you could expose your PowerShell scripts through Azure Automation:
https://learn.microsoft.com/en-us/azure/automation/automation-webhooks
That would be a lightweight way of exposing your scripts through an HTTP POST scenario.
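For example, once a webhook has been created for the runbook, a client only needs an HTTP POST to start a job; the URL and payload below are placeholders (the real webhook URL is shown only once, when it is created):

$webhookUri = 'https://<region>.webhook.azure-automation.net/webhooks?token=...'   # placeholder
$payload    = @{ VmName = 'web01'; Action = 'restart' } | ConvertTo-Json

# The webhook responds with the ID(s) of the automation job(s) it started.
$response = Invoke-RestMethod -Method Post -Uri $webhookUri -Body $payload
$response.JobIds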
You could also add API Management in front to support more scenarios (adding GET/PUT/DELETE support, for example) or to automate or proxy more things. API Management can of course also be automated.
https://azure.microsoft.com/en-us/services/api-management/
You could also create a folder structure with modules and sub-functions and build a full REST API using Azure Functions with PowerShell:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell
The latter would also be able to execute in containers and in the supported Azure Functions runtimes.
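As a minimal sketch of what a single HTTP-triggered function (run.ps1) looks like in a PowerShell function app (the binding names must match the accompanying function.json, and the helper path is purely illustrative):

using namespace System.Net

param($Request, $TriggerMetadata)

# Dot-source shared helpers from a modules/sub-functions folder, as described above.
# . "$PSScriptRoot\..\shared\Helpers.ps1"   # illustrative path

$name = $Request.Query.Name
$body = @{ message = "Hello, $name" } | ConvertTo-Json

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $body
})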
Is there a way (via PowerShell, the Azure DevOps REST API, or the UI) to pull a list of builds/releases run in the past X amount of time on a specific agent pool? I haven't found any documentation that indicates a method yet.
Get a list of builds/releases for an agent pool?
There is no such out-of-the-box API at the moment because the agent REST API is undocumented; see REST API Overview for Visual Studio Team Services and Team Foundation Server for more information.
However, you could use a tool such as Fiddler to trace the API, then follow the steps below to get the list of builds/releases for an agent pool with the REST API:
Get the pool ID:
GET https://dev.azure.com/<YourOrganizationName>/_apis/distributedtask/pools/
Get the agent ID based on the pool ID:
GET https://dev.azure.com/<YourOrganizationName>/_apis/distributedtask/pools/5/agents/
Get the job requests for the specific build agent:
GET https://dev.azure.com/<YourOrganizationName>/_apis/distributedtask/pools/5/jobrequests?agentId=4
Now we can use a script to list the builds/releases info, such as "requestId", "result", and so on.
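A rough PowerShell sketch that walks the endpoints above (the organization name and PAT are placeholders, and the property names follow the undocumented jobrequests payload, so they may vary between versions):

$org     = 'YourOrganizationName'
$pat     = '<personal-access-token>'
$baseUri = "https://dev.azure.com/$org/_apis/distributedtask"
$headers = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

foreach ($pool in (Invoke-RestMethod -Uri "$baseUri/pools/" -Headers $headers).value) {
    # jobrequests returns the recent build/release jobs that ran on agents in this pool.
    $requests = (Invoke-RestMethod -Uri "$baseUri/pools/$($pool.id)/jobrequests" -Headers $headers).value
    $requests | Select-Object requestId, planType, result, finishTime,
        @{ n = 'definition'; e = { $_.definition.name } },
        @{ n = 'agent';      e = { $_.reservedAgent.name } }
}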
A ticket that helped: Retrieving a list of agent requests from TFS REST API
Note: these APIs are undocumented, so you should be vigilant when upgrading your TFS if you are taking dependencies on them.
Hope this helps.
I want to know the status of all the tasks executed. We can do that manually, but I want to get it through a REST client (Google Cloud Composer).
The Composer environment API currently doesn't support retrieving Airflow-level metadata. As a workaround, you may want to use the Airflow task info API. Admittedly the Airflow API isn't comprehensive, but if you know the {dag-id, task-id, execution-date} tuple, it might not be a problem. Composer co-hosts the Airflow API server with the web UI; here's an example of how to programmatically access the Airflow APIs.
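A minimal sketch that queries a task instance's state through the experimental REST API; the host, DAG ID, task ID, and execution date below are placeholders, and on Composer the web server sits behind IAP, so the bearer token must be an identity token accepted by that proxy (obtaining it is out of scope here):

$airflowHost   = 'https://<your-composer-webserver>.appspot.com'   # placeholder
$dagId         = 'my_dag'
$taskId        = 'my_task'
$executionDate = '2019-01-01T00:00:00'
$token         = '<identity-token>'

$uri      = "$airflowHost/api/experimental/dags/$dagId/dag_runs/$executionDate/tasks/$taskId"
$taskInfo = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }
$taskInfo.state   # e.g. 'success', 'failed', 'running'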
Has anyone come up with a complete solution to protect and replicate VMs from on-prem (either VMware or Hyper-V) to Azure using either the REST API or the PowerShell module?
I recently completed a POC with ASR and was able to replicate a couple dozen VMs associated with three different applications. I replicated out of VMware and into Azure, and I was able to fail over and fail back successfully.
I did all of the POC work using the GUI (portal.azure.com). Now I have to figure out how to protect ~2000 VMs, and there is no way that I am going to do that with the GUI. But the MS documentation has me running in circles.
(https://learn.microsoft.com/en-us/azure/site-recovery/)
It would be very helpful if any of you could share the sequence of steps to protect and replicate a VM. The MS documentation does not lay out how the various components (fabrics, protection policies, protection containers, protection items, etc.) are related to each other.
I do not need specific syntax. The documentation does a passable job of detailing the syntax. I could use some guidance on the task sequence.
If it helps to understand the bigger picture, my intention is to use a System Center Orchestrator runbook to ingest a CSV list of VMs, parse that out into input for the Azure REST API / Powershell, and then enable protection.
Thanks in advance for any assistance or guidance that you are able to provide.
You can find the Recovery Services API documentation here:
https://learn.microsoft.com/en-us/rest/api/recoveryservices/
Once you have one definition in place (created manually from the portal), you can also study it from resources.azure.com to see how the properties are composed. (Not all resources are available through this portal.)
After that, you should be able to create a template for either a REST call or Resource Manager, depending on your preference.
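To give a feel for how the pieces relate in PowerShell, here is only an outline of the object chain for the VMware-to-Azure case with the Az.RecoveryServices module; every name is a placeholder, additional parameters (target network, etc.) may be required, and it assumes the replication policy and container mapping already exist from the portal POC:

$vault = Get-AzRecoveryServicesVault -Name 'MyVault' -ResourceGroupName 'MyRG'
Set-AzRecoveryServicesAsrVaultContext -Vault $vault

# Fabric -> protection container -> container mapping (the mapping ties the container to a policy).
$fabric    = Get-AzRecoveryServicesAsrFabric -FriendlyName 'MyConfigurationServer'
$container = Get-AzRecoveryServicesAsrProtectionContainer -Fabric $fabric
$mapping   = Get-AzRecoveryServicesAsrProtectionContainerMapping -ProtectionContainer $container   # pick the one using your policy

# Placeholder target resource IDs for the replica.
$recoveryRgId     = '/subscriptions/<sub>/resourceGroups/<target-rg>'
$storageAccountId = '/subscriptions/<sub>/resourceGroups/<target-rg>/providers/Microsoft.Storage/storageAccounts/<account>'

# Protectable item (the discovered on-prem VM), then enable replication for it.
$vm  = Get-AzRecoveryServicesAsrProtectableItem -ProtectionContainer $container -FriendlyName 'MyOnPremVM'
$job = New-AzRecoveryServicesAsrReplicationProtectedItem -VMwareToAzure `
           -ProtectableItem $vm `
           -Name $vm.Name `
           -ProtectionContainerMapping $mapping `
           -ProcessServer $fabric.FabricSpecificDetails.ProcessServers[0] `
           -Account $fabric.FabricSpecificDetails.RunAsAccounts[0] `
           -RecoveryResourceGroupId $recoveryRgId `
           -RecoveryAzureStorageAccountId $storageAccountId

Looping that last block over the rows of your CSV (from the Orchestrator runbook) is essentially the sequence: vault context, fabric, container, mapping, protectable item, enable replication.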