I want to know the status of all the tasks that have been executed. I can do that manually, but I want to retrieve it through a REST client (Google Cloud Composer).
The Composer environment API currently doesn't support retrieving Airflow-level metadata. As a workaround, you may want to use the Airflow task info API. Admittedly, the Airflow API isn't comprehensive, but if you know the {dag-id, task-id, execution-date} tuple, that might not be a problem. Composer co-hosts the Airflow API server with the web UI; here's an example of how to programmatically access the Airflow APIs.
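For instance, a minimal Python sketch, assuming you already have your environment's Airflow web server URL and a valid ID token for its IAP-protected endpoint (both values below are placeholders); the experimental API exposes the task-instance state for a given {dag-id, task-id, execution-date}:

import requests

# Placeholders: the Airflow web server URL shown in your Composer
# environment details, and an ID token minted for the IAP client
# (e.g. obtained via google.auth).
AIRFLOW_URL = "https://YOUR-TENANT.appspot.com"
TOKEN = "YOUR-IAP-ID-TOKEN"

dag_id = "my_dag"
task_id = "my_task"
execution_date = "2019-01-01T00:00:00"  # must match the dag_run exactly

# The experimental API exposes task-instance info at this path.
resp = requests.get(
    f"{AIRFLOW_URL}/api/experimental/dags/{dag_id}"
    f"/dag_runs/{execution_date}/tasks/{task_id}",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
print(resp.json().get("state"))  # e.g. "success", "failed", "running"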
I am using a Logic App for which I need to create a custom connector. This connector depends on a web service, which I am trying to add using a WSDL definition.
Now, if I provide the URL, it needs authentication, which I am not able to provide via this UI. I can see that the parameters can be provided while using it in the Logic App. However, it fails to pull the services and hence does not create the definition for the connector.
I tried downloading the WSDL and adding it here as a file, but the schema has xs:import tags, because of which it fails again. And as per this answer, I cannot replace them with the actual schema.
<xs:import namespace="http://some.name/" schemaLocation="./path/to/it.xsd"/>
Is there a way to make this work using the WSDL so that I do not need to provide the custom connector definition manually? It contains a lot of endpoints, and it would be too much to add all the actions and triggers by hand. It would also serve as a reference for me if I need it in the future for a similar scenario.
You may try this: if the services are accessible over the internet, you can call the service endpoint over HTTP or HTTPS from Azure Logic Apps. This article walks through the detailed steps: https://learn.microsoft.com/en-us/azure/connectors/connectors-native-http
If the service is not accessible over the internet, then this article covers the step-by-step process: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-gateway-connection
Before you can access data sources on premises from your logic apps, you need to create an Azure resource after you install the on-premises data gateway on a local computer. Your logic apps then use this Azure gateway resource in the triggers and actions provided by the on-premises connectors that are available for Azure Logic Apps.
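To illustrate what the HTTP action would actually be sending when you call a SOAP service directly, here is a rough Python sketch; the endpoint, SOAPAction, operation name, namespace, and credentials are all placeholders you would take from your WSDL:

import requests

# Hypothetical values; replace with your service's endpoint, SOAPAction,
# and operation payload from the WSDL.
ENDPOINT = "https://example.com/service.svc"
SOAP_ACTION = "http://some.name/IMyService/MyOperation"

envelope = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <MyOperation xmlns="http://some.name/"/>
  </soap:Body>
</soap:Envelope>"""

resp = requests.post(
    ENDPOINT,
    data=envelope,
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": SOAP_ACTION},
    auth=("username", "password"),  # whatever auth the service expects
)
print(resp.status_code, resp.text[:200])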
I would like to be able to call PowerShell scripts using a REST API. (Please note that I am describing the _opposite_ of calling a REST API from PowerShell.) Are there any prebuilt API gateways that support this use case? I've looked at Ocelot, but it currently only acts as a gateway to other REST APIs. Ideally, I would simply design my PowerShell script functions to follow a defined interface pattern, put the files into a defined directory, and the API gateway would make those functions available as REST API calls either immediately or with minimal configuration.
EDIT: To clarify, I am looking for something self-hosted, not cloud-based. I haven't found anything yet that is exactly what I need; I may create something myself.
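For what it's worth, a minimal self-hosted sketch of that idea, using only Python's standard library and assuming pwsh (PowerShell Core) is on the PATH; the scripts directory and route convention are made up for illustration:

import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

SCRIPT_DIR = Path("./scripts")  # e.g. ./scripts/get-users.ps1 -> POST /get-users

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        script = SCRIPT_DIR / (self.path.strip("/") + ".ps1")
        if not script.is_file():
            self.send_error(404, "No such script")
            return
        body_len = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(body_len).decode() if body_len else ""
        # Pass the request body to the script on stdin; capture stdout.
        result = subprocess.run(
            ["pwsh", "-NoProfile", "-File", str(script)],
            input=body, capture_output=True, text=True, timeout=60,
        )
        payload = json.dumps({"stdout": result.stdout,
                              "stderr": result.stderr,
                              "exitCode": result.returncode}).encode()
        self.send_response(200 if result.returncode == 0 else 500)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()

A script dropped into ./scripts/get-users.ps1 would then be callable with, e.g., curl -X POST http://localhost:8080/get-users. A real gateway would of course need authentication, input validation, and path hardening on top of this.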
You can try the AWS Lambda and API Gateway integration.
Here is an example: https://aws.amazon.com/blogs/developer/creating-a-powershell-rest-api/
Amazon offers a 12-month free tier plan that covers this.
A couple of options. If you are on Azure, you could expose your PowerShell scripts through Azure Automation:
https://learn.microsoft.com/en-us/azure/automation/automation-webhooks
That'd be a lightweight way of having your scripts triggered through an HTTP POST scenario.
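Triggering the runbook is then just an HTTP POST to the webhook URL that Azure Automation generates for you; a quick Python sketch (the URL below is a placeholder):

import requests

# The webhook URL (with its embedded token) is generated when you create
# the webhook in Azure Automation; this one is a placeholder.
WEBHOOK_URL = "https://s1events.azure-automation.net/webhooks?token=REDACTED"

# Anything you POST shows up in the runbook as $WebhookData.RequestBody.
resp = requests.post(WEBHOOK_URL, json={"VMName": "test-vm"})
print(resp.status_code, resp.text)  # 202 Accepted with a job id on success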
You could also combine it with API Management in front to support various scenarios (e.g., adding GET/PUT/DELETE support) or to automate or proxy more things. API Management could of course also be automated.
https://azure.microsoft.com/en-us/services/api-management/
You could also create a folder structure with modules and sub-functions and build a full REST API by using Azure Functions with PowerShell:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell
The latter would also be able to execute in containers and in the supported Azure Functions runtimes.
Bluemix Availability Monitoring provides scripting support for Selenium only. Is there a way I can have a shell or bash script that does the following:
- Builds a URL
- Calls the URL using curl
- Processes the response
My current URLs are protected by Bluemix IAM. To call a URL, I need to pass an access token in the header. The access token expires every hour, which makes it impossible to use the Bluemix Availability Monitoring service.
Currently there is only Selenium script support.
There are plans for JavaScript (not shell) support for REST APIs, but there are no firm dates for when that might be available.
IAM tokens that expire every hour will make monitoring difficult regardless of the monitoring mechanism you choose. I am not familiar enough with IAM to provide much advice. Perhaps a second process outside the monitoring could refresh the access token?
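A rough Python sketch of that second-process idea, assuming you have an IAM API key to exchange for fresh access tokens (the key, the token file path, and the refresh margin below are illustrative):

import time
import requests

IAM_APIKEY = "YOUR-IAM-API-KEY"  # placeholder
TOKEN_URL = "https://iam.cloud.ibm.com/identity/token"

def fetch_token():
    # Exchange the API key for a short-lived IAM access token.
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "urn:ibm:params:oauth:grant-type:apikey",
              "apikey": IAM_APIKEY},
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    return resp.json()  # contains access_token and expires_in

while True:
    token = fetch_token()
    # Hand the token to whatever calls the protected URL, e.g. write it
    # to a file the monitoring script reads for its Authorization header.
    with open("/tmp/access_token", "w") as f:
        f.write(token["access_token"])
    time.sleep(max(token["expires_in"] - 300, 60))  # refresh 5 min early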
Some who use Bluemix Availability Monitoring use API Connect. Perhaps those tokens are less ephemeral, but I'm not sure how to map API Connect access to IAM.
Is it possible to execute WF workflows from a REST API/Web API? I am planning to use the "WorkflowApplication" object to run a workflow: for each request to start a workflow, a separate instance of the "WorkflowApplication" object will be used. I haven't seen any articles that say I can use workflows in a REST API. Do you see any issues in doing this? If yes, what alternatives do I have?
What is the best way to create a new Windows Azure Hosted service from a running role using a package and configuration that I have stored in blob storage?
I am guessing that I could use a Service Management REST API Create Deployment request; however, running a cmdlet from my worker role might be better. Any thoughts? If the cmdlet route is better, bonus points if you can point me in the right direction on how to run them from a worker role.
Not sure what is 'best' here, because it depends on what you are trying to do. If you just need to create a hosted service programmatically, it would be about the same effort to create a REST client, upload a cert, and just do it versus using the cmdlets or anything else.
As the creator of the cmdlets, I have a special place in my heart for them, but I would probably stick to using them for IT admin tasks. They rock for command-line automation.
That being said, it is not terribly hard to roll your own client, but I typically recommend that you download the Service Management contracts from csmanage. That way, you have a simple wrapper around this to get going. While it does use WCF, it is not too onerous.
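If you do roll your own client, here is a rough Python sketch of the Create Deployment call against the classic Service Management API; the subscription ID, service name, package URL, and certificate files below are placeholders, and authentication uses the management certificate you uploaded to the subscription, as a PEM cert/key pair:

import base64
import requests

SUBSCRIPTION_ID = "YOUR-SUBSCRIPTION-ID"  # placeholder
SERVICE_NAME = "myhostedservice"          # placeholder
SLOT = "production"
PACKAGE_URL = "https://myaccount.blob.core.windows.net/packages/app.cspkg"

# The service configuration (.cscfg) goes into the request base64-encoded.
with open("ServiceConfiguration.cscfg", "rb") as f:
    config_b64 = base64.b64encode(f.read()).decode()

body = f"""<CreateDeployment xmlns="http://schemas.microsoft.com/windowsazure">
  <Name>mydeployment</Name>
  <PackageUrl>{PACKAGE_URL}</PackageUrl>
  <Label>{base64.b64encode(b"mydeployment").decode()}</Label>
  <Configuration>{config_b64}</Configuration>
  <StartDeployment>true</StartDeployment>
</CreateDeployment>"""

resp = requests.post(
    f"https://management.core.windows.net/{SUBSCRIPTION_ID}"
    f"/services/hostedservices/{SERVICE_NAME}/deployments/{SLOT}",
    data=body,
    headers={"x-ms-version": "2012-03-01",
             "Content-Type": "application/xml"},
    # PEM export of the management certificate and its private key.
    cert=("management-cert.pem", "management-key.pem"),
)
print(resp.status_code, resp.headers.get("x-ms-request-id"))

On success this returns 202 Accepted, and the x-ms-request-id header can be used to poll the asynchronous operation's status.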