First of all, I want you to know that I am new to Azure.
Recently, I have been trying to work with Azure Logic Apps.
My goal is to make a simple REST API call with the HTTP trigger (from Microsoft) and mail the response JSON via the Office 365 connector.
Here is my code:
{
...
"triggers": {
"http": {
"recurrence": {
"frequency": "Day",
"interval": 1
},
"type": "Http",
"inputs": {
"method": "POST",
"headers": {
"Content-Type": "application/json"
},
"uri": "http://xxx/wcf/myrestservice.svc/is_online"
}
}
},
"actions": {
"office365connector": {
"type": "ApiApp",
"inputs": {
"apiVersion": "2015-01-14",
"host": {
"id": "/subscriptions/xxx/resourcegroups/resourcegroup1/providers/Microsoft.AppService/apiapps/office365connector",
"gateway": "https://xxx.azurewebsites.net"
},
"operation": "SendMail",
"parameters": {
"message": {
"To": "xxx#example.com",
"Subject": "My Service Status",
"Importance": "High",
"Body": "Hi #{triggers().outputs.body.Is_OnlineResult}"
}
},
"authentication": {
"type": "Raw",
"scheme": "Zumo",
"parameter": "#parameters('/subscriptions/xxx/resourcegroups/resourcegroup1/providers/Microsoft.AppService/apiapps/office365connector/token')"
}
},
"conditions": []
}
},
"outputs": {}
}
I am wondering: how can I get the response of the HTTP call and then send it in the mail body?
Please correct me if I am going in the wrong direction. Any response will be very helpful to me.
Manish
Have you tried using "Content"?
@triggers().outputs.body.Content
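For example, with the 2015 preview schema used above, the mail body could reference the trigger output like this (a sketch only; whether the payload is exposed as Content or as a named property such as Is_OnlineResult depends on what your service actually returns):
"parameters": {
"message": {
"To": "xxx@example.com",
"Subject": "My Service Status",
"Importance": "High",
"Body": "Hi @{triggers().outputs.body.Content}"
}
}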
I need some beginner help with KrakenD. I am running it on Ubuntu; the config is provided below.
I am able to reach the /healthz endpoint without a problem.
My challenge is that the /hello path returns error 500. I want this path to proxy to a Quarkus app that runs at http://getting-started36-getting-going.apps.bamboutos.hostname.us/.
Why is this not working? If I modify the /hello backend and use a fake host, I get the exact same result. This suggests that KrakenD is not even trying to connect to the backend.
In the logs, KrakenD is saying:
Error #01: invalid character 'H' looking for beginning of value
kraken.json:
{
"version": 2,
"port": 9080,
"extra_config": {
"github_com/devopsfaith/krakend-gologging": {
"level": "DEBUG",
"prefix": "[KRAKEND]",
"syslog": false,
"stdout": true,
"format": "default"
}
},
"timeout": "3000ms",
"cache_ttl": "300s",
"output_encoding": "json",
"name": "KrakenD API Gateway Service",
"endpoints": [
{
"endpoint": "/healthz",
"extra_config": {
"github.com/devopsfaith/krakend/proxy": {
"static": {
"data": { "status": "OK"},
"strategy": "always"
}
}
},
"backend": [
{
"url_pattern": "/",
"host": ["http://fake-backend"]
}
]
},
{
"endpoint": "/hello",
"extra_config": {},
"backend": [
{
"url_pattern": "/hello",
"method": "GET",
"host": [
"http://getting-started36-getting-going.apps.bamboutos.hostname.us/"
]
}
]
}
]
}
What am I missing?
add "encoding": "string" to the backend section.
"backend": [
{
"url_pattern": "/hello",
"method": "GET",
"encoding": "string" ,
"host": [
"http://getting-started36-getting-going.apps.bamboutos.hostname.us/"
]
}
]
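If I remember KrakenD's string encoding correctly (treat this as an assumption to verify against the KrakenD docs), the gateway wraps the raw backend body in a content field, so a call to /hello should then return something like:
{
"content": "whatever plain text your Quarkus /hello endpoint returns"
}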
I have a k8s pod running 3 containers: my app, OPA, and Envoy.
My whole setup follows this guide: https://www.openpolicyagent.org/docs/latest/envoy-authorization/
Everything went well until I had a 15 KB JSON body.
Checking the OPA container log, I see that request.http.body contains only about half of the JSON.
{
"decision_id": "",
"error": {},
"input": {
"attributes": {
"destination": {
"address": {
"Address": {
"SocketAddress": {
"PortSpecifier": {
"PortValue": 8000
},
"address": "10.244.8.102"
}
}
}
},
"request": {
"http": {
"body": "only half of JSON body come here",
"headers": {
":authority": "api-service.com",
":method": "PUT",
":path": "/api",
"accept": "application/json",
"content-length": "14822",
"content-type": "application/json",
"x-envoy-decorator-operation": "....",
"x-envoy-internal": "true",
"x-forwarded-for": "10.244.6.0",
"x-forwarded-proto": "https",
"x-istio-attributes": "..."
},
"host": "....com",
"id": "12114967460600931537",
"method": "PUT",
"path": "/api",
"size": 14822
}
},
"source": {
"address": {
"Address": {
"SocketAddress": {
"PortSpecifier": {
"PortValue": 34670
},
"address": "10.244.3.164"
}
}
}
}
},
"parsed_path": [
"api"
],
"parsed_query": {}
},
"level": "info",
"msg": "Decision Log",
"query": "data.app.allow",
"type": "openpolicyagent.org/decision_logs"
}
I tried increasing with_request_body:
http_filters:
- name: envoy.ext_authz
  config:
    with_request_body:
      max_request_bytes: 819200
      allow_partial_message: true
    failure_mode_allow: false
Is there anything else I have missed?
Thanks a lot for your help.
Are there any errors in the Envoy logs?
What is the data that you are trying to send? Does it need to be part of OPA's input document, or can you leverage OPA's bundle feature?
I finally made it work by increasing max_request_bytes.
name: envoy.ext_authz
config:
  with_request_body:
    max_request_bytes: 819200
I had configured this before in the ConfigMap but forgot to restart the pod. After redeploying everything with the new max_request_bytes, it works now.
Reference: https://www.envoyproxy.io/docs/envoy/latest/api-v3/extensions/filters/http/buffer/v3/buffer.proto.html?highlight=max_request_bytes
Thank you all
I am facing the issue below when creating an Azure Machine Learning Batch Execution activity to execute a scoring ML experiment. Please help.
I am new to this, so please let me know if any other relevant information is needed.
Created an AzureML Linked Service as below:
{
"name": "PredictionAzureML",
"properties": {
"typeProperties": {
"mlEndpoint": "https://ussouthcentral.services.azureml.net/workspaces/xxxxx/jobs",
"apiKey": "xxxxxxxx=="
},
"type": "AzureML"
}
}
Created Pipeline as below:
{
"name": "pipeline1",
"properties": {
"description": "use AzureML model",
"activities": [
{
"name": "MLActivity",
"description": "description",
"type": "AzureMLBatchExecution",
"policy": {
"timeout": "02:00:00",
"retry": 1,
"retryIntervalInSeconds": 30
},
"typeProperties": {
"webServiceInput": "PredictionInputDataset",
"webServiceOutputs": {
"output1": "PredictionOutputDataset"
}
},
"inputs": [
{
"name": "PredictionInputDataset"
}
],
"outputs": [
{
"name": "PredictionOutputDataset"
}
],
"linkedServiceName": "PredictionAzureML"
}
]
}
}
Getting the below error:
{
"errorCode": "2109",
"message": "'linkedservicereference' with reference name 'PredictionAzureML' can't be found.",
"failureType": "UserError",
"target": "MLActivity"
}
I got this working in Data Factory v2, so apologies if you are using v1.
Try putting the linkedServiceName as an object in the JSON outside of the typeProperties and use the following structure:
"linkedServiceName": {
"referenceName": "PredictionAzureML",
"type": "LinkedServiceReference"
}
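Applied to the pipeline in the question, the activity would then be shaped roughly like this (a structural sketch only; the typeProperties content is elided here, and in Data Factory v2 it may also need the webServiceInputs/webServiceOutputs form shown in the fuller payload in the other answer below):
{
"name": "MLActivity",
"type": "AzureMLBatchExecution",
"linkedServiceName": {
"referenceName": "PredictionAzureML",
"type": "LinkedServiceReference"
},
"typeProperties": { ... }
}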
Hope that helps!
Please use "Trigger" instead of "Debug" in the UX. You need publish your pipeline first before click "Trigger" Button.
Please follow this doc to update your payload. It should look like the following.
{
"name": "AzureMLExecutionActivityTemplate",
"description": "description",
"type": "AzureMLBatchExecution",
"linkedServiceName": {
"referenceName": "AzureMLLinkedService",
"type": "LinkedServiceReference"
},
"typeProperties": {
"webServiceInputs": {
"<web service input name 1>": {
"LinkedServiceName":{
"referenceName": "AzureStorageLinkedService1",
"type": "LinkedServiceReference"
},
"FilePath":"path1"
},
"<web service input name 2>": {
"LinkedServiceName":{
"referenceName": "AzureStorageLinkedService1",
"type": "LinkedServiceReference"
},
"FilePath":"path2"
}
},
"webServiceOutputs": {
"<web service output name 1>": {
"LinkedServiceName":{
"referenceName": "AzureStorageLinkedService2",
"type": "LinkedServiceReference"
},
"FilePath":"path3"
},
"<web service output name 2>": {
"LinkedServiceName":{
"referenceName": "AzureStorageLinkedService2",
"type": "LinkedServiceReference"
},
"FilePath":"path4"
}
},
"globalParameters": {
"<Parameter 1 Name>": "<parameter value>",
"<parameter 2 name>": "<parameter 2 value>"
}
}
}
I've deployed the hyperledger-fabric service on Bluemix and obtained the credentials from there; one line looks like this:
{"enrollId":"user_type1_0","enrollSecret":"XXXXX","group":"group1","affiliation":"0001","username":"user_type1_0","secret":"XXXXX"}
I post the following to the "registrar" REST endpoint:
Secret: { "enrollId": "user_type1_0", "enrollSecret": "xxxxx" }
I get this response:
{ "OK": "Login successful for user 'user_type1_0'." }
Then I try to deploy some chaincode by POSTing the following to the chaincode REST endpoint:
QuerySpec {
"jsonrpc": "2.0",
"method": "deploy",
"params": {
"type": 1,
"chaincodeID": {
"path": "https://github.com/ibm-blockchain/learn-chaincode/finished"
},
"ctorMsg": {
"function": "init",
"args": [
"hi there"
]
},
"secureContext": "user_type1_0_xxxxx"
},
"id": 1 }
I get this response:
{ "jsonrpc": "2.0", "error": {
"code": -32000,
"message": "Registration missing",
"data": "User not logged in. Use the '/registrar' endpoint to obtain a security token." }, "id": 1 }
Any idea?
Fabric expects that you will provide the enrollment ID alone as the security context, but you are trying to use "ID+Pass".
Can you try to run your deploy command with another secureContext value?
QuerySpec { "jsonrpc": "2.0", "method": "deploy", "params": { "type": 1, "chaincodeID": { "path": "https://github.com/ibm-blockchain/learn-chaincode/finished" }, "ctorMsg": { "function": "init", "args": [ "hi there" ] }, "secureContext": "user_type1_0" }, "id": 1 }
I've set up a resource in API Gateway with an API key and a mock integration. When I test in the console, I can see the canned JSON response which I set up in the integration response.
However, when I test externally using Postman, I can see the expected status code in the response (201), but the response body is empty.
Would anyone be able to shed some light on why this might be?
Many thanks
Ben
Just to make sure: did you deploy the latest version of your API? I used the sample API below, which worked fine for me. Could you try this one?
{
"swagger": "2.0",
"info": {
"version": "2016-04-19T19:54:16Z",
"title": "my-mock-api"
},
"basePath": "/prod",
"schemes": [
"https"
],
"paths": {
"/": {
"get": {
"consumes": [
"application/json"
],
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "200 response",
"schema": {
"$ref": "#/definitions/Empty"
}
}
},
"x-amazon-apigateway-integration": {
"responses": {
"default": {
"statusCode": "200",
"responseTemplates": {
"application/json": "{\"test\":\"test\"}"
}
}
},
"requestTemplates": {
"application/json": "{\"statusCode\": 200}"
},
"type": "mock"
}
}
}
},
"definitions": {
"Empty": {
"type": "object"
}
}
}
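Once that API is deployed to the prod stage, a GET on the stage root should return the body defined in the x-amazon-apigateway-integration response template above, i.e.:
{
"test": "test"
}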