How to integrate AWS API Gateway with FIFO SQS?
Integrating API Gateway with a standard SQS queue via a mapping template is straightforward, but for a FIFO queue we also need to pass a MessageGroupId and a MessageDeduplicationId.
You can extract the MessageGroupId and MessageDeduplicationId from the request payload in the mapping template. For example, if the payload looks like the one below, you can extract any property from it. There is a clear video tutorial here: https://www.youtube.com/watch?v=dXa9KA-G9Dg
Assume the payload is like this:
{
  "data": {
    "jobNumber": "123456"
  }
}
Then the mapping template in API Gateway is below. It extracts the jobNumber from the payload and sets it as the MessageGroupId, while the MessageDeduplicationId comes from the request context.
#set($dedupId = $context.requestId)
#set($groupId = $input.json('$.data.jobNumber'))
Action=SendMessage&MessageBody=$input.body&MessageGroupId=$groupId&MessageDeduplicationId=$dedupId
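For reference, the template above builds the same SendMessage request you would issue directly from code. A minimal boto3 sketch of the equivalent call (the queue URL is a placeholder) can help when testing the FIFO queue outside API Gateway:

import json
import boto3

sqs = boto3.client("sqs")
payload = {"data": {"jobNumber": "123456"}}

sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/my-queue.fifo",  # placeholder
    MessageBody=json.dumps(payload),
    MessageGroupId=payload["data"]["jobNumber"],  # what the template extracts
    MessageDeduplicationId="some-unique-id",      # API Gateway uses $context.requestId here
)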
The original post: Unable to send message from API Gateway to FIFO SQS
When using a child object (relatedInvoices in the example below) in a REST API call in a Service connector, I get the error "Unable to parse the payload". The same API call succeeds in Postman but not in IICS. Please let me know if there is a fix for this, or another way to approach it.
Method: POST
Request body example:
{{
  "PaymentNumber": "{$PaymentNumber}",
  "PaymentDocument": "{$PaymentDocument}",
  ..
  "relatedInvoices":
  {
    "invoiceNumber": "{$invoiceNumber}",
    "invoiceAmount": "{$invoiceAmount}",
    ..
  }
}}
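As a quick sanity check outside IICS, the same structure can be built in plain Python and round-tripped to confirm it is valid JSON. The values below are hypothetical stand-ins for the {$...} substitutions, and the plural name relatedInvoices may mean the endpoint expects an array rather than a single object (an assumption worth verifying):

import json

# Hypothetical stand-ins for the {$...} substitutions
payload = {
    "PaymentNumber": "PN-001",
    "PaymentDocument": "DOC-001",
    "relatedInvoices": {  # may need to be a list instead: [{...}]
        "invoiceNumber": "INV-123",
        "invoiceAmount": "100.00",
    },
}

# Round-trip to confirm the structure parses as valid JSON
print(json.loads(json.dumps(payload)))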
I am exploring different options to connect to a REST endpoint from Azure Data Factory. I have the Python code below, which does what I am looking for, but I am not sure whether Azure Data Factory offers something out of the box to connect to the API, or a way to call custom code.
Code:
import logging

from requests_oauthlib import OAuth2Session
from oauthlib.oauth2 import BackendApplicationClient

logging.captureWarnings(True)

api_url = "https://webapi.com/api/v1/data"
client_id = 'client'
client_secret = 'secret'

# OAuth2 client-credentials ("backend application") flow: fetch a token
client = BackendApplicationClient(client_id=client_id)
oauth = OAuth2Session(client=client)
token = oauth.fetch_token(
    token_url='https://webapi.com/connect/accesstoken',
    client_id=client_id,
    client_secret=client_secret,
)

# Reuse the token in an authorized session and call the API
client = OAuth2Session(client_id, token=token)
response = client.get(api_url)
data = response.json()
When I look at the REST linked service, I don't see many authentication options.
Could you please point me to which activities to use to make OAuth2 work in Azure Data Factory?
You would have to use a Web Activity to make a POST call and obtain the authentication token before getting data from the API.
Here is an example.
First, create a Web Activity.
Select the URL that performs the authentication and returns the token.
Set Method to POST.
Create a header > Name: Content-Type, Value: application/x-www-form-urlencoded
Configure the request body for the HTTP request.
..
Format: grant_type=refresh_token&client_id={client_id}&client_secret={client_secret}&refresh_token={refresh_token}
Example: grant_type=refresh_token&client_id=HsdO3t5xxxxxxxxx0VBsbGYb&client_secret=t0_0CqU8oA5snIOKyT8gWxxxxxxxxxYhsQ-S1XfAIYaEYrpB&refresh_token={refresh_token}
The values above are just examples; replace them with your own ID and secret when you try this.
As output from this Web Activity, you receive a JSON string from which you can extract the access_token for use in the request headers of later activities (such as a REST linked service) in the pipeline, depending on your need.
You can get the access_token like below. I have assigned it to a variable for simplicity.
@activity('GetOauth2 token').output.access_token
Here is an example from the official MS docs of an OAuth authentication implementation for copying data.
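To verify the grant outside ADF before wiring up the pipeline, the same token call the Web Activity makes can be sketched in Python (the token URL and credentials are placeholders):

import requests

# Placeholders: substitute your token endpoint and app credentials
token_url = "https://webapi.com/connect/accesstoken"
form = {
    "grant_type": "refresh_token",
    "client_id": "<client_id>",
    "client_secret": "<client_secret>",
    "refresh_token": "<refresh_token>",
}

# Passing a dict via data= sends it as application/x-www-form-urlencoded,
# matching the Content-Type header configured on the Web Activity
resp = requests.post(token_url, data=form)
resp.raise_for_status()
access_token = resp.json()["access_token"]

# The token is then sent as a Bearer header, as the REST linked service would
data = requests.get(
    "https://webapi.com/api/v1/data",
    headers={"Authorization": "Bearer " + access_token},
)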
I need to make a call to a REST API from Databricks, preferably using Scala, to get the data and persist it in Databricks. This is the first time I am doing this, and I need help. Can any of you please walk me through, step by step, how to achieve this? The API team has already created a service principal and given it access to the API, so authentication needs to be done through the SPN.
Thanks!
The REST API is not the recommended approach for ingesting data into Databricks.
Reason: the amount of data that can be uploaded by a single API call cannot exceed 1 MB.
To upload a file larger than 1 MB to DBFS, use the streaming API, which is a combination of create, addBlock, and close.
Here is an example of how to perform this action using Python.
import json
import base64
import requests
DOMAIN = '<databricks-instance>'
TOKEN = '<your-token>'
BASE_URL = 'https://%s/api/2.0/dbfs/' % DOMAIN

def dbfs_rpc(action, body):
    """A helper function to make a DBFS API request; request/response bodies are encoded/decoded as JSON."""
    response = requests.post(
        BASE_URL + action,
        headers={"Authorization": "Bearer %s" % TOKEN},
        json=body
    )
    return response.json()
# Create a handle that will be used to add blocks
handle = dbfs_rpc("create", {"path": "/temp/upload_large_file", "overwrite": "true"})['handle']

with open('/a/local/file', 'rb') as f:  # read in binary mode so base64 gets bytes
    while True:
        # A block can be at most 1 MB
        block = f.read(1 << 20)
        if not block:
            break
        data = base64.standard_b64encode(block)
        dbfs_rpc("add-block", {"handle": handle, "data": data})

# Close the handle to finish uploading
dbfs_rpc("close", {"handle": handle})
For more details, refer to the "DBFS API" documentation.
Hope this helps.
The code above will work; in case you want to upload a jar or other non-ASCII file, instead of
dbfs_rpc("add-block", {"handle": handle, "data": data})
use
dbfs_rpc("add-block", {"handle": handle, "data": data.decode('UTF8')})
The rest of the details are the same.
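Coming back to the original question of calling a REST API from a notebook with the service principal: below is a minimal sketch, shown in Python for brevity (the same flow translates to Scala). The tenant ID, client credentials, scope, and API URL are all placeholders, and it assumes an Azure AD client-credentials flow and that the API returns a JSON array of flat records.

import requests

TENANT_ID = "<tenant-id>"            # placeholder
CLIENT_ID = "<spn-client-id>"        # placeholder
CLIENT_SECRET = "<spn-secret>"       # placeholder
API_URL = "https://<your-api>/data"  # placeholder

# Acquire a bearer token for the service principal (client-credentials flow)
token_resp = requests.post(
    "https://login.microsoftonline.com/%s/oauth2/v2.0/token" % TENANT_ID,
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "api://<api-app-id>/.default",  # placeholder scope
    },
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Call the API with the token and persist the result from the notebook
rows = requests.get(API_URL, headers={"Authorization": "Bearer " + access_token}).json()
df = spark.createDataFrame(rows)  # `spark` is predefined in a Databricks notebook
df.write.mode("overwrite").saveAsTable("raw_api_data")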
We want to use API Blueprint together with a schema. Say we want to specify that a PUT to a resource accepts an Account in the payload, and a GET on the same resource returns an Account payload. So I need to specify that Account is used in GET and PUT, and I need to specify the Account itself. I do not know where to specify it; what is the canonical way? Unfortunately, I was not able to find it in the examples.
Reusing one message payload in multiple actions is where you can utilize the concept of a resource model.
Simply define an Account model and then reuse it later, like so:
# Account [/account]

+ Model (application/json)

    + Body

            { ... }

    + Schema

            { ... }

## Retrieve an Account [GET]

+ Response 200

    [Account][]

## Update an Account [PUT]

+ Request

    [Account][]

+ Response 204
We are currently using ServiceStack for our web API, which is 99% REST/JSON; however, we have one new message that needs a SOAP endpoint. (The client is an older BizTalk server, and SOAP 1.2 is required.) We want to prevent SOAP on everything but this one message. Is there a way to enable the SOAP feature for a single message or service? At what scopes can SOAP be specified?
In our AppHost Configure() we removed the features we want to disallow with EnableFeatures = Feature.All.Remove( Feature.Csv | Feature.Html | Feature.Jsv | Feature.Soap ); however, this removes our ability to expose the SOAP 1.2 WSDL. On the flip side, not removing Feature.Soap enables SOAP for all POST messages, which we need to prevent.
Any help is appreciated.
I have found the answer to my question.
ServiceStack provides a way to specify which endpoint channels are available, and to whom, using the ServiceStack.ServiceHost.RestrictAttribute (https://github.com/ServiceStack/ServiceStack/wiki/Security). To limit the visibility of my message to only SOAP 1.2:
//Request DTO
[DataContract(Namespace = Namespaces.Messages.NS_2013_01)]
[Restrict(EndpointAttributes.Soap12)]
public class Hello
{
    [DataMember] // required for DataContract serialization over SOAP
    public string Name { get; set; }
}
The visibility of the message in my metadata is now restricted appropriately.