Using AWS API Gateway, I would like to build an API that handles only POST requests, receives a URL (e.g. https://something.execute-api.eu-central-1.amazonaws.com/apidomain?url=google.com), forwards that URL to an Integromat webhook, and returns a "success" JSON to the user.
Any advice on how to start without building complicated Lambda functions?
I have to send data to a REST API via Data Factory.
I get batches of records from a database and send them in the body of the REST API call; the response indicates the action performed on each record.
I created a linked service to the base API URL, and this linked service handles the authentication to the API.
My question is: how do I use this linked service together with a Web activity in a pipeline?
The Web activity requires me to enter a full URL, which feels redundant since the base URL is already in the linked service.
The Web activity does let me add multiple linked services, but I'm unsure why it allows multiple linked services and how that is supposed to work.
I would appreciate expertise regarding how the Web activity works with a linked service.
Thanks!
I'm working with a service that will forward data to a URL of your choosing via HTTP POST requests.
Is there a simple way to publish to a Pub/Sub topic with a POST? The service I'm using (Hologram.io's Advanced Webhook Builder) can't store any files, so I can't upload a Google Cloud service account JSON key file.
Thanks,
Ryan
You have two challenges in your use case:
Format
Authentication
Format
You need to customize the webhook call to comply with the Pub/Sub message format. Some webhooks are customizable enough for that, but not all of them. If you can't customize the webhook call the way Pub/Sub expects, you need to use an intermediary layer (Cloud Functions or Cloud Run, for example).
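For illustration, here is a minimal sketch of such an intermediary layer as a Python HTTP Cloud Function; the topic name "webhook-events" and the GCP_PROJECT environment variable are assumptions for the example, not fixed names:

```python
import os

from google.cloud import pubsub_v1

# Assumed names: adjust the project env var and topic to your setup.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(os.environ["GCP_PROJECT"], "webhook-events")

def webhook_to_pubsub(request):
    # Forward the raw webhook body as the Pub/Sub message payload.
    future = publisher.publish(topic_path, data=request.get_data())
    future.result()  # block until Pub/Sub acknowledges the publish
    return ("", 204)
```

This is also where you would reshape the body if the webhook itself can't be made to match what Pub/Sub expects.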
Authentication
Whether you go directly to Pub/Sub or through an intermediary layer, the situation is the same: the requester (the webhook) needs to be authenticated and authorized to access the Google Cloud service.
One bad, though possible, practice is to authorize allUsers to access your resource. Here is an example with a Pub/Sub topic:
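For instance, with the google-cloud-pubsub Python client (project and topic names are placeholders), granting the public publish rights looks like this:

```python
from google.cloud import pubsub_v1

client = pubsub_v1.PublisherClient()
topic_path = client.topic_path("my-project", "my-topic")

# Grant *everyone on the internet* permission to publish to the topic.
policy = client.get_iam_policy(request={"resource": topic_path})
policy.bindings.add(role="roles/pubsub.publisher", members=["allUsers"])
client.set_iam_policy(request={"resource": topic_path, "policy": policy})
```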
Don't do that. Even if you harden your own processing by defining a schema (and thus rejecting all messages that don't comply with it), leaving a resource publicly accessible on the wild internet, without authentication, is criminal!
In the webhook context (I had this case previously at my company), I recommend using static authentication (a long-lived authentication header, not a short-lived (1h) one such as a Google OAuth2 token); an API key, for example. It's not perfect: in case of an API key leak, bad actors will be able to use the breach for a long time (rotate your API keys as soon as you can!), but it's safer than nothing!
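A hedged sketch of such a check in the intermediary layer; the header name "x-api-key" and the WEBHOOK_API_KEY environment variable are assumptions for the example:

```python
import hmac
import os

def is_authorized(request) -> bool:
    # Compare the presented key to the configured secret in constant time.
    provided = request.headers.get("x-api-key", "")
    expected = os.environ.get("WEBHOOK_API_KEY", "")
    return bool(expected) and hmac.compare_digest(provided, expected)
```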
I wrote a fairly old article on this use case (with ESPv2 and Cloud Run), but the principle, and the configuration, are almost the same with API Gateway, a Google Cloud managed service. In the article I create a proxy for Cloud Run, Cloud Functions, and App Engine, but you can do the same thing with Pub/Sub by setting the correct target URL.
I want to use Dialogflow for my enterprise. I would like to know whether Dialogflow will be able to hit non-public URLs?
Since Dialogflow is a service hosted by Google, the fulfillment endpoints specified by webhook URLs must be reachable by Dialogflow for them to be invoked. In addition, the webhook endpoints must be exposed over SSL/TLS and must use a non-self-signed certificate. When it makes a request, Dialogflow can provide authentication credentials so you can verify that it is indeed Dialogflow making the request.
One pattern for your usage is to expose the webhooks to the Internet but only allow connections from the Google IP address range, and also require authentication (known only to Dialogflow). This goes a long way toward preventing malicious access to your webhook.
An alternative is to define your webhook as a GCP-hosted endpoint and own the routing back to your internal system from there. That routing can use a variety of technologies beyond HTTP, including Pub/Sub. For example, when Dialogflow invokes the webhook, a GCP application could be called that posts a message to Pub/Sub. Your enterprise application, as a subscriber, is notified that it has work to do; it does the work and responds with a new message, which is received by your GCP-hosted webhook and returned to Dialogflow as the response. As such, there is no surface area for an attacker to try to penetrate.
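A rough sketch of the GCP side of that pattern as a Flask webhook (project and topic names are placeholders; for brevity it acknowledges immediately instead of waiting for the subscriber's reply on a response topic, as the full pattern above describes):

```python
import json

from flask import Flask, jsonify, request
from google.cloud import pubsub_v1

app = Flask(__name__)
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "dialogflow-work")

@app.route("/webhook", methods=["POST"])
def webhook():
    # Hand the fulfillment request to the enterprise subscriber via Pub/Sub.
    publisher.publish(topic_path, data=json.dumps(request.get_json()).encode())
    # Simplified: reply right away rather than awaiting the response message.
    return jsonify({"fulfillmentText": "Your request is being processed."})
```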
My goal is to create a REST API integration from Salesforce to an SAP application.
SUCCESS through Chrome app
1. All I need to do is retrieve values from the SAP application through the REST API. When I tried the Chrome app 'Advanced REST Client' and passed the appropriate URL and content with the POST method, I was able to retrieve the values from the local server database.
For example: if I pass the request 92126, I get the response 'SAN DIEGO', which is correct.
Here is the link (https://chrome.google.com/webstore/detail/advanced-rest-client/hgmloofddffdnphfgcellkdfbfbjeloo?hl=en-US) for Advanced REST Client.
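For reference, what that client call does is roughly the following in Python (the endpoint path and payload shape here are placeholders, not the asker's real API):

```python
import requests

# Placeholder URL and field names: substitute the real SAP endpoint and body.
resp = requests.post("http://local-server/api/lookup", json={"zip": "92126"})
print(resp.status_code, resp.json())  # expected: 200 with 'SAN DIEGO' in the body
```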
PROBLEM from Salesforce:
I had created a remote site setting.
When I created this REST class in Salesforce and tried invoking the endpoint, it throws this error:
System.HttpResponse[Status=Service Unavailable, StatusCode=503]
The web API URL provided to us points to a local SQL Server, i.e. it is hosted privately. As we know, for Salesforce to make callouts the URLs must be public, but this URL is kept private for security reasons. Is there any way we can achieve this? What change should be made in Salesforce or on the server so they can communicate with each other and the callout can be made?
It is most likely that your endpoint does not allow access from outside some IP range, which you indicated by saying it's not public. Salesforce is a SaaS application hosted outside the domain that your service is on. In order for Salesforce to access that endpoint resource, you need to whitelist the Salesforce IP ranges, which can be found here.
Whitelisting allows Salesforce to access the resource. The only caveat is that, because Salesforce is multi-tenant, any Salesforce instance on the range you whitelist would have access to your endpoint. If this is not OK, you might want to add some sort of header, or sign the request, to identify your Salesforce instance uniquely from any other instance and validate that the call originated from your Salesforce org.
(I am linking to the article instead of pasting the IP ranges here because these may change in the future).
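A sketch of the signing idea on the receiving side, assuming you configure your Salesforce callout to send an HMAC of the body in a custom header (the header name, hash choice, and secret handling are assumptions, not a Salesforce standard):

```python
import hashlib
import hmac

# Shared only between your Salesforce org and this server; rotate regularly.
SHARED_SECRET = b"rotate-this-secret-regularly"

def call_is_from_our_org(headers: dict, body: bytes) -> bool:
    # Recompute the signature over the raw body and compare in constant time.
    expected = hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(headers.get("X-Signature", ""), expected)
```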
I'm trying to access some GET request data from a personal ngrok webpage (that I created for testing purposes) by using the AWS API Gateway.
When I use the integration request, I just can't get the data from the ngrok web page.
What I am trying to do is catch some data from my ngrok page (by using API Gateway) and then save it to DynamoDB via a Lambda function. I read the AWS docs but I can't find anything that explains this process.
Thanks for helping.
One way is to have one API Gateway method that uses a Lambda function as the integration endpoint. The Lambda function gets the data from the ngrok page and saves it to DynamoDB.
The other way is to have one API Gateway method with an HTTP integration pointing to your ngrok endpoint, and another API Gateway method with a Lambda integration pointing to a Lambda function that puts the data into DynamoDB. The caller would invoke the first method, then invoke the second method with the output of the first. So the first option is the better one.
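A minimal sketch of the first option as a Python Lambda handler; the ngrok URL and the "NgrokData" table (whose partition key is assumed to be present in the fetched payload) are placeholders:

```python
import json
import urllib.request

import boto3

# Placeholder table name: the fetched payload must contain its partition key.
table = boto3.resource("dynamodb").Table("NgrokData")

def handler(event, context):
    # Fetch the JSON payload from the ngrok-exposed page (placeholder URL).
    with urllib.request.urlopen("https://example.ngrok.io/data") as resp:
        item = json.loads(resp.read())
    # Persist the record, then report success to the API Gateway caller.
    table.put_item(Item=item)
    return {"statusCode": 200, "body": json.dumps({"status": "saved"})}
```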