503 Service Unavailable in Salesforce - REST

My goal is to create a REST API integration from Salesforce to an SAP application.
SUCCESS through the Chrome app
1. All I need to do is retrieve values from the SAP application through the REST API. When I used the Chrome app 'Advanced REST Client' and passed the appropriate URL and content with the POST method, I was able to retrieve the values from the local server database.
For example: if I pass the request 92126, I get the response 'SAN DIEGO', which is correct.
Here is the link (https://chrome.google.com/webstore/detail/advanced-rest-client/hgmloofddffdnphfgcellkdfbfbjeloo?hl=en-US) for Advanced REST Client.
PROBLEM from Salesforce:
I have created a Remote Site Setting.
When I created this REST class in Salesforce and tried invoking the endpoint, it throws this error:
System.HttpResponse[Status=Service Unavailable, StatusCode=503]
The web API URL provided to us points to a local SQL Server instance, i.e. it is hosted privately, and as we know, for callouts from Salesforce the URL must be publicly reachable. However, the URL is kept private for security reasons and is not hosted publicly. Is there any way to achieve this? What changes should be made in Salesforce or on the server so that the two can communicate and the callout can be made?

It is most likely that your endpoint does not allow access from outside a certain IP range, which you indicated by saying it's not public. Salesforce is a SaaS application hosted outside the domain that your service is on. In order for Salesforce to access that endpoint resource you need to whitelist the Salesforce IP ranges, which can be found here.
Whitelisting allows Salesforce to access the resource. The only caveat is that, because Salesforce is multi-tenant, any instance of Salesforce in the range that you whitelist would have access to your endpoint. If this is not acceptable, you might want to add some sort of header to the call, or sign the request, so that your Salesforce instance is identified uniquely from any other instance and you can validate that the call originated from your Salesforce org.
(I am linking to the article instead of pasting the IP ranges here because these may change in the future).
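If you go the header route, the check on the endpoint side could look roughly like the sketch below. This assumes a Python listener and a hypothetical X-Org-Secret header that your Apex callout would add to every request; neither the header name nor the helper is anything Salesforce defines.
import hmac

# Assumption: the same value is configured on the Apex callout as a request header.
SHARED_SECRET = "replace-with-a-long-random-value"

def is_from_my_org(headers) -> bool:
    # Constant-time comparison of the hypothetical X-Org-Secret header
    # added by the Salesforce callout.
    return hmac.compare_digest(headers.get("X-Org-Secret", ""), SHARED_SECRET)
The listener would run this check before touching the database and answer 403 when it fails, so only callers that know the secret get data back.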

Related

Using a Web activity along with a linked service to call a REST API

I have to send data to a REST API via Data Factory.
I get batches of records from a database and send them in the body of the REST API call, which returns a response indicating the action performed on each record.
I created a linked service to the base API URL, and this linked service does the authentication to the API.
My question is: how do I use this linked service along with a Web activity in a pipeline?
The web activity requires me to enter a full URL, which feels redundant as the base URL is already in the linked service.
The Web activity does let me add multiple linked services, but I'm unsure why it allows multiple linked services and how this is supposed to work.
I would appreciate expertise regarding how the web activity works with a linked service.
Thanks!
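To make the flow being described concrete, outside of Data Factory it is roughly the following; the base URL and the /records path are placeholders, and this sketch says nothing about how the Web activity itself resolves the linked service.
import requests

API_BASE = "https://api.example.com"  # placeholder for the base URL held in the linked service

def send_batch(batch):
    # One POST per batch of records; the response body reports the action taken per record.
    response = requests.post(API_BASE + "/records", json=batch, timeout=30)
    response.raise_for_status()
    return response.json()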

Transfer SCP Cloud User Information securely to ABAP

I have a UI5 application hosted on SAP NEO which retrieves data via an ABAP OData service.
Users are maintained in SAP Cloud Identity and mapped to their backend users, and the login is set up via Principal Propagation. This is all set up and works.
I have been asked to change the application so that external sales representatives without SAP backend users can use it.
The idea is to use one "technical user" with Basic Authentication instead of Principal Propagation.
My question is: what would be the way to identify the original cloud username in ABAP (since sy-uname there would be the technical user)?
Debugging in ABAP didn't reveal the original information, and I am afraid the original user is not even passed to the Gateway.
The SAP Cloud User API (https://help.sap.com/viewer/65de2977205c403bbc107264b8eccf4b/Cloud/en-US/1de599bf722446849d2b2e10132df42a.html) is not an option because the request could be manipulated in the browser.
I have heard of another option using a Java servlet, but I am afraid that would mean setting up the whole OData service there again, and with every change in the Gateway we would have to adjust the Java servlet as well. Or is there maybe a proxy?
If you are using Mobile Services of SAP Cloud Platform, you can activate a header with the username to be transferred to your ABAP system. It's called X-SMP-ENDUSERNAME.
Ref the documentation at https://help.sap.com/viewer/38dbd9fbb49240f3b4d954e92335e670/Cloud/en-US/defdadb71ee2476691d987689e3703a2.html
I assume you can get the cloud user ID within your UI5 application, and if you access the backend via an OData model you can use the ODataModel.setHeaders function to provide custom request headers, which will be attached to every request sent to the backend. I would try to send the cloud user ID in some custom header value.
On the ABAP side you can use the DP facade interface in the service implementation to read the custom headers:
" Cast the data provider facade to read the inbound HTTP request headers,
" which include any custom header set via ODataModel.setHeaders.
lo_facade ?= /iwbep/if_mgw_conv_srv_runtime~get_dp_facade( ).
lt_client_headers = lo_facade->get_request_header( ).

Can I hit a non-public URL from Dialogflow through a webhook?

I want to use Dialogflow for enterprise usage, so I want to know whether Dialogflow will be able to hit non-public URLs.
Since Dialogflow is a service hosted by Google, fulfillment requests specified by webhook URLs must be reachable by Dialogflow for them to be invoked. In addition, the webhook endpoints must expose themselves using SSL/TLS and must be associated with a non-self-signed certificate. When Dialogflow makes a request, it can provide authentication credentials so that you can ensure it is indeed Dialogflow making the request.
One pattern for your usage is to expose the Webhooks to the Internet and only allow connections from the Google IP address range and also require authentication (known only to Dialogflow). This would go a long way in preventing malicious access to your Webhook.
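As a rough illustration of that pattern, a webhook that insists on Basic Authentication could be sketched like this; the route, user name, and password are assumptions, not values Dialogflow prescribes.
import hmac
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
EXPECTED_USER = "dialogflow"                  # assumption: matches the credentials configured for fulfillment
EXPECTED_PASSWORD = "a-long-random-password"  # assumption

@app.route("/fulfillment", methods=["POST"])
def fulfillment():
    # Reject requests that do not carry the expected Basic Auth credentials.
    auth = request.authorization
    if auth is None or auth.username != EXPECTED_USER \
            or not hmac.compare_digest(auth.password or "", EXPECTED_PASSWORD):
        abort(401)
    # ... look up the answer in the internal system ...
    return jsonify({"fulfillmentText": "reply built from the internal system"})
The allowlist for the Google IP address range would sit in front of this, for example at the firewall or load balancer.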
An alternative would be to define your webhook as a GCP-hosted endpoint and then own the routing back to your internal system from there. That could use a variety of technologies beyond HTTP, including Pub/Sub. For example, when Dialogflow invokes the webhook, a GCP application could be called that posts a message to Pub/Sub. Your enterprise application could be a subscriber and be notified that it has work to do. It does the work and responds with a new message, which is received by your GCP-hosted webhook, which then returns the response to Dialogflow. This way there is no surface area for an attacker to try to penetrate.
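For the Pub/Sub variant, the publishing leg of that relay could look roughly like the sketch below; the project and topic names are placeholders, and the reply leg (the subscriber answering on a second topic) is only indicated in the comments.
import json
from google.cloud import pubsub_v1  # assumption: the google-cloud-pubsub client library is available

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "dialogflow-requests")  # placeholder project and topic

def relay_to_enterprise(fulfillment_request):
    # Publish the Dialogflow fulfillment request for the enterprise subscriber to pick up.
    data = json.dumps(fulfillment_request).encode("utf-8")
    publisher.publish(topic_path, data=data).result()
    # The enterprise application subscribes to this topic, does the work, and publishes
    # its answer to a reply topic that the GCP-hosted webhook reads before answering Dialogflow.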

Integrating back end to front end

Our organization has a data collection on its servers. A SOAP API has been implemented, and the data can be accessed using the WSDL in SoapUI. I am a front-end developer, and when I make a POST request using XMLHttpRequest to get the query result, it throws a CORS error: "Response to the preflight request doesn't pass access control". It is NOT possible to enable CORS on the data collection servers. I am using Liferay for the website front end and the back end.
Any suggestions on how I can get the query results from the front end without enabling CORS on the database servers (these are different from the Liferay backend server)? Or can I use a website backend to interact with the database? Or use a third-party service like Kinvey?
I have had similar issues in the past. Like you, I wanted to create a basic webpage on my machine that contained some JavaScript to call an API. With this approach, I got the CORS issue you are seeing.
I then hosted my page on a web server and I still got the CORS issue.
To resolve it, I had to create a web app, which I wrote in Java. This back end contained its own API. One of the resources in 'my' API was a simple wrapper to call the API of interest. I then modified the webpage I wrote (now all hosted in the same web app) to call my API, which in turn calls the API of interest.
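The answer above used Java, but the same wrapper idea can be sketched in Python; the route, the SOAP endpoint URL, and the content type are placeholders for whatever your WSDL actually requires.
import requests
from flask import Flask, Response, request

app = Flask(__name__)
SOAP_ENDPOINT = "https://internal-data-server.example.com/service"  # placeholder internal SOAP endpoint

@app.route("/api/query", methods=["POST"])
def query():
    # Server-to-server call: the browser only ever talks to this backend,
    # so the SOAP server never needs to allow CORS.
    upstream = requests.post(
        SOAP_ENDPOINT,
        data=request.data,  # SOAP envelope forwarded from (or built for) the front end
        headers={"Content-Type": "text/xml; charset=utf-8"},
        timeout=30,
    )
    return Response(upstream.content, status=upstream.status_code, mimetype="text/xml")
The front end then calls /api/query on the same origin, so no CORS headers are needed anywhere.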

Basic Auth for EnvelopeDefinition:EventNotification API types

I am using the DocuSign REST API to send documents and we are trying to take advantage of the webhook capabilities.
Our problem is that we have Basic Auth set up on our webhook listener, which does not appear to be supported in the REST API. I am enquiring whether this is an accurate understanding. I know it is possible with the Connect product, where Basic Auth can be set up via the portal; however, I am looking specifically for REST API calls.
How can I secure the EventNotification event?
Many thanks,
Mitch.
As of now, according to the official DocuSign post:
Individual Envelope Connect configurations created with the eventNotifications API option do not support Basic Authentication at this time.
A WORKAROUND SOLUTION:
DocuSign has an official blog post that answers this question: Securing Your Connect Webhook Listener.
For us, we use the 'Use a Pre-shared Secret' method to secure our webhook. We are using the Python SDK. When we create the EventNotification, we add a secret parameter to the webhook listener URL. When DocuSign posts the request to your listener server, it will include the secret. You can check the parameter to make sure the request is valid.
# Append the pre-shared secret as a query parameter on the listener URL
event_notification = EventNotification()
event_notification.url = 'your_webhook_url' + '?secret=' + 'your_secret'
The text below is quoted from the blog post.
This defense acts both as access control and authentication. The listener URL you provide to DocuSign can include one or more query parameters. DocuSign will include them during its POST request to your listener.
For access control, your listener will first check that the request includes the expected query parameter and reject all requests that don’t. For authentication, your listener will additionally check the value of the query parameter. Remember that you can encode any values for the name and value of the query parameter. For this example, we’re using “pw” as the name of the query parameter.
To use a pre-shared secret, just set the URL accordingly in the Connect configuration. For example,
https://listener_url.example.com/listener?pw=secret
Remember that the complete URL, including its query parameters, is encrypted before it is sent across the internet. The URL and its query parameters are visible in various logs and configuration screens, including the Connect webhook configuration page.
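On the listener side, the check described in the quote could be sketched roughly as follows; the Flask route is illustrative, and the parameter name matches the 'secret' used in the snippet earlier in this answer.
import hmac
from flask import Flask, abort, request

app = Flask(__name__)
EXPECTED_SECRET = "your_secret"  # the same value appended to event_notification.url above

@app.route("/docusign-webhook", methods=["POST"])
def docusign_webhook():
    # Access control and authentication: reject any request that does not carry
    # the pre-shared secret query parameter with the expected value.
    if not hmac.compare_digest(request.args.get("secret", ""), EXPECTED_SECRET):
        abort(401)
    # ... process the DocuSign Connect payload here ...
    return "", 200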