Handle multiple Guzzle requests in proxy for REST API (local server crashes)

I have the following case: I have a REST API that can only be accessed with credentials. I need the frontend to make requests directly to the API to get the data. Because I don't want to hide the credentials somewhere in the frontend, I set up a proxy server which forwards my requests with Guzzle (http://docs.guzzlephp.org/en/stable/index.html) but adds the necessary authentication.
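The proxy is essentially this kind of endpoint (a simplified sketch with placeholder names and URLs, not my exact code):

<?php
// proxy.php - the frontend calls this instead of the API directly,
// e.g. proxy.php?endpoint=items
require 'vendor/autoload.php';

use GuzzleHttp\Client;

$client = new Client(['base_uri' => 'https://api.example.com/']);

// In a real proxy the requested endpoint should of course be whitelisted.
$endpoint = $_GET['endpoint'] ?? '';

$response = $client->request('GET', $endpoint, [
    // Credentials stay on the server instead of being exposed in the frontend.
    'auth' => [getenv('API_USER'), getenv('API_PASS')],
]);

header('Content-Type: application/json');
echo $response->getBody();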
Now, that worked neatly for some time, but I just added a new view where I need to fetch from one more endpoint. (So far it was 3 requests, locally on MAMP.)
Whenever I add a fourth API request (they are all executed right on page load), my local server crashes.
I assume it is linked to this topic here:
"Guzzle async requests not really async?", specifically because I make a new request for every fetch.
First: Do you think that could be the case? Could my local server really crash because of only 3 (probably simultaneous) requests?
Second: How could I approach this problem?
I don't really see the possibility to group the requests, because they just come in to the proxy URL, and every call of the proxy URL creates a new Guzzle client with its own request...
(I mean, how many things can a simple PHP server execute at the same time? And why would it not just add requests to the call stack and execute them in order?)
Thanks for any help on this issue.

Related

Why are REST APIs considered stateless if PUT commands can update?

I am a bit confused by the terminology of REST APIs being stateless. For example, if we had a To-Do list API, and one of the endpoints was used to update or delete entries, then each request does not happen in isolation.
If I create an entry before someone else queries the total entries, then their response will depend on my response.
But PUT is seen as one of the REST verbs. Can someone help me clear up my confusion?
Stateless means that you store the client state on the client and send it with each request instead of storing it on the server. The latter is the classical server-side session, where you have a session cookie with the session id and the server stores the session data in a database or on the file system. That does not scale well for Facebook-sized applications, which is why they rather send the session data with each request. You can ensure that the session data is not modified by the client if you sign it with a private key stored on the server. So there is signature verification on each request, but it is still less expensive than maintaining session data for more than 1M users in a database and syncing it around the globe across multiple servers (which you would also need to do to avoid a single point of failure). If the data passes verification, the request can be handled by any node chosen by the load balancer without touching a database to get session data.
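In code, the signed client-side state described above could look roughly like this PHP sketch (illustrative names; HMAC is used here just as one common signing primitive):

<?php
// Sign the state the server hands out, and verify the signature on each request.
function signState(array $state, string $secret): string {
    $payload = base64_encode(json_encode($state));
    return $payload . '.' . hash_hmac('sha256', $payload, $secret);
}

function verifyState(string $token, string $secret): ?array {
    [$payload, $signature] = explode('.', $token, 2) + [null, null];
    if ($payload === null || $signature === null) {
        return null; // malformed token
    }
    if (!hash_equals(hash_hmac('sha256', $payload, $secret), $signature)) {
        return null; // the client tampered with the state
    }
    return json_decode(base64_decode($payload), true);
}

$secret = getenv('SESSION_SIGNING_KEY'); // the signing key kept on the server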
As for the part of the question related to concurrent calls, it can be solved with resource versioning. You send the current ETag of the resource and use the If-Match header with your PUT request, so the server can figure out which version your request is based on. If there is a newer version, the ETag won't match and the server will reject the request. There can be other ways to handle concurrency; it always depends on your application how you handle it.
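As a rough illustration of the If-Match idea (Guzzle, with a hypothetical to-do endpoint):

<?php
use GuzzleHttp\Client;
use GuzzleHttp\Exception\ClientException;

$client = new Client(['base_uri' => 'https://api.example.com/']);

// Read the resource and remember which version the client is editing.
$response = $client->get('todos/42');
$etag = $response->getHeaderLine('ETag');
$todo = json_decode((string) $response->getBody(), true);

$todo['done'] = true;

try {
    // The server applies the update only if the resource still has this ETag.
    $client->put('todos/42', [
        'headers' => ['If-Match' => $etag],
        'json'    => $todo,
    ]);
} catch (ClientException $e) {
    if ($e->getResponse()->getStatusCode() === 412) {
        // 412 Precondition Failed: someone else updated the resource in the
        // meantime, so re-fetch it and retry with the new ETag.
    }
}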

How to call Salesforce REST API from external web forms

I am a bit confused. The requirement is that we need to create a REST API in Salesforce (an Apex class) that has one POST method. Right now, I have been testing it with the Postman tool in 2 steps:
Making a POST request first with username, password, client_id, client_secret (which come from the connected app in Salesforce), and grant_type to receive an access token.
Then making another POST request in Postman to create a lead in Salesforce, using the access token I received before and the request body.
However, the REST API that I have in Salesforce would be called from various different web forms. So once someone fills out the web form, the backend would call this REST API in Salesforce and submit the lead request.
I am wondering how that would happen, since we can't use Postman for that.
Thanks
These "various different web forms" would have to send requests to Salesforce just like Postman does. You'd need two POST calls (one for login, one to call the service you've created). It'll be bit out of your control, you provided the SF code and proven it works, now it's for these website developers to pick it up.
What's exactly your question? There are tons of libraries to connect to SF from Java, Python, .NET, PHP... Or they could hand-craft these HTTP messages, just Google for "PHP HTTP POST" or something...
https://developer.salesforce.com/index.php?title=Getting_Started_with_the_Force.com_Toolkit_for_PHP&oldid=51397
https://github.com/developerforce/Force.com-Toolkit-for-NET
https://pypi.org/project/simple-salesforce/ / https://pypi.org/project/salesforce-python/
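If they do hand-craft it, the two calls look roughly like this in PHP with cURL (username-password OAuth flow; the Apex REST path and lead fields are placeholders):

<?php
// 1. Get an access token (username-password OAuth flow).
$ch = curl_init('https://login.salesforce.com/services/oauth2/token');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query([
    'grant_type'    => 'password',
    'client_id'     => getenv('SF_CLIENT_ID'),
    'client_secret' => getenv('SF_CLIENT_SECRET'),
    'username'      => getenv('SF_USERNAME'),
    'password'      => getenv('SF_PASSWORD'), // may need the security token appended, depending on org settings
]));
$auth = json_decode(curl_exec($ch), true);
curl_close($ch);

// 2. Call the custom Apex REST service with that token.
// '/services/apexrest/Lead' stands in for whatever urlMapping the Apex class declares.
$ch = curl_init($auth['instance_url'] . '/services/apexrest/Lead');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, [
    'Authorization: Bearer ' . $auth['access_token'],
    'Content-Type: application/json',
]);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode([
    'LastName' => 'Doe',
    'Company'  => 'Acme',
    'Email'    => 'jane.doe@example.com',
]));
$lead = curl_exec($ch);
curl_close($ch);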
Depending on how much time they'll have, they can:
cache the session id (so they don't call login every time), try to reuse it, and call login again only if the session id is blank or a "session expired or invalid" error comes back (see the sketch after this list)
try to batch it somehow (do they need to save these Leads to SF ASAP, or are, say, hourly intervals OK? How did YOU write the service, does it accept 1 lead or a list of records?)
be smart about storing the credentials to SF (in some secure way, not hardcoded), ideally in a way that makes it easy to run the integration against sandbox or production by changing just 1 config file, environment variable, or something like that
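A rough sketch of the first bullet (caching the session id), assuming a simple file cache and hypothetical login() / postLead() helpers wrapping the HTTP calls shown above:

<?php
function getToken(): array {
    $cacheFile = sys_get_temp_dir() . '/sf_token.json';
    if (is_file($cacheFile)) {
        return json_decode(file_get_contents($cacheFile), true); // reuse the cached token
    }
    $auth = login(); // hypothetical helper: the OAuth token call sketched earlier
    file_put_contents($cacheFile, json_encode($auth));
    return $auth;
}

function createLead(array $payload): string {
    [$status, $body] = postLead(getToken(), $payload); // hypothetical helper doing the HTTP POST

    if ($status === 401) { // expired or invalid session: log in again and retry once
        unlink(sys_get_temp_dir() . '/sf_token.json');
        [$status, $body] = postLead(getToken(), $payload);
    }
    return $body;
}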

Sticky Session for REST API Calls

With browser-based requests and sticky sessions enabled, the load balancer can pin requests to the same JVM out of the multiple JVMs in a cluster.
But if the requests come from a REST client rather than a browser, how can the load balancer pin them to the same JVM, even with sticky sessions enabled? Any ideas, please?
A REST client is made to call a REST API, and REST APIs should be stateless, i.e. the complete information needed to process a request should be present in the request itself, so the request should not depend on any session data.
If your API depends on session data, then it is not really following the principles of REST.
If your requirement is such that you need to maintain state, then it should be maintained on the client side, not on the server. So one way I would suggest is to use cookies to store your state and temporary data, and to attach that cookie to every REST API call you make.
You can make the cookie configurable so that it is controlled by the server and no one else can change it.
The load balancer uses cookies to keep track of sessions. Retaining the cookies and sending them back from the client should be enough to get the expected result.
For instance, in Python, that would mean replacing requests.get(url) with:
import requests

s = requests.Session()
# ... the session keeps any cookies set by the server (e.g. the sticky-session cookie) ...
s.get(url)  # and sends them back automatically on each request

REST Communication Design for Callback Mechanism

I have a use case where there is a server that can have any number of sources. Several clients can connect to this server, get the source list, and then subscribe to the server to listen for source add, update, and delete operations.
To implement this following REST principles, my idea is that the first time the client connects, the server returns the full source list along with a session id. Then, with this session id, the client polls the URL at a configured interval and listens for source updates.
The communication will look like:
Client>
GET: /Federation/Sources
Server>>
{"sessionId": "xyz", "data": {"source1": ...}}
Client>
GET: /Federation/Sources/{sessionId}
Server>>
{"sessionId": "xyz", "data": {"sourceadded": ...}}
Client>
PUT: /Federation/Sources/{sessionId}
{"data": "Received"}
This client call then updates the server so it can remove the source updates corresponding to this session id.
And then the client polling continues with the session id.
Can experts please give their feedback or comments on whether this is a good approach, or whether there is an alternative approach that better follows REST principles?
Instead of passing back ids for the client to use to build the URL, simply pass back the entire URL to the client, perhaps with more information about what the URL is for. This is the HATEOAS part of REST.
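For instance, the initial response could carry ready-to-use URLs instead of a bare session id (an illustrative shape only, not a prescribed format):

<?php
// Instead of {"sessionId": "xyz", ...}, return links the client can follow directly.
echo json_encode([
    'data'  => ['source1' => '...'],
    'links' => [
        'updates'     => 'https://example.com/Federation/Sources/xyz', // GET here to poll for changes
        'acknowledge' => 'https://example.com/Federation/Sources/xyz', // PUT here once changes are processed
    ],
]);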

How to repeatedly send a request to my HTTP server

I have an HTTP server running on my PC that I developed using C++.
I need the browser to send a request to my server repeatedly (every 1 s) in order to refresh my web page's content.
How can I do that?
Thanks for your help :))
This is the kind of thing that AJAX was designed for. Client-side scripting can send requests, for example on a timer, to update specific areas of the page's content without reloading the entire page each time.
Otherwise, look at HTTP server-side push to send new data to the client whenever it changes.
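For the server-push route, one common option is Server-Sent Events. The sketch below is PHP purely to illustrate the headers and message framing any HTTP server (including a C++ one) would need to emit; getLatestContent() is a stand-in for however the page data is produced:

<?php
// Minimal Server-Sent Events endpoint: the browser opens it once
// (new EventSource(url)) and receives an event each time a message is written.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (true) {
    echo 'data: ' . json_encode(getLatestContent()) . "\n\n"; // one SSE message
    @ob_flush();
    flush();
    sleep(1);
}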