Submitting concurrent HTTP requests with RestSharp

I have a RESTful API running as a self-hosted OWIN application, and a client that makes API calls to the server. Both are built with .NET/Web API/RestSharp. In the normal scenario the client sends an HTTP request, receives an HTTP response, and everything works as expected. However, there are cases when the client needs to send another request before receiving the response to the previous one. Something like:
HTTP POST Request1 <-- this is a long running operation on the server side
HTTP POST Request2 <-- this comes from a different thread
HTTP POST Response2
HTTP POST Response1
The problem is that Request2 is not sent until Response1 arrives. These requests are blocking (synchronous) calls and I cannot make them asynchronous.
I tried setting System.Net.ServicePointManager.DefaultConnectionLimit to 20 before initializing my RestClient, but that didn't help.
System.Net.ServicePointManager.DefaultConnectionLimit = 20;
var Client = new RestClient("https://someURL");
Any idea what the problem might be, and what can I do to send those requests concurrently?

This turned out to be a thread-synchronization issue on my server; I was dealing with a deadlock.

Using RestClient.ExecuteAsync() instead of .Execute() may solve the problem.
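The deadlock aside, the concurrency the question asks for is something the client stack normally provides once nothing serializes the requests. As an illustration in Java rather than C# (the endpoints, payloads and delay below are made up), the JDK's built-in HttpClient can dispatch two POSTs concurrently, and the fast one completes first even though the slow one was sent first:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ConcurrentRequests {

    // Fire a slow and a fast POST concurrently; return bodies in completion order.
    public static List<String> run() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/slow", ex -> {
            // Simulate a long-running server-side operation.
            try { Thread.sleep(500); } catch (InterruptedException ignored) { }
            byte[] body = "slow".getBytes();
            ex.sendResponseHeaders(200, body.length);
            ex.getResponseBody().write(body);
            ex.close();
        });
        server.createContext("/fast", ex -> {
            byte[] body = "fast".getBytes();
            ex.sendResponseHeaders(200, body.length);
            ex.getResponseBody().write(body);
            ex.close();
        });
        // A multi-threaded executor so the slow handler can't block the fast one.
        ExecutorService pool = Executors.newFixedThreadPool(2);
        server.setExecutor(pool);
        server.start();
        int port = server.getAddress().getPort();

        HttpClient client = HttpClient.newHttpClient();
        List<String> order = new ArrayList<>();
        // Both requests are in flight at the same time; neither blocks the other.
        CompletableFuture<Void> slow = post(client, port, "/slow")
                .thenAccept(b -> { synchronized (order) { order.add(b); } });
        CompletableFuture<Void> fast = post(client, port, "/fast")
                .thenAccept(b -> { synchronized (order) { order.add(b); } });
        CompletableFuture.allOf(slow, fast).join();
        server.stop(0);
        pool.shutdownNow();
        return order;
    }

    private static CompletableFuture<String> post(HttpClient c, int port, String path) {
        HttpRequest req = HttpRequest
                .newBuilder(URI.create("http://localhost:" + port + path))
                .POST(HttpRequest.BodyPublishers.ofString("payload"))
                .build();
        return c.sendAsync(req, HttpResponse.BodyHandlers.ofString())
                .thenApply(HttpResponse::body);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run()); // fast completes before slow
    }
}
```

If the fast response never overtakes the slow one in a setup like this, the serialization is happening somewhere else, e.g. a per-host connection limit or, as in the accepted answer, a lock on the server.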

Related

Apache httpclient 4.5.2 not sending POST data on retry via ServiceUnavailableRetryStrategy

I have set up the HttpClient with a ServiceUnavailableRetryStrategy. I am making a POST call to the server with a JSON payload in an InputStreamEntity.
When the server returns "503 Service Unavailable", the retry strategy kicks in and makes another request after a delay read from the Retry-After header.
I have enabled the logs, and what I observe is that on the first request the JSON payload is sent properly, but on the retry request no payload is sent to the server and I receive a "400 Bad Request".
I wonder why the payload is not sent on the retry. In my implementation of ServiceUnavailableRetryStrategy I am not touching the HttpResponse or HttpContext. Is it because once the content is read from an InputStreamEntity, the entity loses its data?
Is there any way to prevent this from happening? Any help or suggestions would be appreciated.
Thanks
The issue was actually with InputStreamEntity, which is a non-repeatable entity: its content cannot be read more than once, so on retry the original entity content was gone.
Switching to a repeatable entity such as ByteArrayEntity solved the issue.
https://hc.apache.org/httpcomponents-client-4.5.x/current/tutorial/html/fundamentals.html#d5e119
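The repeatable/non-repeatable distinction comes down to stream position, which a plain-JDK sketch (no HttpClient involved; the payload is made up) can demonstrate: a stream that has already been drained yields nothing on a retry, while a byte[] can hand out a fresh stream per attempt, which is essentially what ByteArrayEntity does.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class RepeatableEntity {

    // Drain a stream the way a client drains an entity while sending the body.
    static String drain(InputStream in) throws IOException {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = "{\"id\":1}".getBytes(StandardCharsets.UTF_8);

        // Non-repeatable (InputStreamEntity-style): one underlying stream.
        InputStream entity = new ByteArrayInputStream(payload);
        System.out.println(drain(entity)); // first attempt sends the payload
        System.out.println(drain(entity)); // the retry sends an empty body

        // Repeatable (ByteArrayEntity-style): a fresh stream per attempt.
        System.out.println(drain(new ByteArrayInputStream(payload)));
        System.out.println(drain(new ByteArrayInputStream(payload)));
    }
}
```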

Vert.x GET api returns 400 Bad Request for some traffic

We have a Vert.x application hosting a few HTTP GET and POST APIs. While testing in production we are facing an issue where one GET API responds to the client with 400 Bad Request.
This API works most of the time, but for a few clients it returns 400 Bad Request.
We have verified that the client is sending the request correctly but still gets a 400 in response.
On the server where the application is running, however, we could not find any logs: a log statement on the first line of the handler is never printed in the 400 case, although it is printed for every successful request.
After a lot of debugging, we identified that the requests do reach the Vert.x application, but the router rejects them with 400 Bad Request.
By default Vert.x accepts requests with a total header size of up to 8 KB. In some cases the cookies in the headers exceeded 8 KB, hence the error.
The header size limit can be overridden in HttpServerOptions; after that the issue was resolved.
vertx.createHttpServer(new HttpServerOptions()
        .setSsl(true)
        .setMaxHeaderSize(32 * 1024)
        .setTcpKeepAlive(true))
    .requestHandler(router::accept)
    .listen(config().getJsonObject("http").getInteger("port"), handler -> {
        futureHttpServer.complete();
        Logger.debug("httpservice deployed");
    });

In Vertx webclient how to log http request and response

What is the simplest way to log outgoing HTTP requests and responses with the Vert.x WebClient? I'm looking for something similar to the HTTP server's LoggerHandler, but for the WebClient's outgoing requests.
WebClient has an overloaded create(Vertx, WebClientOptions) method, and WebClientOptions has a setLogActivity() method that takes a boolean indicating whether network activity should be logged.
(Disclaimer: I haven't tried this myself, so I can't vouch for what's actually logged, but see if that covers your needs.)
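Putting those two pieces together, the wiring would look roughly like this (untested, per the disclaimer above; treat it as a sketch of the Vert.x 3.x API):

```java
// Sketch: enable network-activity logging on the WebClient (Vert.x 3.x).
WebClient client = WebClient.create(vertx,
        new WebClientOptions().setLogActivity(true));
```

Note that logActivity typically produces Netty wire-level logging rather than a formatted request/response log, so the output is closer to a packet trace than to LoggerHandler's access-log lines.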

HTTP status code for an effectless request

I'm designing a small RESTful API for a media player and encoder that lets you start, pause and stop a stream or recording.
Let's assume the service is idle, i.e. there is no encoding activity. Now the client sends a request to the service like
POST media.box/api/stream
action=stop
This obviously has no effect on the server side, but the client should be notified that there is something wrong with the request.
What HTTP status code is the most suitable for this case?
If you feel that this is an error condition, consider returning 422 (Unprocessable Entity). It indicates that the request was received, but a semantic error in the request prevented it from being executed.
The other school of thought is that no-op requests like "stop everything!" when nothing is running should just say "Ok! Nothing is running anymore." and return gracefully. You'll have to decide which is right for your API.
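The two schools of thought can be sketched side by side (the method names are made up for illustration): a strict handler that rejects a no-op stop, and an idempotent one that reports success because the desired state already holds.

```java
public class StopSemantics {

    // Strict: stopping an idle encoder is a semantic error -> 422.
    static int strictStopStatus(boolean streamActive) {
        return streamActive ? 200 : 422;
    }

    // Idempotent: "stop" always converges on the same state -> 200,
    // regardless of whether anything was running.
    static int idempotentStopStatus(boolean streamActive) {
        return 200;
    }

    public static void main(String[] args) {
        System.out.println(strictStopStatus(false));     // 422
        System.out.println(idempotentStopStatus(false)); // 200
    }
}
```

The idempotent variant is friendlier to clients that retry, since a repeated "stop" never turns into an error.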

POST from WinForms app using HttpWebRequest to webservice doesn't work when sent through Fiddler

I'm using HttpWebRequest in a VB.Net WinForms app to get data from an in-house web service. The code works for both GET and POST when Fiddler is not running. With Fiddler running, GETs work and are captured, but a POST never completes: Fiddler captures the initial request but never sees the response, and the application doesn't get a response either.
The code builds an HttpWebRequest for the POST, sets the appropriate properties, encodes the data to be sent as JSON and then does this:
Using postStream As Stream = webrequestobj.GetRequestStream()
postStream.Write(WebServiceByteData, 0, WebServiceByteData.Length)
End Using
I used WireShark to capture the generated network packets and noticed that when a POST is sent without going through Fiddler the following happens.
When "postStream As Stream = webrequestobj.GetRequestStream()" is executed, a packet is sent with all of the header info, including an "Expect: 100-continue" header, but without the request data.
When the postStream.Write call is executed, an additional packet is sent with the request data.
With Fiddler running, nothing is put on the wire until after postStream.Write is executed. At that point both the header packet with the "Expect: 100-continue" header and the request data packet are sent back to back, before the service has responded with "100 Continue". I'm guessing this confuses the web service, since it doesn't expect to receive the request data yet, and it never responds with the requested data.
I used Composer to manually create the request without the "Expect: 100-continue" header. When this is executed the same two packets are generated and the service responds with the expected data.
So, to be able to use Fiddler to capture the POST traffic, it looks like I need either to tell HttpWebRequest not to issue the "Expect: 100-continue" header (I've looked but haven't found a way to do this), or to have Fiddler handle the packets differently, perhaps by not sending the second packet until it sees the "100 Continue" response, or by stripping out the "Expect: 100-continue" header.
It's possible that I've missed a setup option in Fiddler but nothing I've tried so far makes any difference.
Thanks,
Dave
Old question, but the short answer is that the lack of a "100 Continue" response shouldn't have mattered at all.
To learn more about Expect: Continue, including how to remove this header if you like, see http://blogs.msdn.com/b/fiddler/archive/2011/11/05/http-expect-continue-delays-transmitting-post-bodies-by-up-to-350-milliseconds.aspx
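For comparison (in .NET the header can be suppressed globally via System.Net.ServicePointManager.Expect100Continue = false), Java's built-in HttpClient makes Expect: 100-continue opt-in rather than on by default. A small local sketch (the server, path and payload are made up) showing where that switch lives and that a plain POST works without the header:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ExpectContinueDemo {

    // POST a small JSON body to a throwaway local server; return the status code.
    public static int postStatus(boolean expectContinue) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/data", ex -> {
            ex.getRequestBody().readAllBytes(); // consume the POST body
            ex.sendResponseHeaders(204, -1);    // no response body
            ex.close();
        });
        server.start();
        try {
            // expectContinue(false) is the default; true adds the Expect header
            // and makes the client wait for the interim 100 response.
            HttpClient client = HttpClient.newBuilder()
                    .expectContinue(expectContinue)
                    .build();
            HttpRequest req = HttpRequest.newBuilder(URI.create(
                        "http://localhost:" + server.getAddress().getPort() + "/data"))
                    .POST(HttpRequest.BodyPublishers.ofString("{\"k\":\"v\"}"))
                    .build();
            return client.send(req, HttpResponse.BodyHandlers.discarding())
                    .statusCode();
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(postStatus(false)); // 204
    }
}
```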