LuaSocket custom headers: 404 turns into 301 - sockets

My previous question was about fetching a page title in Lua using the socket.http module. The question is here. Previously, YouTube pages led me to a 404 error page. Based on MattJ's help, I added a custom Host header to the request. Here is what I did and what the result was:
Code
header = { host= "youtube.com" }
local result,b,c,h = http.request{ url = "http://www.youtube.com/watch?v=_eT40eV7OiI", headers = header }
print ( result, b, c, h )
for k,v in pairs(c) do print(k,v) end
Result
1 301 table: 0047D430 HTTP/1.1 301 Moved Permanently
x-content-type-options nosniff
content-length 0
expires Tue, 27 Apr 1971 19:44:06 EST
cache-control no-cache
connection close
location http://www.youtube.com/watch?v=_eT40eV7OiI
content-type text/html; charset=utf-8
date Sat, 28 Apr 2012 04:26:21 GMT
server wiseguy/0.6.11
As far as I can tell from this, the error is basically because of the X-Content-Type-Options header with the value nosniff. Reading its documentation, I learned that its only defined value, "nosniff", prevents Internet Explorer from MIME-sniffing a response away from the declared content-type.
Please help me so that I can use a custom proxy and fetch the YouTube title (and the titles of some other sites, as mentioned in the previous question) from the page body. Here is the complete Lua file I currently have:
local http = require "socket.http"
http.PROXY="http://<proxy address here>:8080"
header = { host= "youtube.com" }
local result,b,c,h = http.request{ url = "http://www.youtube.com/watch?v=_eT40eV7OiI", headers = header }
print ( result, b, c, h )
for k,v in pairs(c) do print(k,v) end

I believe this line should be changed:
header = { host= "youtube.com" }
To:
header = { host= "www.youtube.com" }
After that change, it works for me.
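For completeness, here is a minimal sketch of how the corrected request could look end to end, including grabbing the title from the body (this is an illustration, not the original poster's code; it assumes the plain-HTTP URL still serves the page without further redirects):
local http  = require "socket.http"
local ltn12 = require "ltn12"

local chunks = {}
local ok, code = http.request{
    url     = "http://www.youtube.com/watch?v=_eT40eV7OiI",
    headers = { host = "www.youtube.com" },  -- Host now matches the host in the URL
    sink    = ltn12.sink.table(chunks),      -- collect the response body
}

if ok and code == 200 then
    local body = table.concat(chunks)
    print(body:match("<title>(.-)</title>"))
end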

The solution is to install LuaSec and use its ssl.https module to make the request.
Answered here by Paul Kulchenko!
Example:
-- LuaSec version 0.4.2
local https = require("ssl.https")
-- ssl.https.request has the same interface as socket.http.request
local body, code = https.request("https://www.youtube.com/watch?v=_eT40eV7OiI")
print(code)

Related

REST::Client | unable to parse GET response JSON object

This is the first time I am writing a Perl client to consume a REST service. I am using the REST::Client and JSON Perl modules. The web service returns data in JSON format. The problem is that when I call from_json or decode_json on $client->responseContent(), I get an error saying
"malformed JSON string, neither array, object, number, string or atom, at character offset 0 (before "HTTP/1.1 200 \r\nCon..."
The web service is, of course, a stable one and works fine with REST clients written in other languages.
After debugging the issue, I found that $client->responseContent() contains not only the JSON data but also the header information, hence from_json is unable to parse it. Below is a snippet of the code:
my $url = "/data";
my $client = REST::Client->new();
$client->setHost($host);
my $headers = {Accept => 'application/json'};
$client->GET($url, $headers);
my $response = from_json($client->responseContent());
I have not been able to figure this out for two days now :-(
Here is the dump from $client->{_res}->dump:
Fri Feb 23 09:38:35 2018: HTTP/0.9 200 EOF
Client-Date: Fri, 23 Feb 2018 09:38:35 GMT
Client-Peer: 45.32.84.105:8282
Client-Response-Num: 1
HTTP/1.1 200 \r
Content-Type: application/json;charset=UTF-8\r
Transfer-Encoding: chunked\r
Date: Fri, 23 Feb 2018 09:38:33 GMT\r
Connection: close\r
\r
2000\r
[{"REGION":"AP","REMARK":null,"STATUS":"PROD","UPDATED_TIME":null,"UPDATED_BY":null,"ROUTE_ID":1,"ROUTE_ID_VER":20150310,"USER_ROUTE_LOGIC":"|CAPTIVE|","USER_DEST":null,"USER_ORDSIZE_TYPE":null,"MIN_USER_ORDSIZE_VAL":0,"MAX_USER_ORDSIZE_VAL":100,"TAG_775":"|1|","CROSS_CURRENCY":"|Y|N|","TAG_12703":"|PB-CS|","COUNTRY":"|AU|HK|ID|IN|JP|KR|MY|SG|","TAG_12207":...
(+ 423449 more bytes not shown)
Even when the transfer encoding is not chunked, I get the same issue:
Fri Feb 23 10:40:20 2018: HTTP/1.1 200 ^M
Content-Type: application/json;charset=utf-8^M
Content-Length: 1618^M
Date: Fri, 23 Feb 2018 10:40:20 GMT^M
Connection: close^M
^M
{ "data":[ {
"REGION" : "AP",
"REMARK" : "",
"STATUS" : "PROD",
"UPDATED_TIME" : "",
"UPDATED_BY" : "",
Ultimately, I solved it with the curl command for now (actual code snippet below):
my $command = "curl '$url'";
my $rules = qx/$command/;
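Building on that workaround, the curl output can be fed straight into from_json (a sketch; it assumes $url holds the full endpoint URL, and -s merely silences curl's progress output):
my $command   = "curl -s '$url'";
my $json_text = qx/$command/;              # curl returns only the response body
my $rules     = from_json($json_text);     # parses cleanly, no header noise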
I have a similar issue that I am working through. I suspect (I have not verified yet) that the root cause is that the server is not sending the Content-Encoding response header, and therefore LWP::UserAgent and HTTP::Response do not attempt to return the decoded string. I am going to try to isolate where this decision is being made, see what options are available, and request a patch.
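One way to test that suspicion (a sketch, not from the thread; the URL is a stand-in for the real endpoint) is to fetch the resource directly with LWP::UserAgent and compare the raw body with the decoded body:
use strict;
use warnings;
use LWP::UserAgent;

my $url = 'http://example.com/data';   # stand-in for the real endpoint
my $ua  = LWP::UserAgent->new;
my $res = $ua->get($url, Accept => 'application/json');

print length($res->content), "\n";          # body exactly as received on the wire
print length($res->decoded_content), "\n";  # body after any Content-Encoding/charset decoding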
Perl REST::Client - Garbage data in response

Docker API returns 200 OK then 400 BAD REQUEST

I am writing an API client for Docker. I understood from the documentation that the API is RESTful/HTTP, yet if you connect to the local daemon you have to do it over the exposed Unix socket.
It all seems to work: I open a socket, send an HTTP request (which respects the specification), and receive the expected response, but a 400 BAD REQUEST response follows immediately.
Here is the request:
GET /info HTTP/1.1
Host: localhost
Accept: application/json
And here is what I get:
HTTP/1.1 200 OK
Api-Version: 1.30
Content-Type: application/json
Docker-Experimental: false
Ostype: linux
Server: Docker/17.06.1-ce (linux)
Date: Thu, 01 Feb 2018 18:53:18 GMT
Transfer-Encoding: chunked
892
{"ID":"6MGE:35TO:BI..." ...}
0
HTTP/1.1 400 Bad Request
Content-Type: text/plain; charset=utf-8
Connection: close
400 Bad Request
First, I figured there was a bug on my side and I was somehow sending two requests, but I enabled debugging and followed the logs with sudo journalctl -fu docker.service, and exactly one request is received... at least only one is logged, the GET /info. I've also debugged the code, and a single request is sent.
Any hint is greatly appreciated!
Edit: here is the client's code:
final StringBuilder hdrs = new StringBuilder();
for (final Map.Entry<String, String> header : headers) {
    hdrs.append(header.getKey() + ": " + header.getValue())
        .append("\r\n");
}
final String request = String.format(
    this.template(), method, home, hdrs, this.readContent(content)
);
final UnixSocketChannel channel = UnixSocketChannel.open(
    new UnixSocketAddress(this.path)
);
final PrintWriter writer = new PrintWriter(
    Channels.newOutputStream(channel)
);
writer.print(request);
writer.flush();
final InputStreamReader reader = new InputStreamReader(
    Channels.newInputStream(channel)
);
CharBuffer result = CharBuffer.allocate(1024);
reader.read(result);
result.flip();
System.out.println("read from server: " + result.toString());
It seems like you have an extra CRLF between headers and body.
private String template() {
    final StringBuilder message = new StringBuilder();
    message
        .append("%s %s HTTP/1.1\r\n")
        .append("Host: localhost").append("\r\n")
        .append("%s")
        .append("\r\n").append("\r\n") // one of these is superfluous, as each header line already ends with "\r\n"
        .append("%s");
    return message.toString();
}
Remove one append("\r\n") after headers and see what happens.
Fixed. Initially, I thought the problem was with the line endings (that they should have been \n instead of \r\n). It turns out the 400 BAD REQUEST occurred because the Connection: close header was missing, while the connection was being closed right after the response was received.
More details here.
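For reference, a minimal sketch of the raw request with the missing header added (illustrative only; the project builds the request through its template() method rather than a literal like this):
final String request =
      "GET /info HTTP/1.1\r\n"
    + "Host: localhost\r\n"
    + "Accept: application/json\r\n"
    + "Connection: close\r\n"   // tell the daemon the connection will be closed after the response
    + "\r\n";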

Malformed Session ID Cookie

Set-Cookie: SESSIONID=836cfc64b5856712b040a0b1b3bf4237; Secure; HttpOnly
Set-Cookie: Watson-DPAT=gBJQjAG%2FYflxpHKCwJVswQPEBuUmikj38zzFm8UZNTbOERxbeXS4WVxBIT5JetJBkeO1RT16PNz6%2BI17oFrEqvxjny%2FifZRorvBxXVzDmFkRpfRLxxj6ZNvFCvuRL1DtfW3nL8Ne1QDwpuKQmNt8%2BD9vFk7bGjlaziHT0ZFhNffWJT7FRCWbuJAyjKd%2BQui2WTIl6B8KglPi6GG1buh5UPDE%2Bc8OvrqyAfJRfYOApRdx7kHhtHdxIV7g%2FzNExXhafScqxi4cWEa5Kg9YGcypr8SIO%2FD7WOq0KyERHUDkbZatH53CCii25it5XD0plnt3cVc4bWs8tXkMT82V9DwCYULto64L%2BgNh30iTpyv72xOIfHeZTt2KISfhXMy6z86ueaJZzNd4nS6rwc7s2E8ldxwYLXrCU996xsLmzPYbGSzaeFLplG7c%2BCxzTlAll5fn8eMMbGn30W%2BrXLNtcaJ3lRK2nvzQCim1GhMdqoOvOcSvPWiJoVBrF8lc75eGSr8C%2Fovq20fOk3NDw4f0UPfBEGZYuAtXjonU7QdRhSgLRXxKyGvcYHEWeWUOQ2kvtI2m%2FRD%2BMRx9384p1v6uu8XfaU16IqoidV0Vew3MLPW4fxcOWRqnWKy0iIYbIJrWVcigloIy%2FNxgO7oHW7aacgH1u8IluAURz5AiE1Bej4l%2FjAI91IUTEssbg6fsXd3AqmlkixDglDJBgTEtMoXhXVyDjvJSaVUqdFTokP3YcRNhlTzqDQ3vG8txTLzECsyQHZ7DgWp%2B98P3zjvtad9xB%2BDzXhF4CaUB7ve99bWO5FO1DU3KRhx7pEAKGselDCoxTOkjIhEMMJbeQrbC1QWJ4uR9KlBPBdIbShd3; path=/speech-to-text/api; secure; HttpOnly
The request to create the Speech to Text session works.
{
    "recognize": "https://stream.watsonplatform.net/speech-to-text/api/v1/sessions/836cfc64b5856712b040a0b1b3bf4237/recognize",
    "recognizeWS": "wss://stream.watsonplatform.net/speech-to-text/api/v1/sessions/836cfc64b5856712b040a0b1b3bf4237/recognize",
    "observe_result": "https://stream.watsonplatform.net/speech-to-text/api/v1/sessions/836cfc64b5856712b040a0b1b3bf4237/observe_result",
    "session_id": "836cfc64b5856712b040a0b1b3bf4237",
    "new_session_uri": "https://stream.watsonplatform.net/speech-to-text/api/v1/sessions/836cfc64b5856712b040a0b1b3bf4237"
}
Then I try to get the status of the session to make sure the state is "initialized", but I get a "Malformed Session ID Cookie" error.
GET /speech-to-text/api/v1/sessions/836cfc64b5856712b040a0b1b3bf4237/recognize HTTP/1.1\x0d
Content-Length: 0\x0d
Accept-Encoding: gzip\x0d
Authorization: Basic OGIyMTk0MDYtYWYzYS00YTFhLWExYmMtZDA3ZjNlNTY2Y2JmOm1lYmpBaG1ndkhMSw==\x0d
Cookie: SESSIONID=836cfc64b5856712b040a0b1b3bf4237; Watson-DPAT=gBJQjAG%2FYflxpHKCwJVswQPEBuUmikj38zzFm8UZNTbOERxbeXS4WVxBIT5JetJBkeO1RT16PNz6%2BI17oFrEqvxjny%2FifZRorvBxXVzDmFkRpfRLxxj6ZNvFCvuRL1DtfW3nL8Ne1QDwpuKQmNt8%2BD9vFk7bGjlaziHT0ZFhNffWJT7FRCWbuJAyjKd%2BQui2WTIl6B8KglPi6GG1buh5UPDE%2Bc8OvrqyAfJRfYOApRdx7kHhtHdxIV7g%2FzNExXhafScqxi4cWEa5Kg9YGcypr8SIO%2FD7WOq0KyERHUDkbZatH53CCii25it5XD0plnt3cVc4bWs8tXkMT82V9DwCYULto64L%2BgNh30iTpyv72xOIfHeZTt2KISfhXMy6z86ueaJZzNd4nS6rwc7s2E8ldxwYLXrCU996xsLmzPYbGSzaeFLplG7c%2BCxzTlAll5fn8eMMbGn30W%2BrXLNtcaJ3lRK2nvzQCim1GhMdqoOvOcSvPWiJoVBrF8lc75eGSr8C%2Fovq20fOk3NDw4f0UPfBEGZYuAtXjonU7QdRhSgLRXxKyGvcYHEWeWUOQ2kvtI2m%2FRD%2BMRx9384p1v6uu8XfaU16IqoidV0Vew3MLPW4fxcOWRqnWKy0iIYbIJrWVcigloIy%2FNxgO7oHW7aacgH1u8IluAURz5AiE1Bej4l%2FjAI91IUTEssbg6fsXd3AqmlkixDglDJBgTEtMoXhXVyDjvJSaVUqdFTokP3YcRNhlTzqDQ3vG8txTLzECsyQHZ7DgWp%2B98P3zjvtad9xB%2BDzXhF4CaUB7ve99bWO5FO1DU3KRhx7pEAKGselDCoxTOkjIhEMMJbeQrbC1QWJ4uR9KlBPBdIbShd3\x0d
User-Agent: Mojolicious (Perl)\x0d
Host: stream.watsonplatform.net\x0d
HTTP/1.1 400 Bad Request\x0d
X-Backside-Transport: FAIL FAIL\x0d
Connection: Keep-Alive\x0d
Transfer-Encoding: chunked\x0d
X-Error-Cause: Zuul Error: Malformed Session ID Cookie\x0d
Content-Type: application/json\x0d
Date: Wed, 01 Jun 2016 19:41:26 GMT\x0d
Server: -\x0d
X-Global-Transaction-ID: 237895544\x0d
X-DP-Watson-Tran-ID: stream-dp01-c0182762-b9fe-4533-acab-7fbeb02b63dd\x0d
The code is using a single instance of Mojo::UserAgent, so the cookies are maintained across requests.
When using sessions, you receive a SESSIONID cookie when creating the session; you need to send that cookie back to the service on every call you make after creating the session. Please note that the value of that cookie does not equal "session_id": "836cfc64b5856712b040a0b1b3bf4237"; it is a longer alphanumeric string.
By the way, why are you using sessions? What is your use case? Maybe you could benefit from sessionless calls (simpler) or WebSockets (better for live use cases).
Dani
A trailing slash in the URL is the cause of this error. The interesting part is that the "start session" POST request with the trailing-slash URL will succeed and return the correct JSON data; the next request, to get the session status, will fail. It is not really a code problem. I also demonstrated the issue with curl.
my $ua = Mojo::UserAgent->new();
$ua->proxy->detect();
$ua->inactivity_timeout(0);

# THIS URL WORKS - no trailing slash
my $start_session_url = "https://${watson_username}:${watson_password}\#stream.watsonplatform.net/speech-to-text/api/v1/sessions";
# THIS URL DOES NOT WORK - with trailing slash
# my $start_session_url = "https://${watson_username}:${watson_password}\#stream.watsonplatform.net/speech-to-text/api/v1/sessions/";

my $session_tx = $ua->post($start_session_url);
my $response;
my $recognize_url;
if ($response = $session_tx->success) {
    print Dumper($response->json);
    $recognize_url = $response->json->{recognize};
} else {
    die "Failure to start session";
}

$recognize_url =~ s/https:\/\//https:\/\/${watson_username}:${watson_password}\#/;

# Malformed Cookie error happens here
my $status_tx = $ua->get($recognize_url);
if ($response = $status_tx->success) {
    print Dumper($response->json);
} else {
    die "Failure to get session status";
}

Angular2 Http Response missing header key/values

I'm making an http.patch call to a REST API that succeeds (status 200), but not all of the response header key/values are being returned. I'm interested in the ETag key/value.
Here is a code snippet:
let etag:number = 0;
let headers = new Headers();
headers.append('Content-Type', 'application/json');
headers.append('If-Match', String(etag));
this.http.patch(
'http://example.com:9002/api/myresource/',
JSON.stringify(dto),
{headers: headers}
)
.subscribe(
(response:Response) => {
let headers:Headers = response.headers;
let etag:String = headers.get('ETag');
console.log(etag);
}
);
When making the same call with a REST Client (Postman), the response header contains:
Content-Type: application/hal+json;charset=UTF-8
Date: Mon, 01 Feb 2016 05:21:09 GMT
ETag: "1"
Last-Modified: Mon, 01 Feb 2016 05:15:32 GMT
Server: Apache-Coyote/1.1
Transfer-Encoding: chunked
X-Application-Context: application:dev:9002
Are the missing response header key/values a bug?
Can the issue be resolved with configuration?
This isn't an Angular issue but rather a CORS one. By default, CORS only exposes six "simple" response headers: Cache-Control, Content-Language, Content-Type, Expires, Last-Modified, and Pragma.
That's why you see the full set when using a REST client such as Postman, yet when calling from your Angular client, you'll only see the set limited by CORS.
To solve this, you'll need to add an Access-Control-Expose-Headers header along the following lines:
let headers = new Headers();
headers.append('Access-Control-Expose-Headers', 'etag');
let options = new RequestOptions({ headers: headers });
return this.http.get(uri, options).map(this.extractData).catch(this.catchError);
Note that you may need to augment the server side code to support the required exposed headers.
In my case (C#), I revised the EnableCors call (within WebApiConfig) to include "ETAG" in the list of exposed headers (the fourth parameter of the EnableCorsAttribute function).
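Whatever the server stack, the end goal is for the server's HTTP response to carry a header along these lines (shown here as the raw header; how you configure it depends on your framework):
Access-Control-Expose-Headers: ETag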

How can I get both the GET and POST request params, on a POST request?

I'm creating a Facebook app with a Perl backend. The problem is that, since Facebook sends the request to my web app as a POST request, I'm having trouble getting the GET parameters that were also part of the base URL for the application -- in effect, I'm only getting the POST params from $CGI->Vars.
See CGI/MIXING POST AND URL PARAMETERS.
Short version: use $CGI->param() for POST parameters and $CGI->url_param() for query-string parameters.
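A minimal sketch of the two accessors side by side on a POST request (the parameter name foo is just an illustration):
use strict;
use warnings;
use CGI;

my $q = CGI->new;
my $post_foo  = $q->param('foo');      # 'foo' as sent in the POST body
my $query_foo = $q->url_param('foo');  # 'foo' as sent in the URL query string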
Dump CGI in favour of a better interface. Plack's param method returns GET and POST parameters mixed.
plackup -MPlack::Request -e 'sub {
    my ($env) = @_;
    my $r = Plack::Request->new($env);
    return [200, ["Content-Type" => "text/plain"], [join "\n", $r->param("foo")]];
}'
> lwp-request -m POST -USe 'http://localhost:5000/fnord?foo=bar;baz=quux'
Please enter content (application/x-www-form-urlencoded) to be POSTed:
foo=123;baz=456
␄
POST http://localhost:5000/fnord?foo=bar;baz=quux
User-Agent: lwp-request/6.03 libwww-perl/6.03
Content-Length: 16
Content-Type: application/x-www-form-urlencoded
200 OK
Date: Thu, 27 Oct 2011 21:27:46 GMT
Server: HTTP::Server::PSGI
Content-Length: 7
Content-Type: text/plain
Client-Date: Thu, 27 Oct 2011 21:27:46 GMT
Client-Peer: 127.0.0.1:5000
Client-Response-Num: 1
bar
123
Just set $CGI::APPEND_QUERY_STRING = 1;