How to decompress a DEFLATE-encoded page from a Microsoft server with a command line tool

How can I view an HTML page from Server: Microsoft-IIS/10.0 requested with the header "Accept-Encoding: deflate"? I can't decompress such a page on Linux (CentOS). I can decompress pages requested with "Accept-Encoding: gzip" (using gunzip) and with "Accept-Encoding: br" (using brotli), but not the page from Server: Microsoft-IIS/10.0 requested with "Accept-Encoding: deflate". The response also carries these headers:
X-Frame-Options: SAMEORIGIN
X-UA-Compatible: IE=Edge
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
I'm looking for a command line wrapper for the DEFLATE algorithm.
Unfortunately:
zlib-flate -uncompress < deflate.dat > page.html
flate: inflate: data: incorrect header check
unpigz -c deflate.dat
unpigz: skipping: deflate.dat is not compressed
openssl zlib -d < deflate.dat > page.html
140264494790544:error:29065064:lib(41):BIO_ZLIB_READ:zlib inflate error:c_zlib.c:548:zlib error:data error

Microsoft and their IIS server are what ruined the deflate HTTP content encoding for everyone. The HTTP standard in RFC 2616 clearly states:
deflate
The "zlib" format defined in RFC 1950 [31] in combination with
the "deflate" compression mechanism described in RFC 1951 [29].
However, the authors of IIS at Microsoft did not read the standard and instead decided on their own that deflate meant raw deflate: just RFC 1951, without the zlib wrapper. Browsers then either didn't work with deflate encoding, or they had to try decoding it both ways, with and without the zlib wrapper. The recommendation now is to simply not use the deflate encoding and to use gzip instead.
I am not aware of a stock command line tool to decompress raw deflate data. You can compile this with zlib to create such a tool:
// Decompress raw deflate data from stdin to stdout. Return with an exit code
// of 1 if there is an error.
#include <stdio.h>
#include "zlib.h"

#ifdef _WIN32
#  include <fcntl.h>
#  include <io.h>
#  define BINARY() (_setmode(0, _O_BINARY), _setmode(1, _O_BINARY))
#else
#  define BINARY()
#endif

int main(void) {
    BINARY();
    z_stream strm = {0};
    int ret = inflateInit2(&strm, -15);   // negative windowBits selects raw deflate (no zlib or gzip header)
    if (ret != Z_OK)
        return 1;
    unsigned char in[32768], out[32768];
    do {
        if (strm.avail_in == 0) {
            strm.avail_in = fread(in, 1, sizeof(in), stdin);
            strm.next_in = in;
        }
        strm.avail_out = sizeof(out);
        strm.next_out = out;
        ret = inflate(&strm, Z_NO_FLUSH);
        fwrite(out, 1, sizeof(out) - strm.avail_out, stdout);
    } while (ret == Z_OK);
    inflateEnd(&strm);
    return ret != Z_STREAM_END;
}
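A rough usage sketch, assuming the source above is saved as rawinflate.c and the zlib development files are installed: build it with something like cc rawinflate.c -o rawinflate -lz, then run ./rawinflate < deflate.dat > page.html. A non-zero exit status means the input was not valid raw deflate data.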

Related

can't get web page source from url in Swift

I'm currently using SwiftHTTP to get the source of a URL address. I am using its GET method to fetch the source code, which looks like this:
do {
    let opt = try HTTP.GET(self.my_url_address!)
    opt.start { response in
        if let err = response.error {
            print("error: \(err.localizedDescription)")
            return
        }
        print(response.description)
    }
} catch let error {
    print("got an error creating the request: \(error)")
}
After this code runs, I get this output in the Xcode console:
URL: http://myweburl.com/detay/swat-under-siege.html
Status Code: 200
Headers: Content-Type: text/html
Connection: keep-alive
CF-RAY: 38391215a60e2726-FRA
Set-Cookie: ASPSESSIONIDSABBBSDT=HPKKPJGCDLKMDMILNGHPCAGD; path=/
Date: Mon, 24 Jul 2017 18:51:24 GMT
Vary: Accept-Encoding
X-Powered-By: ASP.NET
Transfer-Encoding: Identity
Server: cloudflare-nginx
Content-Encoding: gzip
Cache-Control: private
The status code is 200, but the output is not the source code of the URL. How can I fix this?
Response is correct. I've tried requesting the website (the real one) and it works:
print(response.data.base64EncodedString())
If you decode the BASE64 data, it will render valid HTML code.
The issue seems related to the text encoding. The website's head tag states that the charset is windows-1254, so decode the data with that encoding:
String(data: response.data, encoding: .windowsCP1254) // works; .isoLatin1 etc. also work
Your issue is similar to SWIFT: NSURLSession convert data to String

curl command line equivalent to this perl code

I want to write a curl command for a POST request equivalent to this Perl code:
use strict;
use warnings;
use LWP::UserAgent;

my $base = 'http://www.uniprot.org/mapping/';
my $params = {
    from   => 'ACC',
    to     => 'P_REFSEQ_AC',
    format => 'tab',
    query  => 'P13368'
};
my $agent = LWP::UserAgent->new();
push @{$agent->requests_redirectable}, 'POST';

my $response = $agent->post($base, $params);
$response->is_success ?
    print $response->content :
    die 'Failed, got ' . $response->status_line .
        ' for ' . $response->request->uri . "\n";
I tried with this (and many other variants):
curl -X POST -H "Expect:" --form "from=ACC;to=P_REFSEQ_AC;format=tab; query=P13368" http://www.uniprot.org/mapping/ -o out.tab
The Perl code retrieves the expected result, but the curl command line does not. It retrieves the web page from "http://www.uniprot.org/mapping/" but does not make the POST request.
I looked for an error in the response header, but didn't find anything suspicious.
> POST http://www.uniprot.org/mapping/ HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: www.uniprot.org
> Accept: */*
> Proxy-Connection: Keep-Alive
> Content-Length: 178
> Content-Type: multipart/form-data; boundary=----------------------------164471d8347f
>
} [data not shown]
* HTTP 1.0, assume close after body
< HTTP/1.0 200 OK
< Server: Apache-Coyote/1.1
< Vary: User-Agent
< Vary: Accept-Encoding
< X-Hosted-By: European Bioinformatics Institute
< Content-Type: text/html;charset=UTF-8
< Date: Wed, 05 Aug 2015 20:32:00 GMT
< X-UniProt-Release: 2015_08
< Access-Control-Allow-Origin: *
< Access-Control-Allow-Headers: origin, x-requested-with, content-type
< X-Cache: MISS from localhost
< X-Cache-Lookup: MISS from localhost:3128
< Via: 1.0 localhost (squid/3.1.20)
< Connection: close
<
I spent almost three days looking for a solution on the web, but nothing has worked for me.
It looks like the server expects the data as application/x-www-form-urlencoded and not as multipart/form-data, which is what the --form argument sends. The following should work:
curl -v -L --data \
"from=ACC&to=P_REFSEQ_AC&format=tab&query=P13368" \
http://www.uniprot.org/mapping/ -o out.tab
With --data you get the expected Content-Type header, but you must do the URL encoding yourself. With -L curl follows redirects, which is needed here to get the resulting data.
The -X POST option is not needed, since POST is the default method when sending data, and -H "Expect:" is not needed either.
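One more note, in case any of the field values ever needs escaping: curl's --data-urlencode option (used once per field, e.g. --data-urlencode "query=P13368") performs the percent-encoding for you, whereas --data sends the string exactly as given.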

logging in to server using Socket Programming

I am trying to write code that connects to our local server, logs in to a web page on it, and retrieves some data. I could connect to the server using the server IP and the connect() function. Now I need to log in on a web page that accepts the following format:
addUPI?function=login&user=user-name&passwd=user-password&host-id=xxxx&mode=t/z
I wrote something like this:
int ret = send(sock, "addUPI?funcion...&mode=t", strlen("addUPI?funcion...&mode=t"), 0);
but it does not work. Can anybody help me please?
This isn't really the right way to do HTTP. For one thing, the typical HTTP lifecycle looks something like this (very abbreviated):
...Connect
>>> GET / HTTP/1.0
>>> Host: localhost
>>> Referer: http://www.google.com
>>>
<<< HTTP/1.0 200 OK
<<< Date: Wed, 08 Apr 2015 05:21:32 GMT
<<< Content-Type: text/html
<<< Content-Length: 20
<<< Set-Cookie: ...
<<<
<<< <html><h1>Hello World</h1></html>
And that's assuming there are no redirects, SSL or other mystical protocol happenings. So, just writing the string you specified above is going to result in a closed connection due to not following the protocol.
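For illustration, a minimal well-formed request sent over your already-connected socket might look roughly like this (a sketch, assuming the descriptor is sock and example.com stands in for your server's host name; real code should also check the return value of send() and then recv() the response):
const char *request =
    "GET /addUPI?function=login&user=user-name&passwd=user-password"
    "&host-id=xxxx&mode=t HTTP/1.0\r\n"
    "Host: example.com\r\n"
    "Connection: close\r\n"
    "\r\n";
send(sock, request, strlen(request), 0);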
Really, you probably want to use a fully-baked HTTP library like cURL, which manages all the protocol requirements.
I shamelessly adapted this example from the curl website:
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    CURL *curl;
    CURLcode res;

    curl = curl_easy_init();
    if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/addUPI?function=login&user=user-name&passwd=user-password&host-id=xxxx&mode=t");
        /* example.com is redirected, so we tell libcurl to follow redirection */
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);

        /* Perform the request, res will get the return code */
        res = curl_easy_perform(curl);
        /* Check for errors */
        if(res != CURLE_OK)
            fprintf(stderr, "curl_easy_perform() failed: %s\n",
                    curl_easy_strerror(res));

        /* always cleanup */
        curl_easy_cleanup(curl);
    }
    return 0;
}
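If libcurl and its headers are installed, this typically builds with something like cc example.c -lcurl (pkg-config --cflags --libs libcurl reports the exact flags on most systems), and since no write callback is set, libcurl writes the response body to stdout by default.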

redirect to a website in c

I am using the following code to redirect client requests, but the clients are not redirected; the browser shows "Unable to connect". I redirect the clients to port 8080 using iptables and run the following executable to answer them. How can I redirect the clients? Please provide a solution.
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <errno.h>
#include <string.h>
#include <sys/types.h>
#include <time.h>

int main(int argc, char *argv[])
{
    int listenfd = 0, connfd = 0;
    struct sockaddr_in serv_addr;
    char *reply = "HTTP/1.1 301 Moved Permanently\nServer: Apache/2.2.3\nLocation: http://www.google.com\nContent-Length: 1000\nConnection: close\nContent-Type: text/html; charset=UTF-8";
    char sendBuff[1025];
    time_t ticks;

    listenfd = socket(AF_INET, SOCK_STREAM, 0);
    memset(&serv_addr, '0', sizeof(serv_addr));
    memset(sendBuff, '0', sizeof(sendBuff));

    serv_addr.sin_family = AF_INET;
    serv_addr.sin_addr.s_addr = htonl(INADDR_ANY);
    serv_addr.sin_port = htons(8080);

    bind(listenfd, (struct sockaddr*)&serv_addr, sizeof(serv_addr));
    listen(listenfd, 10);

    while(1)
    {
        connfd = accept(listenfd, (struct sockaddr*)NULL, NULL);
        printf("client connected\n");
        send(connfd, reply, strlen(reply), 0);
        close(connfd);
        sleep(1);
    }
}
To construct a valid HTTP response on the server side, refer to an existing example; then add your HTML body below it.
The minimum you would need is:
HTTP/1.1 200 OK
Content-Length: XXXXX <- the size in bytes of your HTML body
Connection: close
Content-Type: text/html; charset=UTF-8
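As a sketch of what that could look like as a string literal in your C program (the body below is just an example, and the Content-Length value must match its exact byte count, here 43):
char *reply =
    "HTTP/1.1 200 OK\r\n"
    "Content-Length: 43\r\n"
    "Connection: close\r\n"
    "Content-Type: text/html; charset=UTF-8\r\n"
    "\r\n"
    "<html><body><h1>It works</h1></body></html>";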
I am unable to reproduce the error that you see. You should provide more details (e.g., what kind of client, exact text of iptables rule). For my test, I did not set any iptables rule, and instead pointed the Firefox 12.0 browser directly to localhost:8080.
Splitting up your reply so that it is easier to read shows:
char *reply =
"HTTP/1.1 301 Moved Permanently\n"
"Server: Apache/2.2.3\n"
"Location: http://www.google.com\n"
"Content-Length: 1000\n"
"Connection: close\n"
"Content-Type: text/html; charset=UTF-8"
;
Although the RFC specifies \r\n for line terminators, most clients will accept \n (you don't say which client you are using). But, three other glaring issues are that the last line is not terminated, the response itself is not terminated by a blank line, and you have a Content-Length header of 1000, but no content. Any of these issues could be cause for a client to treat the response as invalid and ignore it.
char *reply =
"HTTP/1.1 301 Moved Permanently\r\n"
"Server: Apache/2.2.3\r\n"
"Location: http://www.google.com\r\n"
"Content-Length: 0\r\n"
"Connection: close\r\n"
"Content-Type: text/html; charset=UTF-8\r\n"
"\r\n"
;
Reading farther into your code, you close the connection immediately after sending your reply without first reading the request. This might lead to an (albeit unlikely) race where you close the connection before the request is fully delivered to the server. Then, when the request does arrive, it will trigger a reset to the client, and the response could be dropped. So, you should add code to make the delivery of your reply more robust:
printf("client connected\n");
send(connfd, reply, strlen(reply), 0);
shutdown(connfd, SHUT_WR);
while (recv(connfd, sendBuff, sizeof(sendBuff), 0) > 0) {}
close(connfd);
Given that I cannot reproduce your issue with the response as it is, though, it is also possible that you did not set your iptables redirect rule properly.
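If the browser output is inconclusive, a quick way to see exactly what your program sends is to request it directly, for example with curl -v http://localhost:8080/, and check that the status line, the Location header, and the terminating blank line all arrive as expected.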

500 Server Error HTML returned from MVC AJAX call when plain text specified

I am trying to return plain text from my MVC AJAX methods that indicates an error code. This is working fine on my dev machine, but when deployed to a server (Win2008 R2) I am always getting the HTML of the 500.htm page back in the error.responseText from my AJAX call instead of the text I specified. Any ideas why I would not get back the plain text I intended?
Here is my error handling logic in my controller.
protected override void OnException(
    ExceptionContext filterContext
    )
{
    try
    {
        Error error = ControllerCommon.ProcessException(filterContext);

        // return error
        filterContext.Result = HandleError(error.Type);
        filterContext.ExceptionHandled = true;
    }
    catch (Exception ex)
    {
        Logger.Instance.LogImportantInformation(ex.Message, 0, Constants.EventSourcePortal);
    }
}
#endregion

#region Private Methods and Members
private ActionResult HandleError()
{
    return HandleError(Error.ErrorType.Unknown);
}

private ActionResult HandleError(
    Error.ErrorType errorType
    )
{
    // set return status code
    HttpContext.Response.StatusCode = (int)HttpStatusCode.InternalServerError;
    Logger.Instance.LogImportantInformation(((int)errorType).ToString(CultureInfo.InvariantCulture), 0, Constants.EventSourcePortal);

    // return error type
    return Content(((int)errorType).ToString(CultureInfo.InvariantCulture), "text/plain");
}
Here is the header that I get back from the Server.
Response Headers
Cache-Control private
Content-Type text/html
Server Microsoft-IIS/7.5
X-AspNet-Version 4.0.30319
X-Powered-By ASP.NET
Date Mon, 20 Jun 2011 16:00:42 GMT
Content-Length 1208
Request Headers
Host pqompo2test01.dns.microsoft.com
User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0.1) Gecko/20100101 Firefox/4.0.1
Accept text/html, */*
Accept-Language en-us,en;q=0.5
Accept-Encoding gzip, deflate
Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive 115
Connection keep-alive
Content-Type application/x-www-form-urlencoded; charset=UTF-8
X-Requested-With XMLHttpRequest
Referer https://pqompo2test01.dns.microsoft.com/Incident/List
Content-Length 330
Cookie MC1=GUID=111c287d88c64447a63719bb2c858981&HASH=7d28&LV=20114&V=3; A=I&I=AxUFAAAAAACJCAAAMuSpPH1Citx6nZO0iHfvdA!!&CS=116|9n002j21b03; WT_FPC=id=131.107.0.73-1717525904.30146426:lv=1308581948698:ss=1308581948698; MUID=7F438FBFEEE948D88DA06B04F6923159; MSID=Microsoft.CreationDate=04/20/2011 16:45:49&Microsoft.LastVisitDate=06/20/2011 15:59:10&Microsoft.VisitStartDate=06/20/2011 15:59:10&Microsoft.CookieId=4c7552d0-5e75-4e5e-98b9-4ca52421a738&Microsoft.TokenId=ffffffff-ffff-ffff-ffff-ffffffffffff&Microsoft.NumberOfVisits=27&Microsoft.CookieFirstVisit=1&Microsoft.IdentityToken=AA==&Microsoft.MicrosoftId=0668-8044-9161-9043; ANON=A=FA4FB528F204DDFA69239A4FFFFFFFFF&E=b4d&W=4; NAP=V=1.1&E=af3&C=hJtCCJq27admlaiwmdzvTmnAwIEVXv1jFR2I2bJ-gncMGQOJce96RQ&W=4; mcI=Wed, 27 Apr 2011 16:48:43 GMT; omniID=fd752842_58d3_4833_9a0f_d0e1e3bbfef3; WT_NVR_RU=0=msdn:1=:2=; ASP.NET_SessionId=dbcqi222tehjorefcopchuzu; MS0=7fc23d65df4241c89554b502149ccc13; MICROSOFTSESSIONCOOKIE=Microsoft.CookieId=94c8a34f-9184-4e10-84c5-2b9c43c7a962&Microsoft.CreationDate=06/20/2011 15:59:10&Microsoft.LastVisitDate=06/20/2011 15:59:10&Microsoft.NumberOfVisits=1&SessionCookie.Id=04DF98242DBD0730C9388487546F2F37; RPSMCA=FAAaARSsz90pZKmFSg5n0wbtR5MnQAldBwNmAAAEgAAACDNAEUimN1wb2AD%2Bp1PnEJUdd7n5VumQIQerCQYdD5IEd6ZCDEshkiTkvVl5a9eA6%2B9a0Os/1FpoqtvsGYMdWUUc98PUl5ZTo%2BFXAqxiZ9BL5D69OLCPsZEXitrZMulmKXFGQiAD5FqJY8JOOSJ1xptRwdkdrxGF8PuNit/Si87Ft7g4sF9vE878lMSx6TSmQq3nrurnBbdbUvDvwTKLoY0gAikOxJ7GmZoLw4kbzaLR/6/a/XSJFv%2BZ6uHsIwkMn6mndoZKfg3LLjDlCpozrHBlnKtgkn7yZXtd8Or420IXuPMUAF3gfp8VAkhKlVceTXpBv2h4gs6g; RPSMCSA=FAAaARSsz90pZKmFSg5n0wbtR5MnQAldBwNmAAAEgAAACEdSXDQ0SIKI2AC/tM6y7CeHdaKVAab/n/4TLKkF5/01jGkXR0vA07MTvS5vhwgjCPMs4zke%2B0jnB1DqOV2vI4VqQ/%2BOIYh52QkaLREoD5L718AjEJOQdDVRRZiIB51CiYtS0P/kgIkEtfDa5yuTr3w6V2IKhy2%2B6wVrP/UqxsJR%2BZ1QmGxtjv7eQVGdIndrkPx5e9wFqj1qEcf9FNfH0/uajuaTFaNmi/3dQfWuEKxGpoHWNxgoMf8PHLVi2hqltqK47OloCGqQGLPQPx0PSg1K73FTZHhl3%2BuxyNqyWJumKsAUAGuMUzFhTPsQ7JdOSfY2SYyHeaZP; RPSShare=1; MSPAuth=1NNm8kdmWAFrAuL2d8qOShxJKehL!CxEkCQvsgPdNGDqo0XFGsreQZ9GMVjiT1*bHPlGcNVsyfbVO7h!eY32bCNY7Farp2grIyEgAFv7YgJqWZN2Q87*LBZnZ0ASWmhPqe; MSPProf=1ZN*xhGN9GRXSO*HEmrISYo6cowSUbmxtIsYfqtHv!!VzEybb1I33*BdWWJrz54tkO5BzS3eTprAXL1LO9ELLBziO8Sm8WTzkSbV*E6ECcX9N92*AFiJztc4rlwCLQnMBhxlV0qzvlRN4dS1SajyzABZDNBTG*tdyqfnuP6jkSevAhuXYvnEuKZQKAF5fvgr4!oiBQ2KhnuH0$; RequestVerificationToken_Lw=TOS6XUQ+17bDOxh2T75NhhFy2KIJP5BP9MetB7cAa4i68ZEHIEpgE7xwQhzid/YiZCm4GsbW2zsJjlIxkB1hrhVGoU++E1I5BP9X2PyKn0O8tic84cWNz8QRjLDcaAcF4iYEQQ==; prmrsdninc=1
When I run locally on my Win7 machine I simply get the text back that I am expecting.
I was having the same problem trying to return 500 pages containing only text for processing by telerik controls. Of course it worked tickity-boo on my development machine and failed when published up to the proper IIS servers.
This fixed it for me:
http://blog.janjonas.net/2011-04-13/asp_net-prevent-iis_75_overriding-custom-error-page-iis-default-error-page
Summary: Try putting a
Response.TrySkipIisCustomErrors = true;
line in your action.
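If you prefer a site-wide setting, putting <httpErrors existingResponse="PassThrough" /> under <system.webServer> in web.config is another commonly used way to stop IIS 7.5 from replacing custom error responses; TrySkipIisCustomErrors does the same thing per response.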