I'm running a Raspberry Pi 4 as a server for a .NET Core side project of mine. Nothing too fancy or heavy. After trying to get going with a webhook and uploading files to the Pi with scp, and failing (I still don't know why at this point; the scp problem might be the same as the cURL problem), I decided to build a small API that accepts a file and deploys it to a specified path. The API works both from inside and outside the Pi, as I've tested it with cURL and Postman using a 20 MB zip file, but when I run the command below from inside a GitHub Action, there is a long wait and then a failure message.
Command:
curl --request POST --url https://example.com/ --header 'cache-control: no-cache' --form path=DEPLOY_PATH --form archive=@FILE_PATH --form token=TOKEN
Output:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 8776k 0 0 0 65536 0 42750 0:03:30 0:00:01 0:03:29 42722
0 8776k 0 0 0 65536 0 25862 0:05:47 0:00:02 0:05:45 25852
0 8776k 0 0 0 65536 0 18533 0:08:04 0:00:03 0:08:01 18528
0 8776k 0 0 0 65536 0 14444 0:10:22 0:00:04 0:10:18 14441
0 8776k 0 0 0 65536 0 11833 0:12:39 0:00:05 0:12:34 13091
0 8776k 0 0 0 65536 0 10022 0:14:56 0:00:06 0:14:50 0
0 8776k 0 0 0 65536 0 8691 0:17:14 0:00:07 0:17:07 0
...
0 8776k 0 0 0 65536 0 63 39:37:37 0:17:11 39:20:26 0
0 8776k 0 0 0 65536 0 63 39:37:37 0:17:11 39:20:26 0
curl: (55) SSL_write() returned SYSCALL, errno = 110
##[error]Process completed with exit code 55.
The scp and cURL commands seem to share a common problem. If I send a simple text file, or a tar.gz containing a text file, it works. If I do the same with a .dll file, or a tar.gz containing a .dll, it does not. I don't really know whether the problem is the file type or the size. Note that the API currently accepts files up to 100 MB, and I'm only trying to deploy a small package of ~10 MB.
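Concretely, the comparison looks something like this (a sketch; example.com, DEPLOY_PATH and TOKEN are the same placeholders as in the command above, and MyApp.dll stands in for the real assembly):

# works, even from the GitHub Action: a tar.gz that only contains a text file
echo "hello" > text.file
tar -czf text.tar.gz text.file
curl --request POST --url https://example.com/ --header 'cache-control: no-cache' \
  --form path=DEPLOY_PATH --form archive=@text.tar.gz --form token=TOKEN

# fails from the GitHub Action: the same upload with a .dll inside the archive
tar -czf app.tar.gz MyApp.dll
curl --request POST --url https://example.com/ --header 'cache-control: no-cache' \
  --form path=DEPLOY_PATH --form archive=@app.tar.gz --form token=TOKEN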
Output with -v arg:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying IP...
* TCP_NODELAY set
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Connected to URL (IP) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
* CAfile: /etc/ssl/certs/ca-certificates.crt
CApath: /etc/ssl/certs
} [5 bytes data]
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.3 (IN), TLS handshake, Server hello (2):
{ [112 bytes data]
* TLSv1.2 (IN), TLS handshake, Certificate (11):
{ [2861 bytes data]
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
{ [300 bytes data]
* TLSv1.2 (IN), TLS handshake, Server finished (14):
{ [4 bytes data]
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
} [37 bytes data]
* TLSv1.2 (OUT), TLS change cipher, Client hello (1):
} [1 bytes data]
* TLSv1.2 (OUT), TLS handshake, Finished (20):
} [16 bytes data]
* TLSv1.2 (IN), TLS handshake, Finished (20):
{ [16 bytes data]
* SSL connection using TLSv1.2 / ECDHE-RSA-CHACHA20-POLY1305
* ALPN, server accepted to use http/1.1
* Server certificate:
* subject: CN=URL
* start date: Sep 18 16:51:41 2020 GMT
* expire date: Dec 17 16:51:41 2020 GMT
* subjectAltName: host "URL" matched cert's "URL"
* issuer: C=US; O=Let's Encrypt; CN=Let's Encrypt Authority X3
* SSL certificate verify ok.
} [5 bytes data]
> POST /api/Deployment/ HTTP/1.1
> Host: URL
> User-Agent: curl/7.58.0
> Accept: */*
> cache-control: no-cache
> Content-Length: 3333
> Content-Type: multipart/form-data; boundary=------------------------ca1748c91973ca89
> Expect: 100-continue
>
{ [5 bytes data]
< HTTP/1.1 100 Continue
} [5 bytes data]
100 3333 0 0 100 3333 0 2154 0:00:01 0:00:01 --:--:-- 2153
100 3333 0 0 100 3333 0 1307 0:00:02 0:00:02 --:--:-- 1307
100 3333 0 0 100 3333 0 938 0:00:03 0:00:03 --:--:-- 938
100 3333 0 0 100 3333 0 732 0:00:04 0:00:04 --:--:-- 732
100 3333 0 0 100 3333 0 600 0:00:05 0:00:05 --:--:-- 614
100 3333 0 0 100 3333 0 508 0:00:06 0:00:06 --:--:-- 0
100 3333 0 0 100 3333 0 441 0:00:07 0:00:07 --:--:-- 0
...
100 3333 0 0 100 3333 0 57 0:00:58 0:00:57 0:00:01 0
100 3333 0 0 100 3333 0 56 0:00:59 0:00:58 0:00:01 0
100 3333 0 0 100 3333 0 55 0:01:00 0:00:59 0:00:01 0
100 3333 0 0 100 3333 0 54 0:01:01 0:01:00 0:00:01 0* Empty reply from server
100 3333 0 0 100 3333 0 54 0:01:01 0:01:00 0:00:01 0
* Connection #0 to host URL left intact
curl: (52) Empty reply from server
##[error]Process completed with exit code 52.
EDIT: Switching to a Windows runner instead of Ubuntu solved the cURL problem, but I'm still open to suggestions regarding this question, as this is merely a workaround rather than a solution.
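For reference, the workaround amounts to something like the following workflow snippet (a sketch only; the job name, artifact name, secret names and package file name are hypothetical):

deploy:
  runs-on: windows-latest   # was ubuntu-latest; switching runners avoided the curl failure
  steps:
    - uses: actions/download-artifact@v4
      with:
        name: package
    - name: Upload the package to the deployment API
      run: >
        curl.exe --request POST --url https://example.com/
        --header "cache-control: no-cache"
        --form "path=${{ secrets.DEPLOY_PATH }}"
        --form "archive=@package.tar.gz"
        --form "token=${{ secrets.DEPLOY_TOKEN }}"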
Related
When I get the download link using the GitHub API and try to download the artifact, I go straight to an "Artifact has expired" error.
echo "ALLMODS_URL = $(curl -v -H "Accept: application/vnd.github+json" -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" -L "https://api.github.com/repos/repo/repo/actions/artifacts/124384712/zip")"
curl -v -O $ALLMODS_URL
The GitHub log:
+ curl -v -H Accept: application/vnd.github+json -H Authorization: token *** -L https://api.github.com/repos/repo/repo/actions/artifacts/124384712/zip
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying 140.82.114.5:443...
* Connected to api.github.com (140.82.114.5) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
* CAfile: /etc/ssl/certs/ca-certificates.crt
* CApath: /etc/ssl/certs
} [5 bytes data]
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.3 (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [19 bytes data]
* TLSv1.3 (IN), TLS handshake, Certificate (11):
{ [2456 bytes data]
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
{ [79 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [36 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [36 bytes data]
* SSL connection using TLSv1.3 / TLS_AES_128_GCM_SHA256
* ALPN, server accepted to use h2
* Server certificate:
* subject: C=US; ST=California; L=San Francisco; O=GitHub, Inc.; CN=*.github.com
* start date: Mar 16 00:00:00 2022 GMT
* expire date: Mar 16 23:59:59 2023 GMT
* subjectAltName: host "api.github.com" matched cert's "*.github.com"
* issuer: C=US; O=DigiCert Inc; CN=DigiCert TLS Hybrid ECC SHA384 2020 CA1
* SSL certificate verify ok.
* Using HTTP2, server supports multi-use
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
} [5 bytes data]
* Using Stream ID: 1 (easy handle 0x400008b650)
} [5 bytes data]
> GET /repos/repo/repo/actions/artifacts/124384712/zip HTTP/2
> Host: api.github.com
> user-agent: curl/7.74.0
> accept: application/vnd.github+json
> authorization: token ***
>
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [57 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [57 bytes data]
* old SSL session ID is stale, removing
{ [5 bytes data]
* Connection state changed (MAX_CONCURRENT_STREAMS == 100)!
} [5 bytes data]
< HTTP/2 410
< server: GitHub.com
< date: Mon, 24 Oct 2022 17:48:11 GMT
< content-type: application/json; charset=utf-8
< content-length: 134
< x-github-media-type: github.v3; format=json
< x-ratelimit-limit: 1000
< x-ratelimit-remaining: 999
< x-ratelimit-reset: 1666637291
< x-ratelimit-used: 1
< x-ratelimit-resource: core
< access-control-expose-headers: ETag, Link, Location, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Used, X-RateLimit-Resource, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval, X-GitHub-Media-Type, X-GitHub-SSO, X-GitHub-Request-Id, Deprecation, Sunset
< access-control-allow-origin: *
< strict-transport-security: max-age=31536000; includeSubdomains; preload
< x-frame-options: deny
< x-content-type-options: nosniff
< x-xss-protection: 0
< referrer-policy: origin-when-cross-origin, strict-origin-when-cross-origin
< content-security-policy: default-src 'none'
< vary: Accept-Encoding, Accept, X-Requested-With
< x-github-request-id: 0400:1C3C:8DAF00E:12259BA9:6356CFDB
<
{ [134 bytes data]
100 134 100 134 0 0 429 0 --:--:-- --:--:-- --:--:-- 448
* Connection #0 to host api.github.com left intact
+ echo ALLMODS_URL = {
"message": "Artifact has expired",
"documentation_url": "https://docs.github.com/rest/reference/actions#download-an-artifact"
}
ALLMODS_URL = {
"message": "Artifact has expired",
"documentation_url": "https://docs.github.com/rest/reference/actions#download-an-artifact"
}
+ curl -v -O
curl: no URL specified!
curl: try 'curl --help' or 'curl --manual' for more information
Error: Process completed with exit code 2.
PS: I have also been trying to use an action for downloading the artifacts, but it fails for me with a Permission Denied error.
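For reference, here is a sketch of what the two commands above were presumably meant to do, assuming the artifact is still within its retention period (the HTTP 410 above means this particular one has already expired). The URL has to be assigned to the variable rather than just echoed, and the artifacts/:id/zip endpoint answers with a redirect, so the sketch captures the redirect target and then downloads it (allmods.zip is a hypothetical output name):

# capture the redirect target instead of printing the JSON body
ALLMODS_URL=$(curl -s -H "Accept: application/vnd.github+json" \
  -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
  -o /dev/null -w '%{redirect_url}' \
  "https://api.github.com/repos/repo/repo/actions/artifacts/124384712/zip")
# download the artifact from the signed URL
curl -v -o allmods.zip "$ALLMODS_URL"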
We have HAProxy in front of two Apache servers, and every day, for less than a minute, I get NOSRV errors in the HAProxy logs. There are successful requests from the same source IP, so the failures are intermittent. There is no corresponding error entry in the backend logs.
Below is the snippet from access logs:
Dec 22 20:21:25 proxy01 haproxy[3000561]: X.X.X.X:60872 Local_Server~ Local_Server/<NOSRV> -1/-1/-1/ -1 0 0 0 {} "POST /xxxxtransaction HTTP/1.1" 0 0
Dec 22 20:21:26 proxy01 haproxy[3000561]: X.X.X.X:43212 Local_Server~ Local_Server/<NOSRV> -1/-1/-1/ -1 0 0 0 {} "POST /xxxxtransaction HTTP/1.1" 0 0
Dec 22 20:21:26 proxy01 haproxy[3000561]: X.X.X.X:43206 Local_Server~ Local_Server/<NOSRV> -1/-1/-1/ -1 0 0 0 {} "POST /xxxxtransaction HTTP/1.1" 0 0
Dec 22 20:21:26 proxy01 haproxy[3000561]: X.X.X.X:60974 Local_Server~ Local_Server/<NOSRV> -1/-1/-1/ -1 0 0 0 {} "POST /xxxxtransaction HTTP/1.1" 0 0
Dec 22 20:21:27 proxy01 haproxy[3000561]: X.X.X.X:32772 Local_Server~ Local_Server/<NOSRV> -1/-1/-1/ -1 0 0 0 {} "POST /xxxxtransaction HTTP/1.1" 103 0
Dec 22 20:21:27 proxy01 haproxy[3000561]: X.X.X.X:32774 Local_Server~ Local_Server/<NOSRV> -1/-1/-1/ -1 0 0 0 {} "POST /xxxxtransaction HTTP/1.1" 59 0
Dec 22 20:21:27 proxy01 haproxy[3000561]: X.X.X.X:32776 Local_Server~ Local_Server/<NOSRV> -1/-1/-1/ -1 0 0 0 {} "POST /xxxxtransaction HTTP/1.1" 57 0
Below is the HAProxy config file:
defaults
    log global
    timeout connect 15000
    timeout check 5000
    timeout client 30000
    timeout server 30000
    errorfile 400 /etc/haproxy/errors/400.http
    errorfile 403 /etc/haproxy/errors/403.http
    errorfile 408 /etc/haproxy/errors/408.http
    errorfile 500 /etc/haproxy/errors/500.http
    errorfile 502 /etc/haproxy/errors/502.http
    errorfile 503 /etc/haproxy/errors/503.http
    errorfile 504 /etc/haproxy/errors/504.http
frontend Local_Server
    bind *:80
    bind *:443 ssl crt /etc/haproxy/certs/
    mode http
    option httplog
    cookie SRVNAME insert indirect nocache maxidle 8h maxlife 8h
    #capture request header X-Forwarded-For len 15
    #capture request header Host len 32
    http-request capture req.hdrs len 512
    log-format "%ci:%cp[%tr] %ft %b/%s %TR/%Tw/%Tc/%Tr/%Ta %ST %B %CC %CS %tsc %ac/%fc/%bc/%sc/%rc %sq/%bq %hr %hs %{+Q}r"
    #log-format "%ci:%cp %ft %b/%s %Tw/%Tc/%Tr/ %ST %B %rc %bq %hr %hs %{+Q}r %Tt %Ta"
    option dontlognull
    option http-keep-alive
    #declare whitelists for urls
    acl xx_whitelist src -f /etc/haproxy/xx_whitelist.lst
    acl is-blocked-ip src -f /etc/haproxy/badactors-list.txt
    http-request silent-drop if is-blocked-ip
    acl all src 0.0.0.0
    ######### ANTI BAD GUYS STUFF ###########################################
    #anti DDOS sticktable - sends a 500 after 5s when requests from IP over 120 per
    #frontend for stick table see backend "st_src_global" also
    #Restrict number of requests in last 10 secs
    # TO MONITOR RUN " watch -n 1 'echo "show table st_src_global" | socat unix:/run/haproxy/admin.sock -' " ON CLI.
    #ZZZ THIS MAY NEED DISABLING FOR LOAD TESTS ZZZZ
    # Table definition
    http-request track-sc0 src table st_src_global #<- defines tracking stick table
    stick-table type ip size 100k expire 10s store http_req_rate(50000s) #<- sets the limit for and time to store IP
    http-request silent-drop if { sc_http_req_rate(0) gt 50000 } # drops if requests are greater than 5000 in 5 secs
    # Allow clean known IPs to bypass the filter
    tcp-request connection accept if { src -f /etc/haproxy/xx_whitelist.lst }
    #Slowloris protection -send 408 if http request not completed in 5secs
    timeout http-request 10s
    option http-buffer-request
    # Block Specific Requests
    #http-request deny if HTTP_1.0
    http-request deny if { req.hdr(user-agent) -i -m sub phantomjs slimerjs }
    #traffic shape
    #xxxx.xxxx.xx.xx
    acl xxxxx.xxxxx.xx.xx hdr(host) -i xxxx.xxxx.xx.xx
    use_backend xxxx.xxxx.xx.xx if xxxx.xxxx.xx.xx xx_whitelist #update from proxys
#sticktable for dos protection
backend st_src_global
    stick-table type ip size 1m expire 10s store http_req_rate(50000s)
backend xxxxxxx.xxxxx.xx.xx
    mode http
    balance roundrobin
    option forwardfor
    http-request set-header X-Forwarded-Port %[dst_port]
    http-request add-header X-Forwarded-Proto https if { ssl_fc }
    server web01-http x.x.x.x:80 check maxconn 100
    server web03-http x.x.x.x.:80 check maxconn 100
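Incidentally, the monitoring command referenced in the config comments can be run as-is to watch the stick table while the NOSRV burst happens (assuming the admin socket really is at /run/haproxy/admin.sock, as the comment suggests):

# show the st_src_global stick table once a second via the HAProxy admin socket
watch -n 1 'echo "show table st_src_global" | socat unix:/run/haproxy/admin.sock -'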
I am trying to recreate a working curl command in PowerShell 5.1, but when I run it from the script, it errors out. If I pipe the constructed command out to a text file and run it as-is in a command shell, it works. I resorted to attempting curl in PowerShell because I have been unable to get multipart/form-data to work with Invoke-RestMethod. Here is the code I'm using; the error message is shown below. This is on a Windows 10 machine. In a nutshell, the goal is to upload a zip file to a remote server.
$accessToken = '<Bearer token value from prior API call>'
$inputFile = 'C:\MyFolder1\MyFolder2\MyFile.zip'
$curlCmd = 'C:\Curl\bin\curl.exe'
$uriImport = 'https://api.somecompany.com/import'
$curlArgs = '-X', 'POST',
'--header', '"Content-Type: multipart/form-data"',
'--header', '"Accept: application/json"',
'--header', -join('"Authorization: Bearer ', $accessToken, '"'),
'--form', -join('"files=@', $inputFile, ';type=application/zip"'),
-join('"', $uriImport, '"'), '-s'
Write-Host "$curlCmd $cURLargs"
"$curlCmd $curlArgs" | Out-File 'C:\MyFolder\MyFolder2\Curl_Output.txt'
& $curlCmd $curlArgs
The error PoSH returns:
curl.exe : % Total % Received % Xferd Average Speed Time Time Time Current
At C:\CurlTesting.ps1:65 char:9
+ & $curCmd $cURLargs
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: ( % Total % ... Time Current:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:01 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:02 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:03 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:04 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:06 --:--:-- 0
curl: (6) Could not resolve host: multipart
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (6) Could not resolve host: application
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (6) Could not resolve host: Bearer
curl: (3) URL using bad/illegal format or missing URL
I have tried constructing the quotes with both single and double, but it still errors out; the Windows command prompt does require double-quotes. When I run this same constructed script from a command window (cmd.exe), it runs and I don't get all of that output nor the error - just the expected value from the API.
What am I doing wrong?
Here is the modified, slimmed-down version of the code I got working:
$accessToken = '<Bearer token value from prior API call>'
$inputFile = 'C:\MyFolder1\MyFolder2\MyFile.zip'
$uriImport = 'https://api.somecompany.com/import'
curl.exe -X POST `
--header "Content-Type: multipart/form-data" `
--header "Accept: application/json" `
--header "Authorization: Bearer $accessToken" `
--form "filename=#$inputFile;type=application/zip" `
-s
This is what Process Monitor says is running, and it seems OK to me. I don't have that curl build, but Windows 10 (since 1803) comes with curl.exe now. Beware the curl alias in PowerShell 5.
"C:\windows\system32\curl.exe" -X POST --header "Content-Type: multipart/form-data" --header "Accept: application/json" --header "Authorization: Bearer <Bearer token value from prior API call>" --form "files=@C:\MyFolder1\MyFolder2\MyFile.zip;type=application/zip" "https://api.somecompany.com/import" -s
I would run it this way. It's possible that PowerShell loses double quotes passed to an external command, which you would then need to backslash-escape, but this is as much as I can reproduce. It also doesn't look like you're actually running it in silent mode ("-s").
C:\Curl\bin\curl.exe -X POST --header "Content-Type: multipart/form-data" --header "Accept: application/json" --header "Authorization: Bearer $accessToken" --form "files=@$inputFile;type=application/zip" $uriImport -s
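A variant of the same idea, as a sketch that avoids embedding literal quote characters in the argument list at all (it assumes the same $accessToken, $inputFile and $uriImport variables as in the question). Each element of the array is passed to curl.exe as its own argument, and PowerShell adds quoting only where an argument contains spaces:

# build the argument list without any embedded double quotes
$curlArgs = @(
    '-X', 'POST'
    '--header', 'Content-Type: multipart/form-data'
    '--header', 'Accept: application/json'
    '--header', "Authorization: Bearer $accessToken"
    '--form', "files=@$inputFile;type=application/zip"
    $uriImport
    '-s'
)
& 'C:\Curl\bin\curl.exe' @curlArgs   # splat the array so each element becomes a separate argument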
I'm trying to perform a refund on a PayPal developer account, but I keep getting errors when I run this command via PowerShell:
$certpath="E:\AAAA\cert_key.pem"
curl -v -E $certpath -F "content=C:\Users\AAA\Desktop\res.xml;type=text/xml" https://api.sandbox.paypal.com/2.0/
The content of the XML is below; I took it from the PayPal developer site:
<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope
    xmlns:xsi="http://www.w3.org/1999/XMLSchema-instance"
    xmlns:SOAP-ENC="http://schemas.xmlsoap.org/soap/encoding/"
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:xsd="http://www.w3.org/1999/XMLSchema"
    SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <SOAP-ENV:Header>
    <RequesterCredentials xmlns="urn:ebay:api:PayPalAPI" SOAP-ENV:mustUnderstand="1">
      <Credentials xmlns="urn:ebay:apis:eBLBaseComponents">
        <Username>username</Username>
        <Password>password</Password>
        <Subject/>
      </Credentials>
    </RequesterCredentials>
  </SOAP-ENV:Header>
  <SOAP-ENV:Body>
    <RefundTransactionReq xmlns="urn:ebay:api:PayPalAPI">
      <RefundTransactionRequest xsi:type="ns:RefundTransactionRequestType">
        <Version xmlns="urn:ebay:apis:eBLBaseComponents" xsi:type="xsd:string">1.0</Version>
        <TransactionID xsi:type="ebl:TransactionId">3P573784GG4876055</TransactionID>
        <RefundType>Full</RefundType>
        <Memo>Shell script FULL refund example</Memo>
      </RefundTransactionRequest>
    </RefundTransactionReq>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
But I keep getting the error below:
curl : * timeout on name lookup is not supported
At line:2 char:1
+ curl -v -E $certpath -F "content=C:\Users\MICHELANGELO\Desktop\res.xml;type=text ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (* timeout on na...s not supported:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
* Trying 173.0.82.78...
* TCP_NODELAY set
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
* Connected to api.sandbox.paypal.com (173.0.82.78) port 443 (#0)
* schannel: SSL/TLS connection with api.sandbox.paypal.com port 443 (step 1/3)
* schannel: checking server certificate revocation
* schannel: sending initial handshake data: sending 173 bytes...
* schannel: sent initial handshake data: sent 173 bytes
* schannel: SSL/TLS connection with api.sandbox.paypal.com port 443 (step 2/3)
* schannel: failed to receive handshake, need more data
* schannel: SSL/TLS connection with api.sandbox.paypal.com port 443 (step 2/3)
* schannel: encrypted data buffer: offset 4071 length 4096
* schannel: a client certificate has been requested
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
* schannel: SSL/TLS connection with api.sandbox.paypal.com port 443 (step 2/3)
* schannel: encrypted data buffer: offset 4071 length 5095
* schannel: sending next handshake data: sending 365 bytes...
* schannel: SSL/TLS connection with api.sandbox.paypal.com port 443 (step 2/3)
* schannel: encrypted data buffer: offset 91 length 5095
* schannel: SSL/TLS handshake complete
* schannel: SSL/TLS connection with api.sandbox.paypal.com port 443 (step 3/3)
> POST /2.0/ HTTP/1.1
> Host: api.sandbox.paypal.com
> User-Agent: curl/7.51.0
> Accept: */*
> Content-Length: 203
> Expect: 100-continue
> Content-Type: multipart/form-data; boundary=------------------------f4cd70d3c58d2816
>
0 203 0 0 0 0 0 0 --:--:-- 0:00:01 --:--:-- 0
* Done waiting for 100-continue
} [203 bytes data]
* schannel: client wants to read 16384 bytes
* schannel: encdata_buffer resized 17408
* schannel: encrypted data buffer: offset 0 length 17408
* schannel: Curl_read_plain returned CURLE_RECV_ERROR
* schannel: encrypted data buffer: offset 0 length 17408
* schannel: encrypted data buffer: offset 0 length 17408
* schannel: decrypted data buffer: offset 0 length 4096
* schannel: schannel_recv cleanup
* Curl_http_done: called premature == 1
100 203 0 0 100 203 0 92 0:00:02 0:00:02 --:--:-- 92
* Closing connection 0
* schannel: shutting down SSL/TLS connection with api.sandbox.paypal.com port 443
* Send failure: Connection was reset
* schannel: failed to send close msg: Failed sending data to the peer (bytes written: -1)
* schannel: clear security context handle
curl: (56) Send failure: Connection was reset
Any help would be really appreciated.
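One thing worth noting about the command as written: with -F content=C:\... (no @), curl sends the literal path string as the form field value, and the request goes out as multipart/form-data. If the endpoint actually expects the SOAP envelope as the raw request body (an assumption, not something confirmed above), the call would look more like this, with curl's @ syntax reading res.xml from disk:

curl -v -E $certpath -H "Content-Type: text/xml" --data-binary "@C:\Users\AAA\Desktop\res.xml" https://api.sandbox.paypal.com/2.0/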
Is there a command-line tool that can tell me whether gzip is on? What I'm looking for is something that can confirm that the stream coming from the server really is gzipped, even when the response headers claim it is (the server could be setting that header falsely).
I don't see a switch for this in curl, wget, tcpdump, or anything else, but maybe I'm just missing something, or perhaps there is some other tool that could give me this bit of information? Any help would be appreciated.
The output below shows Content-Encoding: gzip, indicating compressed data. Since --compressed makes curl decode the response, the data really was in gzip format; otherwise curl would have reported an error.
$ curl --compressed -v http://zlib.net > /dev/null
* About to connect() to zlib.net port 80 (#0)
* Trying 69.73.181.135... connected
* Connected to zlib.net (69.73.181.135) port 80 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3
> Host: zlib.net
> Accept: */*
> Accept-Encoding: deflate, gzip
>
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0< HTTP/1.1 200 OK
< Date: Tue, 20 Mar 2012 23:19:00 GMT
< Server: Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/0.9.7a mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
< Last-Modified: Mon, 06 Feb 2012 03:46:25 GMT
< ETag: "29603b0-84b4-4b84381b0a640"
< Accept-Ranges: bytes
< Vary: Accept-Encoding,User-Agent
< Content-Encoding: gzip
< Content-Length: 9508
< Content-Type: text/html
<
{ [data not shown]
100 9508 100 9508 0 0 24955 0 --:--:-- --:--:-- --:--:-- 50574* Connection #0 to host zlib.net left intact
* Closing connection #0
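If you want to check the bytes themselves rather than rely on curl decoding them, you can request gzip without letting curl decode it and inspect the payload directly. A small sketch (zlib.net is just the example host used above, and body.gz is a hypothetical file name):

# ask for gzip but do not pass --compressed, so curl leaves the body encoded
curl -s -H 'Accept-Encoding: gzip' http://zlib.net -o body.gz
# a real gzip stream starts with the magic bytes 1f 8b
xxd -l 2 body.gz
# test the stream's integrity; this fails if the server actually sent plain text
gzip -t body.gz && echo "payload really is gzip"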