Downloaded zip size from a GitHub release using curl is only 9 bytes

A GitHub public repo has a release v1.0.
The following curl command downloads only 9 bytes instead of the expected 42 KB:
curl -O -L -J --ssl-no-revoke https://github.com/marmayogi/TTF2PostscriptCID-Win/releases/v1.0/TTF2PostscriptCID-Win-1.0.zip
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 9 100 9 0 0 9 0 0:00:01 --:--:-- 0:00:01 9
Based on the comments received, the output of the curl command with only the -L flag has been added to the post:
curl -L https://github.com/marmayogi/TTF2PostscriptCID-Win/releases/v1.0/TTF2PostscriptCID-Win-1.0.zip
curl: (35) schannel: next InitializeSecurityContext failed: Unknown error (0x80092012) - The revocation function was unable to check revocation for the certificate.
Update, added to the post based on the comments received:
My desktop's curl expects --ssl-no-revoke to be passed along with the command; the problem was resolved with the -k flag. Here is the evidence:
"C:\Program Files\Neovim\bin\curl.exe" -o TTF2PostscriptCID-Win-1.0.zip -L https://github.com/marmayogi/TTF2PostscriptCID-Win/archive/refs/tags/v1.0.zip
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
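For completeness, the same command succeeds on this machine once -k is added (a sketch of the command as it was re-run; -k disables certificate verification):
"C:\Program Files\Neovim\bin\curl.exe" -k -o TTF2PostscriptCID-Win-1.0.zip -L https://github.com/marmayogi/TTF2PostscriptCID-Win/archive/refs/tags/v1.0.zip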
Can anyone throw some light on this issue?
Thanks in advance.

I'd recommend using
curl -sL https://github.com/marmayogi/TTF2PostscriptCID-Win/archive/refs/tags/v1.0.zip >filename.zip
or
curl -sLO https://github.com/marmayogi/TTF2PostscriptCID-Win/archive/refs/tags/v1.0.zip
Optionally, you can also use the following (this loosens SSL security):
curl -sL --ssl-no-revoke https://github.com/marmayogi/TTF2PostscriptCID-Win/archive/refs/tags/v1.0.zip >filename.zip
or
curl -sLO --ssl-no-revoke https://github.com/marmayogi/TTF2PostscriptCID-Win/archive/refs/tags/v1.0.zip
-s, --silent
Silent or quiet mode. Do not show progress meter or error messages. Makes Curl mute. It will still output the data you ask for, potentially even to the terminal/stdout unless you redirect it.
Use --show-error in addition to this option to disable progress meter but still show error messages.
-L, --location
(HTTP) If the server reports that the requested page has moved to a different location (indicated with a Location: header and a 3XX response code), this option will make curl redo the request on the new place.
-O, --remote-name
Write output to a local file named like the remote file we get. (Only the file part of the remote file is used, the path is cut off.)
The file will be saved in the current working directory.
--ssl-no-revoke
(Schannel) This option tells curl to disable certificate revocation checks. WARNING: this option loosens the SSL security, and by using this flag you ask for exactly that.
curl's output then gets redirected to whatever filename you want (filename.zip), or with -sLO it picks the filename automatically.

It seems your URL is bad; try:
curl 'https://github.com/marmayogi/TTF2PostscriptCID-Win/archive/refs/tags/v1.0.zip' -LO
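For what it's worth, the 9-byte body in the original output is exactly the length of the text "Not Found", so the downloaded file most likely just contains a 404 response. Release assets are served from .../releases/download/<tag>/<file>, not .../releases/<tag>/<file>, so a sketch of the corrected command (assuming the release really has an asset with that name attached) would be:
curl -L -O https://github.com/marmayogi/TTF2PostscriptCID-Win/releases/download/v1.0/TTF2PostscriptCID-Win-1.0.zip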

Related

flutter doctor: why am I getting a curl error?

I installed the flutter SDK properly, but now flutter doctor isn't working.
$ set | grep SSL
SSL_CA_CERT_FILE=/Users/agoyal3/certs/ca.pem
SSL_CA_CERT_PATH=/Users/agoyal3/certs
SSL_CERT_FILE=/Users/agoyal3/certs/server-crt.pem
SSL_KEY_FILE=/Users/agoyal3/certs/server-key.pem
$ pwd
/Users/agoyal3/temp/flutter/bin
$ ./flutter doctor
Downloading Dart SDK from Flutter engine ead227f118077d1f2b57842a32abaf105b573b8a...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
Your system is probably lacking valid SSL certs.
Try the following:
Download the latest certs from here.
Then set the directory as SSL_CA_CERT_PATH and the file as SSL_CA_CERT_FILE.
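A minimal sketch of what that could look like, assuming the downloaded bundle is saved in the certs directory already used above (the cacert.pem filename is an assumption):
# cacert.pem is a hypothetical name for the freshly downloaded CA bundle
export SSL_CA_CERT_PATH=/Users/agoyal3/certs
export SSL_CA_CERT_FILE=/Users/agoyal3/certs/cacert.pem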
Hope that helps!

Bitbucket Server Api: Clone all repositories

I want to clone all repositories in my Bitbucket Server in order to have backups.
I am trying to use the Bitbucket REST API, but I am not getting the required list of all repositories.
$ curl -u username:password https://servername:9090/rest/api/1.0/projects/~username/repos -k
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 61 0 61 0 0 230 0 --:--:-- --:--:-- --:--:-- 230{"size":0,"limit":25,"isLastPage":true,"values":[],"start":0}
Am I missing anything in the command?
I have gone through the Bitbucket REST API doc and I am using the same command from there, but I am not getting the required result.
Note that the "projects/~USERNAME/repos" REST resource lists PERSONAL repositories only. Do you have any PERSONAL repository?
You might be better off using a supported backup option for the whole instance; that way you preserve more than just the repositories.
https://confluence.atlassian.com/bitbucketserver/data-recovery-and-backups-776640050.html
If you're set on cloning each repo, you'll need to iterate over both projects and personal repos as suggested in another answer, and do so as a user with access to all of those repos (eg a sysadmin account).
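For the iteration itself, a rough sketch with curl (these are the standard 1.0 REST resources; the project key and limit values are placeholders):
# list all projects visible to the account
curl -s -k -u username:password "https://servername:9090/rest/api/1.0/projects?limit=100"
# list the repositories of one project (repeat for each project key returned above)
curl -s -k -u username:password "https://servername:9090/rest/api/1.0/projects/PROJECT_KEY/repos?limit=100"
Both responses are paged, so keep requesting with an increasing start parameter until isLastPage is true, then clone each repository from the clone links in the JSON.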

How to check HTTP response code in zabbix?

I have a Zabbix server 2.2 and a few linux hosts with websites. How can I get a notification from Zabbix, if the HTTP(s) response code is not 200?
I've tried those triggers without any success:
{owncloud:web.test.rspcode[Availability of owncloud,owncloud availability].last(,10)}#200
{owncloud:web.test.error[Availability of owncloud].count(10,200)}<1
{owncloud:web.test.error[Availability of owncloud].last(#1,10)}=200
But nothing works. I never got a notification that the code was not 200 anymore, even though it was 404, because I had renamed ownCloud's index.php to index2.php.
I configured the Application and the Web Scenario as follows:
If you have already configured the host, go to step 1.
1) Select the host via Configuration -> Host groups -> select the host (for example, server 1).
2) Go to Configuration > Hosts > [Host Created Above] > Applications and click Create Application.
3) Now create the Web scenario with the status code check; in my case I checked for status code 200. Go to Configuration > Hosts > [Host Created Above] > Web Scenarios and click Create Web Scenario.
Remark: you have to select the application created in step 2.
4) After that, without clicking the Add button, go to the Steps tab and configure the host and parameters for the check. Then click Add. In my case I check for a status code 200 response to the HTTP request.
I found the issue. You need to specify the URL to check including the file. For example, like this in your web scenario:
https://owncloud.example.com/index.php
"Note that Zabbix frontend uses JavaScript redirect when logging in, thus first we must log in, and only in further steps we may check for logged-in features. Additionally, the login step must use full URL to index.php file." - https://www.zabbix.com/documentation/2.4/manual/web_monitoring/example
I also used the following expression as a trigger:
{owncloud:web.test.fail[Availability of owncloud].last()}>0
You have to set a trigger by expression; to fire when the step does not return 200, use:
{host name:web.test.rspcode[Scenario name,Step name].last()}<>200
The question has been answered adequately, but I will provide a much more advanced solution that you can use for all HTTP status codes.
I created an item that monitors all HTTP status codes of a proxy and graphs them, and then set up several types of triggers to watch the last value and counts over the last N minutes.
The regex I used to extract all the values from an Nginx or Apache access log is
^(\S+) (\S+) (\S+) \[([\w:\/]+\s[+\-]\d{4})\] \"(\S+)\s?(\S+)?\s?(\S+)?\" (\d{3}|-) (\d+|-)\s?\"?([^\"]*)\"?\s?\"?([^\"]*)\"?\s
I then set up many triggers relevant to my particular situation:
101 Switching Protocols
301 Moved Permanently
302 Found (redirect)
304 Not Modified
400 Bad Request
401 Unauthorized
403 Forbidden
404 Not Found
500 Internal Server Error
It's also important that your Zabbix agent has permission to read the log file on the host. You can add the zabbix agent user to the www-data group using this command:
$ sudo usermod -a -G www-data zabbix
See the tutorial for all the steps in greater detail.
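To give a flavour of how this can be wired up, here is a simplified sketch (not the exact setup from the tutorial: the log path, host name, and the cut-down regex are assumptions). An active-agent log item extracts just the status code, and a trigger counts recent occurrences of a given code.
Item key (type "Zabbix agent (active)"), returning capture group 1, i.e. the status code:
log[/var/log/nginx/access.log,"HTTP/1.[01]. ([0-9]{3}) ",,,skip,\1]
Trigger that fires on more than 10 responses with code 500 in the last 5 minutes:
{server1:log[/var/log/nginx/access.log,"HTTP/1.[01]. ([0-9]{3}) ",,,skip,\1].count(5m,500,eq)}>10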

will mongrel be blocked when uploading a huge file?

I believe that Mongrel is a single-threaded web server, so I suppose it will be blocked while a user is uploading a huge file.
However, I did a test today, and it seems that is not true.
I uploaded a file with curl like this:
time curl -k -F myfile=@/tmp/CGI.19974.3 -H 'LOGIN_NAME:admin' -H 'PASSWORD:pass' http://10.32.119.155:3000 -v
Here is the result:
real 6m38.756s
user 0m0.232s
sys 0m9.561s
You can see that this upload took over 6 minutes, but during this period Mongrel worked well and could still handle requests correctly.
So, can I say that there is another thread handling the upload?

wget can't download - 404 error

I tried to download an image using wget but got an error like the following.
--2011-10-01 16:45:42-- http://www.icerts.com/images/logo.jpg
Resolving www.icerts.com... 97.74.86.3
Connecting to www.icerts.com|97.74.86.3|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2011-10-01 16:45:43 ERROR 404: Not Found.
My browser has no problem loading the image.
What's the problem?
curl can't download either.
Thanks.
Sam
You need to add the Referer field to the headers of the HTTP request. With wget, you just need the --header argument:
wget http://www.icerts.com/images/logo.jpg --header "Referer: www.icerts.com"
And the result:
--2011-10-02 02:00:18-- http://www.icerts.com/images/logo.jpg
Resolving www.icerts.com (www.icerts.com)... 97.74.86.3
Connecting to www.icerts.com (www.icerts.com)|97.74.86.3|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 6102 (6.0K) [image/jpeg]
Saving to: 'logo.jpg'
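Since the question notes that curl can't download it either: curl has an equivalent switch, -e (--referer), so a sketch of the same workaround with curl would be:
curl -O -e "http://www.icerts.com/" http://www.icerts.com/images/logo.jpg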
I had the same problem with a Google Docs URL. Enclosing the URL in quotes did the trick for me:
wget "https://docs.google.com/spreadsheets/export?format=tsv&id=1sSi9f6m-zKteoXA4r4Yq-zfdmL4rjlZRt38mejpdhC23" -O sheet.tsv
You will also get a 404 error if you are using IPv6 and the server only accepts IPv4.
To use IPv4, make the request with -4 added:
wget -4 http://www.php.net/get/php-5.4.13.tar.gz/from/this/mirror
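curl has the same switch (-4, --ipv4) if you need it there as well, for example:
curl -4 -O http://www.php.net/get/php-5.4.13.tar.gz/from/this/mirror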
I had the same problem.
I solved it using single quotes, like this:
$ wget 'http://www.icerts.com/images/logo.jpg'
wget version in use:
$ wget --version
GNU Wget 1.11.4 Red Hat modified
A wget 404 error also commonly happens if you try to download the pages of a WordPress website by typing
wget -r http://somewebsite.com
If the website is built with WordPress you'll get an error like this:
ERROR 404: Not Found.
You can't mirror a WordPress website this way, because the content is stored in the database and wget is not able to grab the .php files. That's why you get the wget 404 error.
I know it's not this question's case, because Sam only wants to download a single picture, but it can be helpful for others.
Actually, I don't know the exact reason, but I have faced this kind of problem.
If you have the domain's IP address (e.g. 208.113.139.4), use the IP address instead of the domain name (in this case www.icerts.com):
wget 192.243.111.11/images/logo.jpg
You can find the IP for a URL at https://ipinfo.info/html/ip_checker.php
I want to add something to @blotus's answer:
In case adding the Referer header does not solve the issue, you may be using the wrong referrer (sometimes the referrer differs from the URL's domain name).
Paste the URL into a web browser and find the referrer in the developer tools (Network -> Request Headers).
I ran into exactly the same problem while setting up GitHub Actions with Cygwin. Only after I used wget --debug <url> did I realize that the URL had a 0x0d byte (\r, carriage return) appended to it.
For this kind of problem there is a solution described in the docs:
you can also use igncr in the SHELLOPTS environment variable
So I added the following lines to my YAML script so that wget, as well as the other shell commands in my GHA workflow, works properly:
env:
  SHELLOPTS: igncr
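For context, a minimal sketch of where this lands in a workflow file (the job and step below are placeholders I made up, not the original setup):
jobs:
  download:
    runs-on: windows-latest
    env:
      SHELLOPTS: igncr   # make bash ignore \r line endings so URLs are not corrupted
    steps:
      - name: Fetch a file with wget
        run: wget https://example.com/file.zip   # the URL here is a placeholder
        shell: bash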