wget giving error when downloading certain files - command-line

I am facing a problem downloading some documents programmatically.
For example, this link
https://www-950.ibm.com/events/wwe/grp/grp019.nsf/vLookupPDFs/Introduction_to_Storwize_V7000_Unified_T3/$file/Introduction_to_Storwize_V7000_Unified_T3.pdf
can be downloaded from a browser, but when I try to get it with wget it doesn't work.
I have tried
wget https://www-950.ibm.com/events/wwe/grp/grp004.nsf/vLookupPDFs/3-Mobile%20Platform%20--%20Truty%20--%20March%208%202012/\$file/3-Mobile%20Platform%20--%20Truty%20--%20March%208%202012.pdf
It gave me this output
--2012-04-18 17:09:42--  https://www-950.ibm.com/events/wwe/grp/grp004.nsf/vLookupPDFs/3-Mobile%20Platform%20--%20Truty%20--%20March%208%202012/$file/3-Mobile%20Platform%20--%20Truty%20--%20March%208%202012.pdf
Resolving www-950.ibm.com... 216.208.176.98
Connecting to www-950.ibm.com|216.208.176.98|:443... connected.
Unable to establish SSL connection.
Can anyone help me solve this problem? Thanks in advance.

Add the --no-check-certificate option to your original wget command.
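For example, with the URL from the question (a sketch; note that --no-check-certificate skips certificate validation entirely, so only use it for hosts you trust):
wget --no-check-certificate https://www-950.ibm.com/events/wwe/grp/grp004.nsf/vLookupPDFs/3-Mobile%20Platform%20--%20Truty%20--%20March%208%202012/\$file/3-Mobile%20Platform%20--%20Truty%20--%20March%208%202012.pdf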
In addition, if your network requires a proxy, make sure wget is using it.
On Linux:
export http_proxy=http://myproxyserver.com:8080
On Windows:
set http_proxy=http://myproxyserver.com:8080
I also found that on windows, because this is a https request, that in order to make it work, I also had to set https_proxy. So
set https_proxy=http://myproxyserver.com:8080
Obviously, change the proxy settings to suit your particular situation.

Related

wget does not use option in .wgetrc

I'm trying to specify a specific CA file for use with a proxy. When I use wget --ca-certificate=file.cer, it works fine. But when I try to put ca_certificate = file.cer in $HOME/.wgetrc, it doesn't work and I get the following error:
Unable to locally verify the issuer's authority.
The docs say that these should both do the same thing, so I don't know what is causing the difference.
I'm on SLES 15 SP1 and using GNU Wget 1.20.3.
According to the Wgetrc Location section of the manual:
If the environmental variable WGETRC is set, Wget will try to load
that file. Failing that, no further attempts will be made.
If WGETRC is not set, Wget will try to load $HOME/.wgetrc.
So the first thing to check is whether WGETRC is set. If it is set to something other than $HOME/.wgetrc, then wget will not load the latter.
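A quick way to check (a sketch):
$ echo "$WGETRC"
If that prints a path, wget loads that file and never reads $HOME/.wgetrc.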
what is causing the difference.
I am not sure what directory wget resolves relative paths against, so I would try using an absolute path rather than a relative one.
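For example, in $HOME/.wgetrc (the path /home/user/file.cer is an assumption; substitute the real location of your certificate):
ca_certificate = /home/user/file.cer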

cannot find a valid baseurl for repo: extra/7/x86_64

I've set up a fresh VM (CentOS 7) in VMware. However, I failed to update yum.
It returns cannot find a valid baseurl for repo: extra/7/x86_64.
I've tried all the solutions on the Internet and still can't fix it.
I can ping google.com, and curl 'http://mirrorlist.centos.org/?release=7&arch=i386&repo=os' returns <html><head><title>Firewall Authentication</title>.
I've also tried to modify CentOS-Base.repo hundreds of times.
Firewall and SELinux both disabled.
You might have already solved the problem, but this might help others.
I ran the following commands:
[root@myserver]# ONBOOT=no
[root@myserver]# dhclient
[root@myserver]# yum update
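(For reference, ONBOOT is normally a setting inside the interface config file rather than a shell command; a sketch, assuming the interface is named ens33:)
# /etc/sysconfig/network-scripts/ifcfg-ens33
ONBOOT=yes      # bring the interface up automatically at boot
BOOTPROTO=dhcp  # obtain an address via DHCP, as running dhclient does manually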

rsync: failed to connect to hgdownload.cse.ucsc.edu (128.114.119.163)

For reasons I don't understand, rsync has stopped working for me as of today.
I tried to fix the problem by following steps described here : https://askubuntu.com/questions/628102/15-04-rsync-connection-refused
but it did not work...
I am working on a laptop with Ubuntu 16.04 as the OS, and have a wired ethernet connection.
I made several queries to the UCSC server yesterday, so maybe their server is blocking my connection?
Example of a query I would like to do :
rsync -a -P rsync://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz ./
Here is the error message:
rsync: failed to connect to hgdownload.cse.ucsc.edu (128.114.119.163): Connection refused (111)
rsync error: error in socket IO (code 10) at clientserver.c(128) [Receiver=3.1.1]
Tell me what you think, and whether there is any solution to make it work again.
Thank you in advance for your help.
Edit:
UCSC answered me: no problem on their side. The problem definitely comes from me. Still looking for a solution.
It looks like the rsync daemon is not running on the remote host, or perhaps it is running on a non-standard port (default is 873).
It's also possible that connections are being blocked by a firewall. Were your earlier successful connections made from the same location, or are you now testing from elsewhere?
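One quick way to check whether the rsync port is reachable at all (a sketch using netcat; 873 is the default rsync daemon port):
$ nc -zv hgdownload.cse.ucsc.edu 873
If the connection is refused here too, the problem is the daemon or a firewall rather than your rsync client.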
As a workaround you can access the files via HTTP using any browser, or HTTP command-line tools such as curl, wget et al. Either of these should work:
$ curl http://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz -o cytoBand.txt.gz
$ wget http://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz
Since rsync was still not working (I tweeted at UCSC to ask about it),
I decided to use the curl and wget tools that mhawke suggested, but with the ftp addresses instead of http (slower):
curl ftp://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz -o cytoBand.txt.gz
wget ftp://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz
If anyone has a solution for the rsync problem, or any information suggesting that the problem could come from the UCSC server, I would be thankful.
Cheers !
Edit:
I received an answer from UCSC: no problem with rsync on the side of the UCSC server. So the problem definitely comes from me. I tried on 3 different computers in different places, all running Ubuntu 16.04. I am still looking for a solution.
Thanks to an email reply from someone working at UCSC, I tried the same command but against the mirror site hgdownload-sd.cse.ucsc.edu, which gives this command line:
rsync -a -P rsync://hgdownload-sd.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz ./
I tried it. It worked.
And here comes the best part: in my case, I retried the command line that did not work... and it works again:
rsync -a -P rsync://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz ./
So I never got an explanation of where the problem came from. But problem solved!

Giving MATLAB code sudo permissions? Unconnectable-connected ftp port

I'm trying to use this code to pull a bunch of data from the ModelNet database located at vision.cs.princeton.edu. I'm using the MATLAB code already provided on the website itself; however, I'm encountering permission errors whenever I run the code, because wget (which the code uses) is located in a restricted directory on the server. Normally I would just use sudo; however, I can't seem to run sudo matlab as a command.
My question is: does anybody know a way to remotely run MATLAB code on a server and somehow give it the permissions that sudo normally would?
Also, could someone try ftping to vision.cs.princeton.edu on port 80? For some reason I'm able to connect to that port, but the connection seems to be closed, and I can't ping that address either; I get 100% packet loss.
Thanks
Use urlread instead of wget; this should fix your issues.
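A minimal sketch of the substitution (the URL and file name are hypothetical; urlread returns the response body as a string, urlwrite saves it to disk, and newer MATLAB releases prefer webread/websave):
url = 'http://vision.cs.princeton.edu/some/file.csv';  % hypothetical URL, for illustration only
data = urlread(url);          % fetch the response body as a string
urlwrite(url, 'file.csv');    % or save it straight to disk instead of shelling out to wget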

Increase image upload limit

Whenever I try to upload an image larger than 125 KB, I receive an Upload HTTP Error. How can I increase this limit so I can upload hi-res images?
Thank you,
FD
This has nothing to do with Magento and everything to do with your server settings.
You will likely have to bump up post_max_size and upload_max_filesize in your php.ini.
Also, if you're running NGINX, you may have to increase client_max_body_size as well.
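A sketch of the relevant directives (16M is an arbitrary example value; pick whatever ceiling you actually need):
; php.ini
post_max_size = 16M
upload_max_filesize = 16M
# nginx.conf, inside an http, server, or location block
client_max_body_size 16M;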
Please note, however, that settings and restrictions can vary greatly from one hosting environment to the next. If you're not sure how to alter the config files properly or do not have the necessary access to do so - then you may have to contact your hosting provider and ask them to do it for you.
First of all, make sure that you have the correct permissions on your media dir, using the command line:
sudo chmod -R 775 [magento_root]/media
If it doesn't help, try to check your php config:
php -i | egrep 'upload_max_filesize|post_max_size|memory_limit'
If you see small values there, you probably need to raise these limits by editing your php.ini file. You can find this file by running the following command:
php -i | grep php.ini
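(php --ini also prints all the loaded configuration files directly, which helps when several ini files are in play:)
$ php --ini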
Also, do not forget to restart your Apache/PHP services after the config changes have been made. Usually, you can do that by running:
sudo /etc/init.d/apache2 restart
or
sudo service apache2 restart
Also, I noticed that sometimes mod_security can cause this kind of issue. Check your [magento_root]/.htaccess file for the following configuration, and add it if it's absent:
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
And, the last thing: try to upload images from another browser/computer. Magento uses a Flash uploader for product images, and we have seen cases where the Flash player caused similar issues on some computers.
You have to change both post_max_size and upload_max_filesize in the php.ini
And don’t forget to restart your server afterwards.
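If PHP runs under PHP-FPM rather than as an Apache module, restart that service as well (the exact service name varies by PHP version and distro; php7.0-fpm is just an example):
$ sudo service php7.0-fpm restart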