Busybox wget strange behaviour

I need some help with a really strange problem.
I use wget (busybox) to obtain the IP addresses of some remote clients and use them on a DNS server (a sort of "homemade DDNS"). Those clients run a script that calls the following every 5 minutes:
wget -O /dev/null "https://my_dns.org/poll.php?user=User_N&pwd=password_N"
Everything was fine until I updated my HTTP server to disable TLS 1.0/TLS 1.1.
After the update, running the above command manually on a client's console still works OK, but when it runs automatically (launched from a script in /etc/init.d) I get this error:
Connecting to my_dns.org (www.xxx.yyy.zzz:443)
wget: error getting response: Connection reset by peer
Any idea why this happens, and how to fix it?
(The shell on the clients runs as root...)
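For reference, one way to check which TLS versions the server still accepts from a client, assuming openssl is available there (busybox wget itself reports little TLS detail; my_dns.org is the placeholder host from above):
# attempt a TLS 1.2-only handshake; </dev/null stops s_client from waiting on stdin
openssl s_client -connect my_dns.org:443 -tls1_2 </dev/null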
Thank you in advance for your help
Regards

Related

Secure Socket Agent Proxies listener Error

I am trying to run the SSA server using the ssaserver & command, but since yesterday I have been getting this error on my server machine:
Error creating listening socket at [host name : port number] the network address may be in use.
I really don't know why it shows this error or how the address can be in use.
My OS is RedHat 7.
This most likely happens because you closed your terminal or logged out of the current user account in RedHat, leaving a previous ssaserver instance holding the address.
To solve this issue, first use this command:
ps -ef | grep ssaserver
This will show you whether ssaserver is already running; from the output you can get its process ID and the IDs of its child processes.
You can then kill that process ID using the kill command:
kill <pid>
Here is a link showing how to use the kill command: Link_1
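As a shortcut, assuming pgrep is available on your system (it ships with RedHat 7), you can look up the PID and kill the process in one step:
# find the PID of ssaserver by name and send it the default SIGTERM
kill "$(pgrep ssaserver)"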
But you may face this problem again and again, so I think it would be best to use the nohup command to run the service you need.
Note: nohup is a command used to run a process (job) on a server and have it continue after you have logged out or otherwise lost the connection to the server.
Such as:
nohup ssaserver&
Here is an extra link with nohup examples: Link_2
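For example, a sketch that also keeps the server's output, assuming the current directory is writable:
# keep ssaserver running after logout, appending stdout and stderr to a log
nohup ssaserver >> ssaserver.log 2>&1 &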

Problem creating REST API With JSON Server

I am going through this tutorial to set up JSON Server: https://www.youtube.com/watch?v=x3NAo8zqdmo
I am able to install it; however, when I run the command json-server --watch db.json, I keep getting this error:
events.js:183
      throw er; // Unhandled 'error' event
I googled the error; there is nothing running on port 3000. I even rebooted my machine, but the error did not go away.
I reinstalled json-server using my company's proxy settings, since I had to do that for the Angular CLI installation, but the error is still not going away. Any tips?
I was able to resolve the issue by changing the port with the command:
json-server -p 4000 db.json
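For anyone who wants to double-check whether some process really is holding port 3000 (this Node error usually wraps EADDRINUSE), a quick check on a Unix-like system, assuming lsof is installed:
# list any process listening on TCP port 3000
lsof -i :3000
On Windows, netstat -ano | findstr :3000 gives the equivalent information.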

rsync: failed to connect to hgdownload.cse.ucsc.edu (128.114.119.163)

For reasons I don't understand, rsync has stopped working for me as of today.
I tried to fix the problem by following the steps described here: https://askubuntu.com/questions/628102/15-04-rsync-connection-refused
but it did not work...
I am working on a laptop running Ubuntu 16.04, with a wired Ethernet connection.
I made several queries to the UCSC server yesterday, so maybe their server is blocking my connection?
Example of a query I would like to run:
rsync -a -P rsync://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz ./
Here is the error message:
rsync: failed to connect to hgdownload.cse.ucsc.edu (128.114.119.163): Connection refused (111)
rsync error: error in socket IO (code 10) at clientserver.c(128) [Receiver=3.1.1]
Tell me what you think and whether there is any solution to make it work again.
Thank you in advance for your help.
Edit:
UCSC answered me: there is no problem on their side. The problem definitely comes from me. Still looking for a solution.
It looks like the rsync daemon is not running on the remote host, or perhaps it is running on a non-standard port (default is 873).
It's also possible that connections are being blocked by a firewall. Were your earlier successful connections made from the same location, or are you now testing from elsewhere?
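Two quick checks you could run from the client, assuming netcat is installed: test whether anything answers on the rsync port at all, and ask the daemon to list its modules (an rsync daemon prints its module list when given just a host followed by ::):
# test TCP reachability of the default rsync port
nc -vz hgdownload.cse.ucsc.edu 873
# ask the rsync daemon for its module list
rsync hgdownload.cse.ucsc.edu::
If the port is unreachable, that points at the network or a firewall rather than at rsync itself.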
As a workaround you can access the files via HTTP using any browser, or HTTP command-line tools such as curl, wget, et al. Either of these should work:
$ curl http://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz -o cytoBand.txt.gz
$ wget http://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz
Since rsync was still not working (I tweeted at UCSC to ask about it),
I decided to use the curl and wget tools that mhawke advised, but using the FTP addresses instead of HTTP (slower):
curl ftp://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz -o cytoBand.txt.gz
wget ftp://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz
If anyone has a solution for the rsync problem, or any information suggesting that the problem could come from the UCSC server, I would be thankful.
Cheers!
Edit:
I received an answer from UCSC: there is no problem with rsync on the UCSC server side. So the problem definitely comes from me. I tried on 3 different computers in different places, all running Ubuntu 16.04. I am still looking for a solution.
Thanks to an email answer from someone working at UCSC,
I tried the same command line but on the mirror site ftp://hgdownload-sd.cse.ucsc.edu,
which gave this command line:
rsync -a -P rsync://hgdownload-sd.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz ./
I tried it. It worked.
And here comes the best part: I then retried the command line that did not work... and now it works again:
rsync -a -P rsync://hgdownload.cse.ucsc.edu/goldenPath/hg38/database/cytoBand.txt.gz ./
So I never got any explanation of where the problem came from. But problem solved!

Rye::Box commands failing on remote server

Firstly, I can ssh into the remote server and execute the following commands:
cd public_html
du -sh
each successful and exiting with code 0.
When automating the process with Rye::Box (with the option safe: false),
rbox.cd :public_html
does change the directory but also returns exit code -1, and
rbox.execute 'du -sh'
fails with the error message "SocketError::getaddrinfo: Name or service not known".
Would appreciate an explanation if possible.
Check your hosts entry for 127.0.0.1.
You might have to add your machine's hostname to the 127.0.0.1 line in /etc/hosts.
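For example, assuming the machine's hostname is myhost (hypothetical; substitute the output of the hostname command), the /etc/hosts line would look like:
# map the local hostname to the loopback address
127.0.0.1   localhost myhost
getaddrinfo returns "Name or service not known" when a hostname cannot be resolved, which is likely what Rye::Box is tripping over here.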
A similar question addresses this issue on SO.
See also
SocketError (getaddrinfo: Name or service not known) - Sunspot/Solr Rails development

Giving matlab code sudo permissions? Unconnectable-connected ftp port

I'm trying to use this code to pull a bunch of data from the ModelNet database located at vision.cs.princeton.edu. I'm using the already-written MATLAB code from the website itself; however, I'm encountering permission errors whenever I run the code, because wget (which the code uses) is located in a restricted directory on the server. Normally I would just use sudo; however, I can't seem to run sudo matlab as a command. My question is: does anybody know a way to remotely run MATLAB code on a server and somehow give it the permissions that sudo normally would? Also, could someone try FTPing to vision.cs.princeton.edu at port 80? For some reason I'm able to connect to that port, but the connection seems to be closed, and I can't ping that address either; I get 100% packet loss.
Thanks
Use urlread instead of wget; this should fix your issues.
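A minimal sketch of the substitution, assuming the code currently shells out to wget to fetch files (the URL below is a hypothetical placeholder):
% read the file contents into a string with MATLAB's built-in urlread
data = urlread('http://vision.cs.princeton.edu/some/file.txt');
% or save it straight to disk with urlwrite
urlwrite('http://vision.cs.princeton.edu/some/file.txt', 'file.txt');
Both functions ship with MATLAB, so no external wget binary (and therefore no sudo) is needed.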