I am trying to put my script file (script.sh) in the /var/lib/cloud/scripts/per-boot/ directory so that cloud-init runs the script every time my VPS boots.
The problem is that the script is never executed when the VPS boots, not even once.
My script file looks like this (it is just a test file):
#!/bin/sh
echo "Hello User !!!"
mkdir /tmp/test
Is there any mistake I am making?
Thanks for your help!
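One thing worth checking: cloud-init only executes files in that directory if they are marked executable, so a likely first fix (using the path from the question) is:
# give the script the executable bit so cloud-init's per-boot module will run it
chmod +x /var/lib/cloud/scripts/per-boot/script.sh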
Related
I have a Perl script file that was running fine in crontab, but suddenly it stopped running without any modification.
cd /home/user/public_html/crons && ./script.pl 2>&1 >/dev/null
The first line of the script is #!/usr/bin/perl -X
The expected output of this script is changes in the database
I have another script file with the same modification, and it still works fine
When I run the file in the browser, it works fine and executes all lines without any problem
I tried the full path /usr/bin/perl, but it didn't work
I tried putting perl at the beginning of the command, but it didn't work
I ran the command over SSH using PuTTY, but nothing happened
I checked the log file /var/log/cron, but there are no errors at all
I created a temporary log file with cd /home/user/public_html/crons && ./script.pl > /tmp/temp.log 2>&1 to see the errors, but the log is empty
Here is the solution:
I found the issue. There was a stuck process for the same cron file, so I killed that process and it is fixed now.
You can find your script's process like this:
ps aux | grep 'your cron file here'
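If a stale process turns up, a follow-up along these lines would kill it (assuming the script name from the question; pkill -f matches against the full command line):
# find and kill any stuck instance of the cron script
pkill -f script.pl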
This is a really common antipattern people tend toward with cron.
Cron sends you an email with the output of your script, if it generates any output. People often redirect output to /dev/null to prevent cron from sending that email. This is bad, because now the output of your script is lost entirely. Even if the script has some built-in logging, it might generate errors before it opens the log file, and those are lost. It also might crash in a way that never reaches the logging mechanism.
At a bare minimum, you should just remove 2>&1 >/dev/null to start receiving the email. (Also test your mail setup using a temporary cron job like 1 * * * * echo "Test".)
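As a sketch, that test entry in a crontab could look like this (MAILTO is standard cron; the address is a placeholder):
MAILTO=you@example.com
# runs at minute 1 of every hour; cron should mail you "Test" each time
1 * * * * echo "Test"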
The next better solution is to change it to >> /var/log/myscript/current.log, set up something to rotate the log files (like logrotate), and make sure the directory is created with permissions that let the script's user write to it. By redirecting only STDOUT of the script, any errors or warnings it writes to STDERR still cause you to get an email, and when there are no errors or warnings the output goes to the log file and no email gets sent.
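A sketch of that setup, reusing the paths above (the rotation policy is just an example):
# crontab: append STDOUT to the log; STDERR still reaches you by mail
0 * * * * cd /home/user/public_html/crons && ./script.pl >> /var/log/myscript/current.log

# /etc/logrotate.d/myscript
/var/log/myscript/current.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}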
Neither of those changes solve the root problem though, which is that when cron runs your script it does so with a different environment than you have on the command line. What you really want is a way to run the script with a consistent environment, and log it. The "ultimate solution" is to define your task in some kind of service manager, and then use cron to occasionally start it. For instance, you could use systemd and define a service that doesn't restart, then use systemctl start my_custom.service in your cron job. Now you can test independent of cron, and your tests will have the same exact environment, and be logged by the service manager. As extra bonuses, you are protected from accidentally running your script twice at once, and you get a clean way to stop a running cron job without the danger of stale pid files.
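A minimal sketch of that systemd approach, using the service name from the paragraph above and the script path from the question (the User= value is an assumption):
# /etc/systemd/system/my_custom.service
[Unit]
Description=Run the cron task under systemd

[Service]
# oneshot with no Restart= means the unit runs once and stops, as described above
Type=oneshot
User=user
ExecStart=/home/user/public_html/crons/script.pl

# root's crontab entry: cron only triggers the service
0 * * * * systemctl start my_custom.service
You can then test by running systemctl start my_custom.service by hand and read the output with journalctl -u my_custom.service.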
I don't particularly advocate systemd myself, but thankfully there are lots of alternatives:
Runit: http://smarden.org/runit/runsvdir.8.html
S6: https://skarnet.org/software/s6/
Perp: http://b0llix.net/perp/site.cgi?page=perpd.8
(but installing and configuring a service manager is a bigger task than just using systemd if your distro is based on systemd) Each of these allows you to define a service that doesn't restart. Then you use a shell command to issue a "run once" directive to the supervisor, which runs the task as a child. Now you can easily launch the jobs yourself and see all the errors in the log, and then add that command to the crontab and know that it will run identically when cron starts it.
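With runit, for example, the "run once" directive looks like this (the service name my_task is hypothetical; sv is runit's control command):
# start the service once and do not restart it when it exits
sv once /etc/service/my_task

# the same command can then be put in the crontab
0 * * * * sv once /etc/service/my_task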
Back to your original problem: once you get some logging, you are likely to discover it is a permission problem or an upgraded module in the system perl.
The path to my Perl script is
public_html/Staging/ff/sendmail.pl
The command to run the Perl script in the cPanel cron job is
cd /public_html/Staging/ff; perl sendmail.pl >/dev/null
It is set to run every minute, but it is not working. What could be the issue? And how do I stop the same process?
While my answer is extremely late to the party:
As of cPanel 102 (and various releases before it), the reason the cron job does not run is that cPanel crons do not "queue" commands; you can only run one command at a time per cron.
Therefore:
perl /public_html/Staging/ff/sendmail.pl >/dev/null
will work as your cron command, since it uses an absolute path and removes the need to change directory.
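In cPanel the schedule is set in its own fields, so only the command itself goes in the command box; the raw crontab equivalent would look roughly like this (the leading /home/user is an assumption, since cron jobs rarely live at the filesystem root):
# run every minute; absolute path, no cd needed
* * * * * perl /home/user/public_html/Staging/ff/sendmail.pl >/dev/null 2>&1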
I am new to Cygwin and don't really understand how its scripting works. Currently I am running it on Windows 7 and using Task Scheduler to do this inefficiently.
What I want to do is run an existing .bat file that runs tests on the command line, then take the results of those tests and email them to people.
Some side notes:
1. It doesn't HAVE to be a batch file; from my reading, I think a .sh file would be easier to run with bash. Being able to run it on CentOS would be even better, so that others can run it if I leave.
2. This needs to run daily. I would like to run the batch file at around 10 am and give it an hour until the emailed results are sent, unless you can trigger the email when the .bat is done.
3. Every time this .bat file runs, it saves the results to a .htm file, overwriting the previous one.
Thank you
This could go in the crontab of a CentOS server (/etc/crontab):
0 10 * * * user cd /path/ && /bin/bash file.sh >> result_file
Is that what you needed? Also, you can install cron as a Windows service with cygrunsrv.
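The Cygwin side of that would be roughly as follows (-I installs a service, -p gives the program path, -a passes its arguments; -n keeps cron in the foreground as a service expects):
# install and start cron as a Windows service under Cygwin
cygrunsrv -I cron -p /usr/sbin/cron -a -n
cygrunsrv -S cron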
TASK TO BE ACCOMPLISHED:
To schedule a Perl script to be executed at a specific time/day of the week.
THINGS I HAVE DONE:
In Scheduled Tasks, I have created a new task that calls a batch file with the contents below:
cd "DRIVE\FOLDER\Hummingbird\Connectivity\14.00\Exceed\"
ABCD.xs
cd mDrive/bin
perl baseline.pl -publish -location XXX -email
THINGS NOT WORKING FOR ME / CAUSING THE ISSUE:
When I run the scheduled task, the prompt opens the ABCD.xs Exceed file in a separate window, but the commands below are executed in the command prompt itself.
EXPECTED OUTPUT:
I want the commands
cd mDrive/bin
perl baseline.pl -publish -location XXX -email
to be executed in the Exceed window.
Any kind of solution would be great.
Thanks in advance.
Haresh
Sounds like you need to start getting into either SendKeys automation (Win32 packages) or else look into writing Exceed/Hummingbird scripts and just executing those.
Some other things to look into... does the remote server have a telnet or ssh server running? Or are there other methods of executing code on the remote server?
For example, my work's mainframe is accessed via a Hummingbird terminal emulator, but I can also telnet to the mainframe and execute commands as well as FTP batch job directly into the JES spool. So when I execute things on the mainframe by way of my PC (Perl scripts, etc.), I don't even fool with Hummingbird.
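As a sketch of that route: if the remote server does run sshd, the scheduled batch file could skip the Exceed window entirely and do something like this (user and remotehost are placeholders; the command is the one from the question):
# run the publish command remotely over ssh instead of through the emulator
ssh user@remotehost "cd mDrive/bin && perl baseline.pl -publish -location XXX -email"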
Good luck...
Hopefully a quick question: I have a .sh script that runs a PHP script. The PHP script takes some time to complete, and I want the .sh script to proceed without waiting.
Is that possible? And if so, how so?
Have you tried php somescript.php &? The & at the end lets the sh script continue executing while PHP runs.
If you want the php script to outlive the shell script, try this:
nohup php somescript.php >/dev/null 2>&1 &
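A small usage sketch combining both ideas (somescript.php is the placeholder name from the question; the final wait is optional):
#!/bin/sh
# start the long-running PHP job in the background, detached from the terminal
nohup php somescript.php >/dev/null 2>&1 &
PHP_PID=$!

# ...the rest of the .sh script proceeds here while PHP runs...

# optionally block at the end until the PHP job finishes
wait "$PHP_PID"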