run and stop perl script from cron job - cpanel - perl

My path to the perl script is
public_html/Staging/ff/sendmail.pl
The command to run the Perl script in the cPanel cron job is
cd /public_html/Staging/ff; perl sendmail.pl >/dev/null
It is set to run every 1 minute, but it is not working. What could be the issue? And how do I stop the same process?

While my answer is extremely late to the party:
As of cPanel 102 (and various releases before it), the reason the cron job does not run is that cPanel crons do not "queue" commands; you can only run one command at a time per cron entry.
Therefore:
perl /public_html/Staging/ff/sendmail.pl >/dev/null
will work as your cron command, using absolute pathing and removing the need to change directory.
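For completeness, a hedged sketch of what the full cron line would look like with the absolute path (the /home/youruser prefix is an assumption; in cPanel's editor the schedule and the command go in separate fields, so there you would enter only the command portion):
* * * * * /usr/bin/perl /home/youruser/public_html/Staging/ff/sendmail.pl >/dev/null 2>&1
To stop a copy that is already running, find it with ps aux | grep '[s]endmail.pl' and kill the PID it reports.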

Related

Perl script file run manually but not in crontab

I have a Perl script file that was running fine in crontab, but suddenly it stopped running without any modification.
cd /home/user/public_html/crons && ./script.pl 2>&1 >/dev/null
The top of the script file is #!/usr/bin/perl -X
The output expected from this script is changes in the database.
I have another script file with the same setup and it still works fine.
When I run the file in the browser it works fine and executes all lines without any problem.
I tried the full path /usr/bin/perl but it didn't work.
I tried putting perl at the beginning but it didn't work.
I ran the command over SSH using PuTTY but nothing happened.
I checked the log file /var/log/cron but there are no errors at all.
I created a temporary log file with /home/user/public_html/crons/script.pl > /tmp/temp.log 2>&1 to see the errors, but the log is empty.
Here is the solution:
I found the issue: there was a stuck process for the same cron file, so I killed that process and it is fixed.
You can find the process for your file like this:
ps aux | grep 'your cron file here'
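A hedged example of that kill step (the PID 12345 is a placeholder for whatever ps actually reports):
ps aux | grep '[s]cript.pl'     # the [s] bracket trick keeps grep from matching itself
kill 12345                      # replace 12345 with the PID from the ps output
kill -9 12345                   # only if the process ignores the normal TERM signal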
This is a really common antipattern people seem to tend toward with cron.
Cron sends you an email with the output of your script, if it generates any output. People often redirect output to /dev/null to prevent cron from sending the email. This is bad because now the output of your script is lost entirely. Even if the script has some built-in logging, it might generate errors before it gets the log file opened and those are lost. It also might crash in a way that doesn't get written to the logging mechanism.
At a bare minimum, you should just remove 2>&1 >/dev/null to start receiving the email. (and also, test your mail setup using a temporary cron job like 1 * * * * echo "Test" )
The next better solution is to change it to >> /var/log/myscript/current.log and then also set up something to rotate the log files (like logrotate) and also make sure to create that directory with permissions that the script user is allowed to write to it. By only redirecting STDOUT of the script, any errors or warnings it writes to STDERR cause you to get an email, and if there are no errors/warnings the output goes to the log file and no email gets sent.
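As a hedged illustration of that setup (the directory, log name, and user below are assumptions for the example, not taken from the question):
sudo mkdir -p /var/log/myscript
sudo chown cronuser: /var/log/myscript        # the user the cron job runs as must be able to write here
sudo tee /etc/logrotate.d/myscript >/dev/null <<'EOF'
/var/log/myscript/current.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}
EOF
and the cron entry then becomes something like
cd /home/user/public_html/crons && ./script.pl >> /var/log/myscript/current.log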
Neither of those changes solve the root problem though, which is that when cron runs your script it does so with a different environment than you have on the command line. What you really want is a way to run the script with a consistent environment, and log it. The "ultimate solution" is to define your task in some kind of service manager, and then use cron to occasionally start it. For instance, you could use systemd and define a service that doesn't restart, then use systemctl start my_custom.service in your cron job. Now you can test independent of cron, and your tests will have the same exact environment, and be logged by the service manager. As extra bonuses, you are protected from accidentally running your script twice at once, and you get a clean way to stop a running cron job without the danger of stale pid files.
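A rough sketch of that arrangement, assuming systemd is available and the cron job has the privileges to start units; the unit name and paths are made up for illustration:
sudo tee /etc/systemd/system/my_custom.service >/dev/null <<'EOF'
[Unit]
Description=Run the sync script once when started

[Service]
Type=oneshot
User=cronuser
ExecStart=/usr/bin/perl /home/user/public_html/crons/script.pl
EOF
sudo systemctl daemon-reload
The crontab entry is then just
0 * * * * systemctl start my_custom.service
and the script's output is captured by the journal (journalctl -u my_custom.service).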
I don't particularly advocate systemd myself, but thankfully there are lots of alternatives:
Runit : http://smarden.org/runit/runsvdir.8.html
S6 : https://skarnet.org/software/s6/
Perp : http://b0llix.net/perp/site.cgi?page=perpd.8
(but installing and configuring a service manager is a bigger task than just using systemd if your distro is based on systemd) Each of these allows you to define a service that doesn't restart. Then you use a shell command to issue a "run once" directive to the supervisor, which runs the task as a child. Now you can easily launch the jobs yourself and see all the errors in the log, and then add that command to the crontab and know that it will run identically when cron starts it.
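Taking Runit as one concrete example, a rough sketch (the service name myjob and all paths are assumptions; some distributions supervise /etc/service rather than /var/service, and proper logging via svlogd is omitted for brevity):
sudo mkdir -p /etc/sv/myjob
sudo tee /etc/sv/myjob/run >/dev/null <<'EOF'
#!/bin/sh
exec 2>&1
exec /usr/bin/perl /home/user/public_html/crons/script.pl
EOF
sudo chmod +x /etc/sv/myjob/run
sudo touch /etc/sv/myjob/down              # the "down" file tells runit not to start or restart it on its own
sudo ln -s /etc/sv/myjob /var/service/
The crontab entry then becomes
0 * * * * sv once myjob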
Back to your original problem: once you get some logging, you are likely to discover it is a permission problem or an upgraded module in the system perl.

Not able to execute script from crontab

I am able to execute the script from the command line.
I'm executing it like this:
/path/to/script run
But while executing it from cron like below, the page is not coming:
55 11 * * 2-6 /path/to/script.pl run >> /tmp/script.log 2>&1
The line that gets the web page uses LWP::Simple:
my $site = get("http://sever.com/page") ;
I'm not modifying anything. The page is valid and accessible.
I'm getting an empty page only when I execute this script from crontab. I am able to execute it from the command line!
Crontab is owned by root. And job is executed as root.
Thanks in advance for any clue!
It's difficult to say what might be causing this, but there are differences between your environment, and the environment created by crontab.
You could try running it through a shell with appropriate args to construct your user environment:
55 11 * * 2-6 /bin/tcsh -l /path/to/script.pl run >> /tmp/script.log 2>&1
I'm assuming you are running it by cron with your own user ID of course. If you aren't, then obviously you should try running it manually with the user ID that cron is using to run it.
If it's not a difference in environment variables (e.g. those that specify a proxy to use), I believe you are running afoul of SELinux. Among other things, it prevents background applications (e.g. cron jobs) from accessing the internet unless you explicitly allow them to do so. I don't know how to do so, but you should be able to find out quite easily.
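As a hedged starting point for that check (these commands assume an SELinux-enabled Red Hat-style system with auditd running):
getenforce                                    # "Enforcing" means SELinux could be blocking the job
sudo grep denied /var/log/audit/audit.log | grep script.pl    # look for AVC denials mentioning your script
If a denial shows up, audit2allow (from the policycoreutils tools) can suggest a local policy module that permits the access.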

Use cygwin to run a batch file and email results

I am new to using cygwin and don't really understand how the scripting of it works. Currently I am running it on Windows 7 and using task scheduler to do this inefficiently.
What I want to do is to run a .bat file already made that runs tests in the cmd line and then take the results of that test and email them to people.
Some side notes:
1. It doesn't HAVE to be a batch file; from my reading I think a .sh would be easier to run with bash. Being able to run it on CentOS would be even better, so that others can run it if I leave.
2. This needs to run daily. I would like to run the batch file at around 10 am and give it an hour till the emailed results are sent, unless you can trigger the email when the .bat is done.
3. Every time I run this .bat file it saves the results to a .htm file and overwrites it every time the .bat is run.
Thank you
That could go in the crontab of a CentOS server (/etc/crontab):
0 10 * * * user cd /path/ && /bin/bash file.sh >> result_file
Is that what you needed? Also, you can install cron as a Windows service with cygrunsrv.
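For the emailing half, a hedged sketch of how the same /etc/crontab entry could send the results once the test run finishes (the address and subject are placeholders, and it assumes a working mail/mailx command on the box):
0 10 * * * user cd /path/ && /bin/bash file.sh > result_file 2>&1 && mail -s "Daily test results" someone@example.com < result_file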

Perl script works but not via CRON

I have a perl script (which syncs delicious to wp) which:
runs via the shell but
does not run via cron (and I don't get an error)
The only thing I can think of is that it reads the config file wrongly, but... it is defined via the full path (I think).
I read my config file as:
my $config = Config::Simple->import_from('/home/12345/data/scripts/delicious/wpds.ini',
\my %config);
(I am hosted on mediatemple)
Does anybody have a clue?
Update 1: HERE is the complete code: http://plugins.svn.wordpress.org/wordpress-23-compatible-wordpress-delicious-daily-synchronization-script/trunk/ (the only difference is that I have added the full path to the configuration file location, as above)
Update 2: cross-posted on https://forums.mediatemple.net/viewtopic.php?pid=31563#p31563
Update 3: the full path did the trick; solved.
The difference between a cron job and a job run from the shell is 'environment'. The primary difference is that your profile and the like are not run for a cron job, so any environment variables you have set in your normal shell environment are not set the same in the cron environment - no extensions to PATH, no environment variables identifying where Delicious and/or WP are hosted, etc.
Suggestion: create a cron job that simply reports the environment to a known file:
env > /home/27632/tmp/env.27632
Then see what is set in your own shell environment in comparison. Chances are, that will reveal the trouble.
Failing that, other environmental differences are that a cron job has no terminal, and has /dev/null for input and output - so interactive stuff does not work well.
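A hedged sketch of that comparison (the cron-side file name comes from the suggestion above; the shell-side capture and the diff are assumptions of mine):
env | sort > /tmp/env.shell
sort /home/27632/tmp/env.27632 > /tmp/env.cron
diff /tmp/env.cron /tmp/env.shell        # anything present only in env.shell is a candidate for what the cron job is missing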
It seems the problem is not in running perl, but in locating the Config library.
You should try:
perl -e 'print "@INC"'
and run a similar Perl one-liner from cron, and read the output;
it is possible that they differ.
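A hedged way to do that comparison (the temp file names are placeholders):
* * * * * perl -e 'print join("\n", @INC), "\n"' > /tmp/inc.cron 2>&1
perl -e 'print join("\n", @INC), "\n"' > /tmp/inc.shell
diff /tmp/inc.cron /tmp/inc.shell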
I suggest looking at my answer to How to simulate the environment cron executes a script with?
This is similar to Jonathan's answer but goes a bit further.
Based on your crontab, and depending on your installation, the problem might be the "perl". As others note the environment, particularly the $PATH variable, is different for cron. perl may not be in the path so you need to put the full path to perl in the cron command.
You can determine the path with the command $ type perl
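For example (the path shown is an assumption; use whatever your system actually reports):
$ type perl
perl is /usr/local/bin/perl
and then use that absolute path in the cron command, e.g.
* 13 * * * /usr/local/bin/perl /home/username/public_html/cron.pl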
I ran into the same problem ...
Perl script works but not via CRON => error: "perl: command not found"
... after an update from Plesk 12.0 to Plesk 12.5. But the existing answers were not very helpful for me.
It took some time, but then I found this thread in the Odin forum, which helped me: https://talk.plesk.com/threads/scheduled-tasks-always-fail.331821/
They suggest the following:
/usr/local/psa/bin/server_pref -u -crontab-secure-shell ""
That deletes the following line in the /var/spool/cron/crontabs files:
SHELL="/opt/psa/bin/chrootsh"
After that, my cron jobs run without any error.
(Ubuntu 14.04 with Plesk 12.5)
If the Perl script runs fine manually but not from crontab, then some environment setting needed by one of its packages is not getting through to cron. Suppose your cron entry looks like:
* 13 * * * /usr/bin/perl /home/username/public_html/cron.pl >/dev/null 2>&1
Run the script by hand with an empty environment to reproduce what cron sees:
env - /home/username/public_html/cron.pl
The errors it prints will show you the missing package or path; export that path in the PATH variable (or set it directly in the crontab).
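For instance, a hedged sketch of fixing it in the crontab itself (the values below are illustrative assumptions, not taken from the original post):
PATH=/usr/local/bin:/usr/bin:/bin
PERL5LIB=/home/username/perl5/lib/perl5
* 13 * * * /usr/bin/perl /home/username/public_html/cron.pl >/dev/null 2>&1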

Permission denied (publickey,keyboard-interactive)

I got the error "Permission denied (publickey,keyboard-interactive)" while trying to do a cvs checkout from Perl.
What is the issue and how do I resolve it?
Code:
system ( "CSVROOT:--- CVSRSH:--- cvs co a ");
# I have proper values in CVSROOT and CVSRSH.
It runs fine on its own, using an SSH key.
Steps to diagnose the error:
Are you using an SSH key?
Does that key have a passphrase?
Does it work when you run it by hand?
Is the script running as the same user as when you run it by hand?
Is the script running under the same environment as when you run it by hand? (e.g. cron jobs do not run under the same environment)
If you think all of the answers are yes, then most likely the last answer is really no. If the script is running from a scheduler like cron it most likely does not run with the same environment as when you run it by hand. The way I normally solve this is to use a shell script between the scheduler and the Perl script:
#!/bin/bash
source /home/USERNAME/.profile
# set any other environment variables it needs, for example:
export CVSROOT=:pserver:USERNAME@HOST:/path/to/repo
export CVS_RSH=ssh
/path/to/perl/script/script.pl
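Then make the wrapper executable and point the scheduler at the wrapper instead of the Perl script; a hedged sketch (the wrapper path, log file, and schedule are assumptions):
chmod +x /home/USERNAME/bin/cvs_wrapper.sh
0 2 * * * /home/USERNAME/bin/cvs_wrapper.sh >> /tmp/cvs_wrapper.log 2>&1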
Follow-up investigations after Chas.'s questions:
Does that command normally run under /bin/sh or some other shell?
To test, execute /bin/sh to start a Bourne shell and try the command by hand again.
I'm not familiar with the "CVSROOT:---" notation - is that meant to set the CVSROOT environment variable? In Bourne shell that is usually done using "="; I have never seen ":" used.
Does the command, when run by hand, expect some input from you? I have never seen cvs co do so, but I don't use it with ssh.
Try to add a redirect to the end of the command and look what's in the file after running:
system ( "CSVROOT:--- CVSRSH:--- cvs co a > /tmp/log_cmd 2>&1");