I have a Magento installation running on a CentOS VPS. I've been trying to implement a backup solution using the script found here: https://github.com/zertrin. It worked fine, and my next step was to automate it. Despite all my efforts, the cron job is not running. Here is what I have in /etc/crontab:
* 20 * * * root echo "Cron Worked $(date)" >> /tmp/cronworked.txt
#
* 16 * * 1-6 root /root/duplicity-backup.sh -c /etc/duplicity-backup.conf -b
#
* 4 * * 7 root /root/duplicity-backup.sh -c /etc/duplicity-backup.conf -f
#
* 20 * * 7 root /root/duplicity-backup.sh -c /etc/duplicity-backup.conf -n
#
* 20 * * * root echo "Cron Worked $(date)" >> /tmp/cronworked3.txt
Both my test cron jobs (the first one and the last one) work fine, but not the commands in the middle. They work fine if I issue them as standalone commands, but for some reason not as cron jobs.
Can anyone guide me to figure out why this is not working?
There are a couple of things you can check:
Make sure /root/duplicity-backup.sh is executable
If you have a local mail server configured, you should receive an email with the output of the cron jobs, which might tell you what's going wrong
If you don't receive emails from the cron job, then redirect stdout AND stderr to a file. That should help figure out what's going wrong
Add bash in front of the script name to make sure it's running with bash and not some other shell, like this:
* 4 * * 7 root bash /root/duplicity-backup.sh -c /etc/duplicity-backup.conf -f
Having the script's output and error messages should help. If they don't, please paste them here.
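As a minimal sketch of the stdout/stderr redirection mentioned above (the log path /tmp/duplicity-cron.log is just an example), note that `2>&1` must come after the file redirection so stderr follows stdout into the same log:

```shell
#!/bin/sh
# Send both stdout and stderr to one log file; note the order:
# the file redirection first, then 2>&1.
{ echo "regular output"; echo "an error" >&2; } >> /tmp/duplicity-cron.log 2>&1
```

In the crontab entry itself that would be, e.g., `* 16 * * 1-6 root /root/duplicity-backup.sh -c /etc/duplicity-backup.conf -b >> /tmp/duplicity-cron.log 2>&1`.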
Related
I'm new to Ubuntu and programming.
I'm testing a program that I found on GitHub to download and import OSM data into PostGIS.
It works when I run it from the terminal (URL and name are fake):
make all NAME=dbname URL='http://myurl'
using the postgres user.
Now I need to run this command every day.
So I wrote this script:
#!/bin/bash
# go to the directory with Makefile
cd /PCuserhome/directory/to/Makefile/
# run Makefile
make all NAME=dbname URL='http://myurl'
and it works when I run it from the terminal.
So I added it to the crontab (of the postgres user) this way:
0,15,30,45 * * * * /PCuserhome/myscript.sh
It creates the db but probably fails when running the osmosis selection (osmosis is in the PATH for all users).
Any idea how to solve this? Thank you!
Crontab commands are executed with only a minimal set of environment variables, i.e.
PATH=/usr/bin:/bin (on Debian, anyway),
so if you are relying on programs that are found via your $PATH, it will fail.
Consider specifying an absolute path to the osmosis program wherever it's called from.
Alternatively, you can change $PATH itself in your script:
export PATH="/my/bin:$PATH"
P.S.: you can check the environment by adding a simple cron job:
* * * * * env > /tmp/env.txt
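For example, a sketch of the in-script approach (/opt/osmosis/bin is an assumed install directory, not something from your setup; find the real path with `which osmosis` from your login shell):

```shell
#!/bin/bash
# Prepend an assumed osmosis install directory to PATH inside the script
# itself, so cron's minimal environment can still find the binary.
# /opt/osmosis/bin is a placeholder -- check the real path with `which osmosis`.
export PATH="/opt/osmosis/bin:$PATH"
echo "$PATH"
```

With that at the top of myscript.sh, the `make all ...` line should resolve osmosis the same way it does from your terminal.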
I have a raspberry pi with NOOBS. I am trying to run a script every 5 minutes with crontab. It is not working. So I made a test command and added it to my crontab file.
I typed "crontab -e"
then added "* * * * * date >> /Documents/crontab logs/crontab_test_log.txt"
My understanding is that this should get the date and time every minute and append it to a test log file. After rebooting the Pi and waiting tens of minutes, nothing is happening. What am I doing wrong?
Thanks for your help.
I am able to execute the script from the command line.
I'm executing it like this:
/path/to/script run
But when executing it from cron like below, the page is not coming back:
55 11 * * 2-6 /path/to/script.pl run >> /tmp/script.log 2>&1
The line which is getting a webpage uses LWP::Simple:
my $site = get("http://sever.com/page") ;
I'm not modifying anything. The page is valid and accessible.
I'm getting an empty page only when I execute this script from crontab. I am able to execute it from the command line!
The crontab is owned by root, and the job is executed as root.
Thanks in advance for any clue!
It's difficult to say what might be causing this, but there are differences between your environment and the environment created by cron.
You could try running it through a shell with appropriate args to construct your user environment:
55 11 * * 2-6 /bin/tcsh -l /path/to/script.pl run >> /tmp/script.log 2>&1
I'm assuming you are running it via cron with your own user ID, of course. If you aren't, then obviously you should try running it manually with the user ID that cron uses to run it.
If it's not a difference in environment variables (e.g. those that specify a proxy to use), I believe you are running afoul of SELinux. Among other things, it prevents background applications (e.g. cron jobs) from accessing the internet unless you explicitly allow them to do so. I don't know how to do so, but you should be able to find out quite easily.
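To check the environment theory before digging into SELinux, you can reproduce cron's near-empty environment by hand with `env -i`, which starts a command with no inherited variables (the commented line shows how you would run your own script this way):

```shell
#!/bin/sh
# env -i clears the environment, much like cron's minimal one.
# HOME (and any proxy variables such as http_proxy) are unset in the child:
env -i sh -c 'echo "HOME=[$HOME]"'
# To exercise the real script under the same conditions:
#   env -i sh -c '/path/to/script.pl run'
```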
I have a shell script which in turn invokes a Perl script. The Perl script has mail-sending functionality. The script runs very well when I run it manually from the command prompt and also delivers the mail, but when scheduled from crontab, both the shell and Perl scripts run (as per the log), yet the mail is not getting delivered.
Please find the code snippets below.
Shell script: rmail.sh
#!/bin/sh
. /home/pm_prod/.bash_profile
export PATH=$PATH:/home/orapps/client/oracle/product/10.2.0/client_1/bin:/usr/kerberos/bin:/data2/software/oracle/product/10.2.0/client_1/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home/pm_prod/bin/:
perl /home/pm_prod/PM/bin/ALERT/rmail.pl
Perl script: rmail.pl
#!/usr/bin/perl -w
use strict;
use Mail::Sender;
# Send the file. Change all variables to suit
my $sender = new Mail::Sender
{
smtp => 'some.smtpserver.com',
from => 'somename@somedomain.com'
};
$sender->MailFile(
{
to => 'somename@somedomain.com',
subject => 'File',
msg => "Here is the data.\n",
file => '/home/pm_prod/PM/bin/ALERT/attachement_file.txt',
});
CronTab Entry
* * * * * sh /home/pm_prod/PM/bin/ALERT/rmail.sh
Please help me
Try this in cron:
* * * * * /bin/sh /home/pm_prod/PM/bin/ALERT/rmail.sh
Or
* * * * * /usr/bin/sh /home/pm_prod/PM/bin/ALERT/rmail.sh
The issue is with the environment variables: we have to make sure that all the environment variables present when the script is run manually are also available from the crontab. I followed the steps below to achieve that.
1. Get the current ENV variables from the normal user and put them into a file:
env > /home/pm_prod/workspace/pmenv
2. Copy the content of the pmenv file into my rmail.sh script.
3. Now schedule the rmail.sh script in crontab.
Note: if you don't want to keep testing the script through crontab itself, you can optionally create a cron-type environment with the commands below and test there before actually scheduling, as mentioned in step 3:
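A sketch of what step 2 can amount to: instead of pasting the variables in literally, you could source the captured file with auto-export. The paths below are illustrative stand-ins (the demo captures only PATH; this assumes the file holds simple VAR=value lines, since values with spaces or newlines would need quoting):

```shell
#!/bin/sh
# Demonstrate the capture-and-apply idea from steps 1 and 2.
env | grep '^PATH=' > /tmp/pmenv_demo   # step 1: capture (just PATH here)

set -a                                  # auto-export everything assigned below
. /tmp/pmenv_demo                       # step 2: apply the captured variables
set +a

echo "PATH restored: $PATH"
# In rmail.sh proper, the perl call would follow:
#   perl /home/pm_prod/PM/bin/ALERT/rmail.pl
```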
* * * * * env > /home/pm_prod/workspace/cronenv
env - `cat /home/pm_prod/workspace/cronenv` /bin/sh
Raghu
I am using CakePHP 1.3 and was able to successfully set up a cron job to run shells using the example given in the CakePHP Book.
*/5 * * * * /full/path/to/cakeshell myshell myparam -cli /usr/bin -console /cakes/1.2.x.x/cake/console -app /full/path/to/app >> /path/to/log/file.log
This outputs the results into a log file, but I want to receive an email when there is an error so I can try to resolve the problem.
I tried the following with no luck:
If I remove the >> /path/to/log/file.log, then even a successful run is emailed.
> /dev/null; my assumption was that it would send successful output to /dev/null and errors to email.
1> /dev/null, another variation of the previous attempt.
Any help is appreciated.
Thanks
Huseyin,
This is not a CakePHP error then, and it is perhaps a question better suited for Server Fault, as you would script your solution.
Bash's built-in facilities are up to the task; try the Linux Documentation Project's introductory tutorials on shell scripting, and man bash.
Your solution basically has to use a temporary file or variable in which you store the output of the last cron job run. If there is an error:
cat THE_TMP_FILE | mail -s "Error from Huseyin's server" huseyin@fancy_domain.com
else:
cat THE_TMP_FILE >> blah.blah.log
Unfortunately, you need an MTA available in order for the mail command to work. If you do not have access to the mail command, then you can set up another cron job, scheduled shortly after the first, which simply runs: if [ -e THE_FILE_CONTAINING_THE_LAST_ERROR ]; then { echo $(cat THE_FILE_CONTAINING_THE_LAST_ERROR); rm -v THE_FILE... ; }; fi
Of course this is not working code, but it's pretty close, so you'll get the idea.
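A hedged sketch of that idea as a wrapper script (the log path, recipient address, and availability of `mail` are all assumptions about your setup, and the `echo` stands in for the real cakeshell command):

```shell
#!/bin/sh
# Run the job, capture combined output, then: append to the log on success,
# mail the output on failure. Replace the echo with the real cakeshell line.
LOG=/tmp/cakeshell_demo.log
TO="huseyin@example.com"                # placeholder recipient

TMP=$(mktemp)
if echo "shell ran fine" > "$TMP" 2>&1; then
    cat "$TMP" >> "$LOG"                # success: quiet, just logged
else
    mail -s "Error from Huseyin's server" "$TO" < "$TMP"
fi
rm -f "$TMP"
```

Pointing the crontab at a wrapper like this (with the log redirection removed from the crontab line itself) should give you mail only on failure.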