I am trying to back up Postgres databases with a cron job. The issue is that Postgres runs under the postgres user, and I don't think I can run a cron job under the ubuntu user. I tried to create a cron job under the postgres user and that did not work either. My script, if I log in as the postgres user, works just fine.
Here is my script:
#!/bin/bash
# Location to place backups.
backup_dir="/home/postgres-backup/"
# String to append to the name of the backup files
backup_date=$(date +%d-%m-%Y)
# Number of days you want to keep copies of your databases
number_of_days=30
databases=$(psql -l -t | cut -d'|' -f1 | sed -e 's/ //g' -e '/^$/d')
for i in $databases; do
  if [ "$i" != "template0" ] && [ "$i" != "template1" ]; then
    echo "Dumping $i to $backup_dir${i}_$backup_date"
    pg_dump -Fc "$i" > "$backup_dir${i}_$backup_date"
  fi
done
find "$backup_dir" -type f -prune -mtime +"$number_of_days" -exec rm -f {} \;
If I do
sudo su - postgres
I see
-rwx--x--x 1 postgres postgres 570 Jan 12 20:48 backup_all_db.sh
and when I do
./backup_all_db.sh
it gets backed up in /home/postgres-backup/
However, with the cron job it's not working, regardless of whether I add the cron job under postgres or under ubuntu.
Here is my cron job:
0,30 * * * * /var/lib/pgsql/backup_all_db.sh 1> /dev/null 2> /home/cron.err
I will appreciate any help.
Enable user to run cron jobs
If the /etc/cron.allow file exists, then users must be listed in it in order to be allowed to run the crontab command. If the /etc/cron.allow file does not exist but the /etc/cron.deny file does, then users must not be listed in the /etc/cron.deny file in order to run crontab.
In the case where neither file exists, the default on current Ubuntu (and Debian, but not some other Linux and UNIX systems) is to allow all users to run jobs with crontab.
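If /etc/cron.allow does exist on your system and the postgres user is not listed in it, you can add it, for example (just a sketch; the file may not exist at all on Ubuntu, and editing it requires root):
# add the postgres user to the whitelist of users allowed to use crontab
echo "postgres" | sudo tee -a /etc/cron.allow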
Add cron jobs
Use this command to add a cron job for the current user:
crontab -e
Use this command to add a cron job for a specified user (permissions are required):
crontab -u <user> -e
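For the original problem above, one way to install the job from the question into the postgres user's own crontab (assuming the script path from the question, and that /home/cron.err is writable by postgres):
sudo crontab -u postgres -e
# then add the line from the question:
0,30 * * * * /var/lib/pgsql/backup_all_db.sh 1> /dev/null 2> /home/cron.err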
Additional reading
man 5 crontab
Crontab in Ubuntu: https://help.ubuntu.com/community/CronHowto
Related
I'm trying to set up an automatic backup for a Postgres database. Postgres is running in Docker, so my backup script is:
docker-compose exec postgres pg_dump -U user database_name | gzip > "/var/server/my_service/data/backup-db/db_backup.sql.gz"
And it's working fine if I run it manually. I wrote the following job for the crontab (every 5 minutes, just for testing):
*/5 * * * * cd /var/server/my_service && sh /var/server/my_service/data/backup/backup_script
This command also works great; if I run it manually it creates valid DB backups that I can use.
But crontab just creates an empty archive, without any data. I just can't understand why.
My guess is that the output stream that gzip catches is generated normally in manual mode, but is completely empty when crontab tries to run the command.
I thought there were problems with access rights and put it in the root crontab, but it didn't help.
UPD:
so... the problem is in backup_script; the error in the logs says "the input device is not a TTY"
I tried googling it and adding -T, but that didn't help either.
Update your /var/server/my_service/data/backup/backup_script with the following:
Make these the first 3 lines of your script:
#!/bin/bash
source ~/.bash_profile
cd /var/server/my_service
#
# rest of your script
#
Your crontab line should be (At 04:44 on every day-of-month):
44 4 */1 * * /var/server/my_service/data/backup/backup_script
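For completeness, a sketch of what the whole backup_script might then look like, combining this prefix with the dump command from the question. The -T flag is an assumption on my part: it tells docker-compose exec not to allocate a pseudo-TTY, which matches the "input device is not a TTY" error from the update, since cron runs jobs without a terminal.
#!/bin/bash
source ~/.bash_profile
cd /var/server/my_service
# -T: no TTY allocation, because cron provides no terminal
docker-compose exec -T postgres pg_dump -U user database_name | gzip > "/var/server/my_service/data/backup-db/db_backup.sql.gz"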
Problem
When I tried to run a SQL file in the psql shell...
it gave a "No such file or directory" error!
$ ls
config.sql config.yaml
$ sudo -i -u postgres psql
postgres=# \i config.sql
config.sql: No such file or directory
Thanks for your reply!
Quick solution:
-i => goes to the user's home directory!
As a result, the relative path ./config.sql is incorrect!
Just use:
$ psql -U <user_name>
postgres=# \i config.sql
man sudo tells you:
-i, --login
Run the shell specified by the target user's password database entry as a login shell. This means that login-specific
resource files such as .profile, .bash_profile or .login will be read by the shell. If a command is specified, it is passed
to the shell for execution via the shell's -c option.
In particular, that will set your current working directory to the home directory of user postgres.
If you want to avoid that, don't use '-i'.
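Alternatively, if you want to keep sudo -i, give psql an absolute path instead. A sketch using psql's standard -f option ($PWD expands in your own shell before sudo switches to the postgres home directory; the postgres user must be able to read the file at that path):
# run the SQL file by absolute path, so the changed working directory does not matter
sudo -i -u postgres psql -f "$PWD/config.sql"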
I want to automate backup of a PostgreSQL database using crontab on UNIX. I have tried, but it creates a 0-byte backup.
My crontab entry is:
24 * * * * /home/desktop/myscript.sh
and my sh file contains the following code:
pg_dump -U PostgreSQL -d test > b.backup
It will create the file, but the file is empty. Is there any solution?
Don't assume that any environment variables are set in a cron job; be explicit:
/full/path/to/pg_dump -U postgres -d test > /full/path/to/b.backup
Look for mail in your inbox for failure reports from cron.
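For example, an explicit crontab entry could look like this (the pg_dump location and output paths are only illustrative; check them with which pg_dump, MAILTO is a placeholder, and it only works if local mail delivery is set up):
MAILTO=your_user
24 * * * * /usr/bin/pg_dump -U postgres -d test > /home/desktop/b.backup 2>> /home/desktop/pg_dump.err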
You must specify the full path to pg_dump:
#!/bin/bash
BKPDATE=$(date +%d.%m.%Y-%H:%M:%S)
cd /var/lib/pgsql/12/backups
/usr/pgsql-12/bin/pg_dump -Fc --verbose dl_db > DBNAME_$BKPDATE.dmp 2> LOG_$BKPDATE.log
Or add PostgreSQL's bin directory to the PATH, as below:
vi /var/lib/pgsql/.pgsql_profile
export PATH=$PATH:/usr/pgsql-12/bin
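Keep in mind that cron itself will not read .pgsql_profile (it is typically sourced only by the postgres user's interactive login shell on RPM-based installs), so for the cron case you still need the PATH either in the crontab or in the script, for example:
# at the top of the crontab:
PATH=/usr/pgsql-12/bin:/usr/bin:/bin
# or at the top of the backup script:
export PATH=$PATH:/usr/pgsql-12/bin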
I use docker-compose, which brings up a stack.
Relevant code:
db:
  build: ./dockerfiles/postgres
  container_name: postgres-container
  volumes:
    - ./dockerfiles/postgres/pgdata:/var/lib/postgresql/data
    - ./dockerfiles/postgres/backups:/pg_backups
Dockerfile for Postgres:
FROM postgres:latest
RUN mkdir /pg_backups && > /etc/cron.d/pg_backup-cron && echo "00 22 * * * /backup.sh" >> /etc/cron.d/pg_backup-cron
ADD ./backup.sh /
RUN chmod +x /backup.sh
backup.sh
#!/bin/sh
# Dump DBs
now=$(date +"%d-%m-%Y_%H-%M")
pg_dump -h db -U postgres -d postgres > "/pg_backups/db_dump_$now.sql"
# remove all files (type f) modified longer than 30 days ago under /pg_backups
find /pg_backups -name "*.sql" -type f -mtime +30 -delete
exit 0
Cron simply does not launch the script. How to fix that?
FINAL VERSION
Based on Farhad Farahi's answer, below is the final result:
On the host I made a script:
#!/bin/bash
# Creates a cron job which backs up the DB in Docker every day at 22:00 host time
croncmd_backup="docker exec -it postgres-container bash -c '/pg_backups/backup.sh'"
cronjob_backup="00 22 * * * $croncmd_backup"
if [[ $# -eq 0 ]] ; then
    echo -e 'Please provide one of the arguments (example: ./run_after_install.sh add-cron-db-backup):
1) add-cron-db-backup
2) remove-cron-db-backup'
# In order to avoid task duplication in cron, the script checks whether there is already a backup job in cron
elif [[ $1 == add-cron-db-backup ]]; then
    ( crontab -l | grep -v -F "$croncmd_backup" ; echo "$cronjob_backup" ) | crontab -
    echo "==>>> Backup task added to Cron"
# Remove the backup job from cron
elif [[ $1 == remove-cron-db-backup ]]; then
    ( crontab -l | grep -v -F "$croncmd_backup" ) | crontab -
    echo "==>>> Backup task removed from Cron"
fi
This script adds cron task to host, which launches the script backup.sh (see above) in a container.
For this implementation there is no need for a custom Dockerfile for Postgres, so the relevant part of docker-compose.yml should look like:
version: '2'
services:
  db:
    image: postgres:latest
    container_name: postgres-container
    volumes:
      - ./dockerfiles/postgres/pgdata:/var/lib/postgresql/data
      - ./dockerfiles/postgres/backups:/pg_backups
Things you should know:
The cron service is not started by default in the postgres library image.
When you change the cron config, you need to reload the cron service.
Recommendation:
Use docker host's cron and use docker exec to launch the periodic tasks.
Advantages of this approach:
Unified Configuration for all containers.
Avoids running multiple cron services in multiple containers (better usage of system resources as well as less management overhead).
Honors Microservices Philosophy.
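As a minimal sketch of that recommendation applied to this question's setup, a single line in the host's crontab is enough (assuming the image above, which already contains /backup.sh; note there is no -t flag, since cron provides no TTY):
# host crontab: run the backup script inside the running container at 22:00
0 22 * * * docker exec postgres-container /backup.sh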
Based on Farhad's answer, I created a file postgres_backup.sh on the host with the following content:
#!/bin/bash
# Creates a cron job which backs up the DB in Docker every day at 22:00 host time
croncmd_backup="docker exec -it postgres-container bash -c '/db_backups/script/backup.sh'"
cronjob_backup="00 22 * * * $croncmd_backup"
if [[ $# -eq 0 ]] ; then
    echo -e 'Please provide one of the arguments (example: ./postgres_backup.sh add-cron-db-backup):
1 > add-cron-db-backup
2 > remove-cron-db-backup'
elif [[ $1 == add-cron-db-backup ]]; then
    ( crontab -l | grep -v -F "$croncmd_backup" ; echo "$cronjob_backup" ) | crontab -
    echo "==>>> Backup task added to Local (not container) Cron"
elif [[ $1 == remove-cron-db-backup ]]; then
    ( crontab -l | grep -v -F "$croncmd_backup" ) | crontab -
    echo "==>>> Backup task removed from Cron"
fi
And I added a file /db_backups/script/backup.sh to the Docker Postgres image with the content:
#!/bin/sh
# Dump DBs
now=$(date +"%d-%m-%Y_%H-%M")
pg_dump -h db -U postgres -d postgres > "/db_backups/backups/db_dump_$now.sql"
# remove all files (type f) modified longer than 30 days ago under /db_backups/backups
find /db_backups/backups -name "*.sql" -type f -mtime +30 -delete
exit 0
I have many .sql files in a folder (/home/myHSH/scripts) on Debian Linux. I want to know the command to execute or run all SQL files inside the folder against a PostgreSQL v9.1 database.
PostgreSQL information:
Database name=coolDB
User name=coolUser
Nice to have: if you know how to execute multiple SQL files through GUI tools too, like pgAdmin3.
From your command line, assuming you're using either Bash or ZSH (in short, anything but csh/tcsh):
for f in *.sql;
do
psql coolDB coolUser -f "$f"
done
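The *.sql glob already expands in alphabetical order, so the files run sorted by name. If you also want each file wrapped in its own transaction and the loop to stop at the first failure, a variation like this may help (ON_ERROR_STOP and --single-transaction are standard psql options):
for f in *.sql; do
  # abort the current file on the first SQL error and stop processing further files
  psql -v ON_ERROR_STOP=1 --single-transaction coolDB coolUser -f "$f" || break
done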
The find command combined with -exec or xargs can make this really easy.
If you want to execute psql once per file, you can use the exec command like this
find . -iname "*.sql" -exec psql -U username -d databasename -q -f {} \;
-exec will execute the command once per result.
The psql command allows you to specify multiple files by calling each file with a new -f argument. e.g. you could build a command such as
psql -U username -d databasename -q -f file1 -f file2
This can be accomplished by piping the result of the find to an xargs command once to format the files with the -f argument and then again to execute the command itself.
find . -iname "*.sql" | xargs printf -- ' -f %s' | xargs -t psql -U username -d databasename -q