Is there a way to dump all Postgres databases from localhost to a remote server, without restoring them, in Ubuntu? - postgresql

I want to dump all my PostgreSQL databases from my localhost directly to a remote server as a simple .sql file, and create a cronjob for it. Is this possible?
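One way to do this, sketched below, is to pipe pg_dumpall over SSH so the .sql file is written directly on the remote server. This assumes key-based SSH login and a ~/.pgpass file are already set up; the user names, hostname, and paths here are placeholders.

# Stream a dump of every database straight to the remote machine
pg_dumpall -U postgres | ssh backupuser@remote.example.com "cat > /backups/all_databases.sql"

# Example crontab entry: run the same dump every night at 02:00
# (% must be escaped as \% inside a crontab line)
0 2 * * * pg_dumpall -U postgres | ssh backupuser@remote.example.com "cat > /backups/all_databases_$(date +\%F).sql"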

Related

Connect to a remote postgresql server using ftp

I exported a database to a folder on a local FTP server (using pg_dump) in order to save some space on my computer. Now I want to connect to the generated db.sql file without importing it again.
In pgAdmin I've created a new database, opened the psql tool, and executed \i /path/todb/db.sql, which imports the database onto my computer. However, I just need a remote connection over FTP.
I'd appreciate any help.
Thanks.

How to automate using a production postgres database backup in local Flask environment

We use Postgres and Flask for our website, and we use the production database dump locally pretty often. To get a fresh dump, I use a remote desktop connection (RDC) to connect to pgAdmin, then use RDC again to copy the .bak file from the server and save it locally. Likewise, I use a local instance of pgAdmin to restore the database state from the backup.
My manager asked me to automate this process so that the production database is used each time a local Flask instance is launched. How can I do that?
You could write a shell script that dumps the database to a local file using pg_dump, then use pg_restore to build a new local database from that dump. You could probably even just pipe the output from pg_dump into pg_restore (pg_restore needs the custom dump format, hence -Fc)... something like
pg_dump -Fc --host <remote-database-host> --dbname <remote-database-name> --username <remote-username> | pg_restore --host <local-database-host> --dbname <local-database-name> --username <local-username>
To get your password into pg_dump / pg_restore you'll probably want to use a .pgpass file, as described here: How to pass in password to pg_dump?
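For reference, a minimal .pgpass sketch (the file lives in the home directory of the user running the command and must be chmod 600; every value below is a placeholder):

# ~/.pgpass -- format is hostname:port:database:username:password
remote-database-host:5432:remote-database-name:remote-username:secretpassword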
If you want this to happen automatically when you launch a Flask instance locally, you could call the shell script from your initialization code using a subprocess call if a LOCAL_INSTANCE environment variable is set, or something along those lines

Backup and restore postgresql data folder directly

Until now I've been backing up my PostgreSQL data using pg_dump, which exports the data to an SQL file mydb.sql, and then restoring from that file using psql -U user -d db < mydb.sql.
For one reason or another it would be more convenient to restore the database content more directly, in an environment where psql does not exist... specifically on a host server where PostgreSQL is installed in a Docker container running on the host, but not on the host itself.
My plan is to back up the content of /var/lib/postgresql/data/ to a tar file, and when required (e.g. when a new server is created that hosts the postgresql container) just restore that to the same path. The folder /var/lib/postgresql/data/ in the docker container is mapped to a folder on the host server, so I would create this backup on the host, not inside the postgres container.
Is this a valid approach? Any "gotchas"? And are there any subfolders within /var/lib/postgresql/data/ that I can exclude from the tar file? I don't want to back up mere 'housekeeping' information.
You can do that, but you have to do it properly if you don't want your database to become corrupted.
Either stop PostgreSQL before copying the data directory or follow the instructions from the documentation for an online backup.
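A rough sketch of the "stop first" approach, run on the Docker host (the container name postgres and the mapped directory /srv/pgdata are assumptions; the online-backup alternative would use pg_basebackup or the low-level backup API described in the docs):

# stop the container so the data directory is consistent, archive it, then start it again
docker stop postgres
tar czf pgdata-backup.tar.gz -C /srv pgdata
docker start postgres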

How to load data from a local file into PostgreSQL on OpenShift?

I created an app on OpenShift and added a PostgreSQL cartridge.
There is no management application like pgAdmin supported by OpenShift.
I manage the DB with PuTTY under Windows.
But how can I import the local data into the DB on OpenShift?
Thank you in advance!
First take a backup of your database using the command below.
pg_dump dbname > outfile
Then add the PostgreSQL cartridge like so:
rhc-ctl-app -a postgresApp -e add-postgresql-8.4
Then access the remote database over SSH and restore the dump with psql (a plain SQL dump like the one above is restored with psql; pg_restore is only for custom-format dumps).
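A rough sketch of what that restore could look like (the gear's SSH address, database name, and paths are placeholders for whatever the rhc tools report for your app):

# copy the dump up to the application gear, then restore it from an SSH session there
scp outfile UUID@postgresApp-namespace.rhcloud.com:/tmp/
ssh UUID@postgresApp-namespace.rhcloud.com
psql dbname < /tmp/outfile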

Remotely trigger a Postgres database backup

I would like to back up my production database before and after running a database migration from my deploy server (not the database server). I've got a PostgreSQL 8.4 server sitting on a CentOS 5 machine. The website accessing the database is on a Windows 2008 server running an MVC.Net application; it checks out changes in the source code, compiles the project, runs any DB changes, then deploys to IIS.
I have the DB server set up with a crontab job for daily backups, but I also want a way of calling a backup from the deploy server during the deploy process. From what I can figure out, there isn't a way to tell the database from a client connection to back itself up. If I call pg_dump from the web server as part of the deploy script it will create the backup on the web server (not desirable). I've looked at the COPY command, and it probably won't give me what I want. MS SQL Server lets you call the BACKUP command from within a DB connection, which puts the backups on the database machine.
I found this post about MySQL, which says it's not a supported feature in MySQL. Is Postgres the same? Remote backup of MySQL database
What would be the best way to accomplish this? I thought about creating a small application that makes an SSH connection to the DB Server, then calls pg_dump? This would mean I'm storing SSH connection information on the server, which I'd really rather not do if possible.
Create a database user pgbackup and assign it read-only privileges on all your database tables.
Set up a new OS user pgbackup on the CentOS server with a /bin/bash shell.
Log in as pgbackup, create an SSH key pair without a passphrase, and allow this user to log in with the generated key:
su - pgbackup
ssh-keygen -q -t rsa -f ~/.ssh/id_rsa -N ""
cp -a ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
Create a file ~pgbackup/.bash_profile:
exec pg_dump databasename --file=`date +databasename-%F-%H-%M-%S-%N.sql`
Set up your script on Windows to connect using ssh and authenticate with the private key. It will not be able to do anything besides creating a database backup, so it should be reasonably safe.
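For example, the call from the Windows deploy script could be as small as this (a sketch only; the key path and hostname are placeholders, and plink is PuTTY's command-line SSH client):

plink -i C:\keys\pgbackup_id_rsa.ppk pgbackup@centos-db-server

Logging in is all it takes: the exec in .bash_profile replaces the login shell with pg_dump, so the session runs the backup on the database server and then exits.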
I think this could be possible if you create a trigger that uses the PostgreSQL module dblink to make a remote database connection from within PL/pgSQL.
I'm not sure what you mean but I think you can just use pg_dump from your Windows computer:
pg_dump --host=centos-server-name dbname > backup.sql
You'd need to install the Windows version of PostgreSQL there so that pg_dump.exe is available, but you don't need to start the PostgreSQL service or even initialize a local database cluster there.
Hi Mike, you are correct.
With pg_dump we can only save the backup on the local system. In our case we created a script on the DB server for taking the base backup, and an expect script on another server that runs that script on the database server.
All our servers are Linux servers, and we did this using shell scripts.
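As a sketch of the kind of on-server script that answer describes (database name, backup path, and file naming are assumptions, not the poster's actual setup):

#!/bin/bash
# runs on the database server itself, so the dump never leaves the DB machine
BACKUP_DIR=/var/backups/postgres
mkdir -p "$BACKUP_DIR"
pg_dump mydb > "$BACKUP_DIR/mydb-$(date +%F-%H%M).sql"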