How to COPY a local file to a remote database - PostgreSQL

I have a remote PostgreSQL database and a local CSV file which I need to add to the database. I'm trying to do it with PyCharm.
Thus, I'm trying to copy data from a local file to a remote database.
If the database is local, then this command works:
COPY master_relationsextra(code, serial_number, member_type, characteristic, price_list)
FROM '/Users/name/Desktop/AUTOI.csv' with CSV HEADER delimiter ';' encoding 'ISO-8859-1';
But for the remote database it doesn't work.
Any advice on how I can do that?

I'm using PyCharm, so I did it with PyCharm's help; PyCharm generated all the queries and commands for me. I did it as follows:
I connected to the remote database from PyCharm's database pane
Right-clicked on the table and chose import from file
Chose all the rules and imported
That did the trick for me.
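For reference, the same import can also be done from the command line with psql's \copy, which reads the file on the client and streams it to the remote server. A minimal sketch, assuming a reasonably recent psql is installed locally (host, user, and database name are placeholders):
psql -h remote.example.com -U dbuser -d mydb -c "\copy master_relationsextra(code, serial_number, member_type, characteristic, price_list) FROM '/Users/name/Desktop/AUTOI.csv' WITH CSV HEADER DELIMITER ';' ENCODING 'ISO-8859-1'"
Unlike server-side COPY, \copy does not require the CSV file to exist on the database server.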

Related

Importing a large .sql file into a database - repeated timeout error in phpMyAdmin

I have a .sql file (a database dump) which I am trying to import using phpMyAdmin, and I keep getting a timeout error.
The file is 46.6 MB (zipped).
Please note I am not on XAMPP but using a GoDaddy phpMyAdmin platform to manage the database.
What I've tried:
Re-downloaded the file as a zip file and tried importing it. Still failed.
phpMyAdmin's import screen offers this option: "Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be a good way to import large files, however it can break transactions.)" I tried importing the db both with it unselected and with it selected, and both failed. Which should it be?
What else can I do?
Nothing worked except SSH.
What you need:
Database (that you are importing into) username and password
cPanel username and password + IP address (for PuTTY)
I had to upload the .sql file to a folder in public_html.
Download PuTTY.
In PuTTY I needed the IP address (of the hosting server) as well as the cPanel username and password (so have those handy).
Once in, you have to enter your cPanel password.
Use the "cd" change-directory command to change to the directory where you have placed your .sql file.
Once there, use the following command:
mysql -p -u user_name database_name < file.sql
(Note: replace 'user_name', 'database_name', and 'file.sql' with the actual names.)
You will be prompted for your database password, and then your database will be imported.
Useful link: https://www.siteground.co.uk/kb/exportimport-mysql-database-via-ssh/
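Putting the steps together, here is a minimal session sketch using a plain ssh client in place of PuTTY (the IP address, directory, and names are placeholders for your own values):
ssh cpanel_user@203.0.113.10
cd public_html/sql_uploads
mysql -p -u db_user database_name < file.sql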
You can try unzipping the file locally and importing the uncompressed .sql file; the overhead of uncompressing the file in memory could be the problem for phpMyAdmin. Generally, though, what Shadow said is correct and you should use some other means of importing (such as the command-line client). You could also use the phpMyAdmin UploadDir feature to put the file in a special folder that phpMyAdmin can directly access on the server. This can help with a lot of the resource limits the web server imposes.
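For the UploadDir route, a minimal sketch, assuming you can edit phpMyAdmin's config.inc.php on the server (the directory name is a placeholder):
$cfg['UploadDir'] = 'upload';
Any .sql file placed in that directory (relative to the phpMyAdmin installation) then appears in a server-side file list on the Import page, so it is read from disk instead of being uploaded through PHP.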

Connect to a remote PostgreSQL server using FTP

I exported a database to a folder on a local FTP server (using pg_dump) in order to save some space on my computer. Now I want to connect to the generated db.sql file, but without importing it again.
In pgAdmin I created a new database, opened the psql tool, and executed \i /path/todb/db.sql, which imports the database onto my computer. However, I just need a remote connection over FTP.
I'll appreciate any help.
Thanks.

PostgreSQL: exporting data from a local database to a remote server in CSV

I want to send data from my local PostgreSQL database to a remote server in CSV form,
something similar to:
copy (select * from table) to '/home/ubuntu/a.csv' with csv
But in place of a local directory, I want to write this CSV dump to another server.
Use the psql client's \copy feature; this does exactly what you want.
As far as I know, the COPY command reads/writes only from a local path on the machine where it runs. To generate the output file on the remote server, you need a script (bash/Python) that executes the copy command on the remote server.
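A sketch of one way to do it without a server-side script, assuming you have ssh access to the remote machine and psql runs where the database lives (names and paths are placeholders):
psql -d mydb -c "COPY (SELECT * FROM mytable) TO STDOUT WITH CSV" | ssh ubuntu@remote.example.com 'cat > /home/ubuntu/a.csv'
COPY ... TO STDOUT sends the rows to psql's standard output, and the pipe writes them to a file on the remote server.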

How to Import DB2 full dump

I have a DB2 v9.7 dump (.gz format) which I need to import into another DB2 database of the same version.
All the tables need to be imported in one go.
Can somebody help me with how to achieve this?
Thank you in advance.
-Nitika
First, DB2 backups do not have that name structure. You should have a file inside that .gz with a name like this:
SAMPLE.0.db2inst1.NODE0000.CATN0000.20131224235959.001
It gives the database name; the backup type; the instance that hosts the database; the node (when using DPF); the catalog node; the timestamp; and the file number.
Normally, only the timestamp changes. In order to restore the db, you should go to the directory where the file is and then just type:
db2 restore db sample
If that does not work, you can specify the timestamp, directory, or other options:
db2 restore db sample from /dir taken at 20131224235959
If you change the instance, you should rebind some packages. Also, you should make sure that the security structure is the same in the new installation (/etc/passwd and /etc/group have the same users and groups used by DB2).
For more information, please check: http://pic.dhe.ibm.com/infocenter/db2luw/v10r5/topic/com.ibm.db2.luw.admin.ha.doc/doc/c0006237.html
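For example, a minimal sketch, assuming the backup image above was gzipped into the .gz you received (directory and timestamp are placeholders):
gunzip SAMPLE.0.db2inst1.NODE0000.CATN0000.20131224235959.001.gz
db2 restore db sample from /backups taken at 20131224235959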
You can use the db2move command:
db2move sample export
db2move sample import
where sample is the database name.
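A sketch of the round trip — db2move writes its files to the current directory, so run each command in a dedicated folder (paths are placeholders):
mkdir /tmp/sample_export && cd /tmp/sample_export
db2move sample export
# copy the generated files (db2move.lst, *.ixf, ...) to the target machine, then:
db2move sample import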
If you have a .dmp file, then you can use the command below to import it.
If the dmp file is inside a tar or zip archive, you need to extract it first.
db2 -c- -svtf db2dump.dmp > log.txt
Note:
This is different from the restore command, which works on a backup image:
restore db database_name from path_of_the_backup_file
e.g.: restore db QAST from C:\Backups\Backup_location
The corresponding backup command is:
backup db database_name to path_of_the_backup_location
e.g.: backup db QAST to C:\Backups\Backup_location

Can PostgreSQL COPY read CSV from a remote location?

I've been using JDBC with a local Postgres DB to copy data from CSV files into the database with the Postgres COPY command. I use Java to parse the existing CSV file into a CSV format that matches the tables in the DB. I then save this parsed CSV to my local disk and have JDBC execute a COPY command using the parsed CSV against my local DB. Everything works as expected.
Now I'm trying to perform the same process on a Postgres DB on a remote server using JDBC. However, when JDBC tries to execute the COPY I get
org.postgresql.util.PSQLException: ERROR: could not open file "C:\data\datafile.csv" for reading: No such file or directory
Am I correct in understanding that the COPY command tells the DB to look locally for this file? I.e., the remote server is looking on its own C: drive (which doesn't exist).
If this is the case, is there any way to tell the COPY command to look on my computer rather than "locally" on the remote machine? Reading through the COPY documentation, I didn't find anything that indicated this functionality.
If the functionality doesn't exist, I'm thinking of just populating the whole database locally and then copying the database to the remote server, but I just wanted to check that I wasn't missing anything.
Thanks for your help.
Create your SQL file as follows on your client machine:
COPY testtable (column1, c2, c3) FROM STDIN WITH CSV;
1,2,3
4,5,6
\.
Then execute, on your client:
psql -U postgres -f /mylocaldrive/copy.sql -h remoteserver.example.com
If you use JDBC, the best solution for you is to use the PostgreSQL COPY API:
http://jdbc.postgresql.org/documentation/publicapi/org/postgresql/copy/CopyManager.html
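A minimal sketch of the CopyManager approach, assuming the pgJDBC driver is on the classpath (the connection details and file path are placeholders; the table and columns are taken from the example above):
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import org.postgresql.copy.CopyManager;
import org.postgresql.core.BaseConnection;

public class CsvCopyExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details for the remote server
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://remoteserver.example.com:5432/mydb", "postgres", "secret");
        CopyManager copyManager = new CopyManager((BaseConnection) conn);
        try (FileReader reader = new FileReader("C:\\data\\datafile.csv")) {
            // copyIn streams the client-side file over the JDBC connection,
            // so the server never needs access to the client's file system.
            long rows = copyManager.copyIn(
                    "COPY testtable (column1, c2, c3) FROM STDIN WITH CSV", reader);
            System.out.println("Copied " + rows + " rows");
        } finally {
            conn.close();
        }
    }
}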
Otherwise (as already noted by others), you can use \copy from psql, which allows accessing local files on the client machine.
To my knowledge, the COPY command can only be used to read locally (either from STDIN or from a file) on the machine where the database is running.
You could make a shell script where you run the Java conversion, then use psql to issue a \copy command, which reads from a file on the client machine.
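A minimal sketch of such a script (csv-converter.jar and all the names are hypothetical placeholders):
#!/bin/sh
# Run the Java CSV conversion, then stream the result to the remote DB with \copy.
java -jar csv-converter.jar raw.csv parsed.csv
psql -h remoteserver.example.com -U postgres -d mydb -c "\copy testtable FROM 'parsed.csv' WITH CSV"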