Cannot use COPY function with JDBC driver [duplicate] - postgresql

I have a project with Spring, Hibernate and PostgreSQL and have to use ANT to create schema with data:
<sql driver="org.postgresql.Driver"
     classpath="src/main/webapp/WEB-INF/lib/postgresql-9.1-901.jdbc4.jar"
     url="jdbc:postgresql://localhost:5433/postgres"
     userid="postgres"
     password="pw123"
     autocommit="true"
     src="src/main/sql/dbbackup.sql">
</sql>
but I get this error:
C:\Users\<user>\<workspace>\<Project>\antdb.xml:22: org.postgresql.util.PSQLException: ERROR: COPY from stdin failed: The JDBC driver currently does not support COPY operations.
I don't know if we could somehow use the org.postgresql.copy classes here?

PgJDBC doesn't support COPY through the standard JDBC interfaces, but it does support it via the CopyManager API, which you can get from the PGConnection interface of the java.sql.Connection objects PgJDBC returns.
Unfortunately, you can't use that from a plain SQL file where you mix COPY operations in with other commands.
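For illustration, here's a minimal sketch of driving COPY through that API; the table name and CSV file are hypothetical, the connection settings are taken from your Ant task, and the cast assumes a raw PgJDBC connection rather than one wrapped by a pool:

import java.io.FileReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.DriverManager;
import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CopyInExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5433/postgres", "postgres", "pw123")) {
            // Cast to the PgJDBC-specific interface to reach the COPY API.
            CopyManager copyManager = ((PGConnection) conn).getCopyAPI();
            // Hypothetical table and file; streams the file as COPY ... FROM STDIN data.
            try (Reader reader = new FileReader("data.csv")) {
                long rows = copyManager.copyIn(
                        "COPY target_table FROM STDIN WITH (FORMAT csv)", reader);
                System.out.println(rows + " rows copied");
            }
        }
    }
}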
Personally, I'd shell out to psql to run .sql files using the Ant <exec> task. That way you can include COPY data in-line in your SQL files.
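A sketch of that, reusing the settings from your <sql> task (it assumes psql is on the PATH):

<exec executable="psql" failonerror="true">
  <env key="PGPASSWORD" value="pw123"/>
  <arg value="--host=localhost"/>
  <arg value="--port=5433"/>
  <arg value="--username=postgres"/>
  <arg value="--dbname=postgres"/>
  <arg value="--set=ON_ERROR_STOP=1"/>
  <arg value="--file=src/main/sql/dbbackup.sql"/>
</exec>

With ON_ERROR_STOP set, psql returns a non-zero exit code on the first failed statement, so failonerror="true" aborts the build on errors.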
It'd be nice to enable PgJDBC to handle COPY, but it's not easy. COPY is effectively a different protocol mode in PostgreSQL, and it doesn't make much sense to drive it through the usual JDBC interfaces of prepared statements, execute, etc. We could provide an execSQLScript method on the custom PGConnection interface, but that wouldn't help you much, because tools like Ant's <sql> task wouldn't use it. You'd have to write a custom task.
Instead, PgJDBC would have to pretty much lie to clients - when it entered COPY mode after a COPY command, it'd have to ignore the JDBC spec and not really do what it was supposed to in response to JDBC statement executes. This would be likely to break all sorts of things.
So - for now, by far the easiest option is to just exec the psql command to do what you want.

Related

What is the easiest way to generate a script to drop and create all objects in a database?

I'm used to working with SQL Server, and SQL Server Management Studio has the option to automatically generate a script to drop and recreate everything in a database (tables/views/procedures/etc). I find that when developing a new application and writing a bunch of junk into a local database for basic testing, it's very helpful to be able to just nuke the whole thing and recreate it from a clean slate, so I'm looking for similar functionality within postgres/pgAdmin.
pgAdmin has an option to generate a create script for a specific table, but right-clicking each table would be very tedious, so I'm wondering if there's another way to do it.
To recreate a clean, schema-only database you can use the pg_dump client included with a Postgres server install. The options to use are:
-c
--clean
Output commands to clean (drop) database objects prior to outputting the commands for creating them. (Unless --if-exists is also specified, restore might generate some harmless error messages, if any objects were not present in the destination database.)
This option is ignored when emitting an archive (non-text) output file. For the archive formats, you can specify the option when you call pg_restore.
and:
-s
--schema-only
Dump only the object definitions (schema), not data.
This option is the inverse of --data-only. It is similar to, but for historical reasons not identical to, specifying --section=pre-data --section=post-data.
(Do not confuse this with the --schema option, which uses the word “schema” in a different meaning.)
To exclude table data for only a subset of tables in the database, see --exclude-table-data.
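Putting the two together, a typical invocation might look like this (the database name and output file are placeholders):

pg_dump --clean --if-exists --schema-only --username=postgres your_db > recreate_schema.sql

Running the resulting file back through psql then drops and recreates every object in the database.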
clean in Flyway
The database migration tool Flyway offers a clean command that drops all objects in the configured schemas.
To quote the documentation:
Clean is a great help in development and test. It will effectively give you a fresh start, by wiping your configured schemas completely clean. All objects (tables, views, procedures, …) will be dropped.
Needless to say: do not use against your production DB!
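For example, using the Flyway command-line client (URL, credentials and schema list are placeholders for your own settings):

flyway -url=jdbc:postgresql://localhost:5432/your_db -user=postgres -password=secret -schemas=public clean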

How can I import a large (multi-GB) sql file into postgres using dotnet core?

My database needs to mirror another, to which I have no access except for a nightly export of the SQL file. I could script the import using psql.exe, but would prefer everything to be under the control of the dotnet core application.
I can't use the COPY command, because the file contains ALL the SQL to set up the schemas and tables, as well as all the SQL commands to insert/alter/copy the data.
I can't use \i because that is a psql meta-command, not something I can run through npgsql.
Is what I'm trying to do possible? Is it inherently a bad idea, and should I run a script to import it outside of the dotnet application? Should the dotnet application run and talk to the psql.exe program directly?
You could theoretically parse the SQL file in .NET and send it to PostgreSQL, but this is a very non-trivial thing to do, since you'd need to work out where each statement ends (finding the terminating semicolons, which is complicated by string literals, dollar quoting, and comments) in order to send the file in chunks.
You could, of course, send the entire file as a single chunk, but if it's huge, that may be a bad idea.
At the end of the day, I don't think there's any particular issue with launching psql.exe as an external process from .NET, and properly inspecting its exit code for error handling. Any reason you think you need to avoid that?
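If you do shell out, the command itself is simple; something like the following, where the connection details and file name are placeholders. With ON_ERROR_STOP set, psql exits with a non-zero code on the first failing statement, which your application can check via the process's exit code:

psql --host=localhost --username=postgres --dbname=your_db --set=ON_ERROR_STOP=1 --file=nightly_export.sql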

Start Firebird backup from SQL

In Sybase SQL Anywhere you could do:
BACKUP DATABASE DIRECTORY 'directory'
to trigger a backup.
Is there a similar solution in Firebird?
It would be easier with an SQL command than having to distribute gbak.exe.
There is no SQL statement to perform a backup; you either need to use gbak.exe, or your application (or a companion application) needs to use the Firebird services API to perform the backup.
For example, Jaybird (the Java/JDBC driver for Firebird) and the Firebird ADO.NET provider implement this functionality, but it might be simpler to just include gbak.exe and call it from within your application with the right command-line options.
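As a sketch of the Jaybird route (host, credentials and paths are placeholders; FBBackupManager is Jaybird's wrapper around the services API, from the org.firebirdsql.management package):

import org.firebirdsql.management.FBBackupManager;

public class FirebirdBackup {
    public static void main(String[] args) throws Exception {
        FBBackupManager backupManager = new FBBackupManager();
        backupManager.setHost("localhost");           // placeholder server
        backupManager.setPort(3050);
        backupManager.setUser("SYSDBA");
        backupManager.setPassword("masterkey");       // placeholder credentials
        backupManager.setDatabase("C:/data/mydb.fdb");      // database to back up
        backupManager.setBackupPath("C:/backup/mydb.fbk");  // backup file to create
        backupManager.setLogger(System.out);  // gbak-style progress output
        backupManager.setVerbose(true);
        backupManager.backupDatabase();       // performs the backup via the services API
    }
}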

Setting up environment for SQL queries

I know the basic syntax of queries but otherwise I'm a beginner with SQL.
I have an SQL file (.sql) and I downloaded a couple of programs (pgAdmin and SQL Workbench).
I have no idea how to get from where I am now to actually writing queries and finding information. How do I set up so I can actually import my SQL file and start writing queries?
pgAdmin is the default GUI for PostgreSQL.
SQL Workbench is a free, DBMS-independent, cross-platform SQL query tool.
Either way, you need to connect to a database to actually run queries. The DBMS can run on your local machine, or you can connect to a remote server - where you need access privileges, of course.
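For example, with a local PostgreSQL install you could create a database, load your file, and then open an interactive prompt for queries (database and file names are placeholders):

createdb -U postgres your_db
psql -U postgres -d your_db -f your_file.sql
psql -U postgres -d your_db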

Is there any easy tool to load CSVs into PostgreSQL?

I know there's the PostGIS DBF loader tool, but I was wondering if there's any non-commercial or commercial add-on that allows one to easily load a CSV.
The COPY command, built into PostgreSQL, does exactly what you want. It's most useful in its \copy variant via psql.
Check the documentation for your particular Pg version, as COPY options vary. In future, please mention your Pg version when posting. Assuming you're on 9.1, then from a psql client you could use:
\copy target_table from 'the_file.csv' with (format csv)
and possibly other options, as described in the COPY documentation, depending on the details of your CSV dialect.
Note that the \copy command will not work from PgAdmin-III or other clients; it's specific to psql. Regular COPY works from any client, but requires that the file be accessible by the database server's postgres process, so it's much less convenient.
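For comparison, the server-side form is plain SQL and works from any client, but the path must be readable by the server process (table and path are placeholders):

COPY target_table FROM '/var/lib/postgresql/the_file.csv' WITH (FORMAT csv);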
You can also use pg_bulkload or ETL tools like Talend and Pentaho if the job is huge or more complicated.