DB2 read bulk data

I am reading about 100 MB of bulk data from the database.
What I am doing is reading it in a stored procedure and building an XML document, which is very fast.
I then call the stored procedure from Hibernate over JDBC, but the transmission takes too long because of the large XML.
I am thinking of zipping the XML inside the procedure.
Does DB2 LUW support zipping from within a stored procedure?

Related

What is the equivalent of SQL*Loader in PostgreSQL?

What is the equivalent of sqlldr (used for Oracle) in PostgreSQL?
I am trying to find an equivalent of the SQL*Loader utility (sqlldr) in Postgres.
pgloader emulates Oracle's SQL*Loader:
http://pgfoundry.org/projects/pgloader/
pg_bulkload is used to load lots of data into an otherwise offline database. It is useful for large data warehouses; it is fast, and somewhat dangerous and quirky:
http://pgbulkload.projects.postgresql.org/
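If a straightforward one-off load is enough, PostgreSQL's built-in COPY is the closest native analogue of sqlldr. A minimal sketch driving it from Python with psycopg2 (the database, table, and file names are placeholders, and the CSV is assumed to have no header row):

    # Bulk load a CSV into PostgreSQL via COPY, driven from psycopg2.
    # "target", "my_table" and "data.csv" are illustrative names only.
    import psycopg2

    conn = psycopg2.connect("dbname=target user=postgres")
    with conn, conn.cursor() as cur, open("data.csv") as f:
        cur.copy_expert("COPY my_table FROM STDIN WITH (FORMAT csv)", f)

pgloader and pg_bulkload add the error handling, transformations, and loading speed that plain COPY lacks.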

How to do a BULK INSERT into PostgreSQL without specifying a physical CSV file in Apache NiFi

I need to do a BULK INSERT into a PostgreSQL table from Apache NiFi without specifying a physical CSV file in the COPY command. I cannot store the CSV files on disk, and I would like to do the bulk insert using a flow file that comes from previous processors and is already in CSV format (or I can change it to JSON, that's not an issue).
What is the best way to do this in Apache NiFi?
PostgreSQL's COPY statement, when issued as plain SQL, requires a path to a file on the database server. I'd recommend looking at PutDatabaseRecord, which generates a PreparedStatement based on your CSV data and executes the statement in a single batch (unless Maximum Batch Size is set).
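A rough idea of the PutDatabaseRecord settings involved (property names as in NiFi 1.x; the reader and connection-pool services are placeholders you would configure yourself):

    Record Reader:                        CSVReader (or a JSON reader)
    Statement Type:                       INSERT
    Database Connection Pooling Service:  DBCPConnectionPool with the PostgreSQL JDBC driver
    Table Name:                           my_table
    Maximum Batch Size:                   1000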

Difference between Copy/Migrate/Export in SQL Developer

I am using Oracle SQL Developer, which has the following tools:
Database Copy, Database Export, and Migrate.
I want to move one schema and all the data in it from one server to another.
What is the difference between these options? Does any of them do what I am looking for?
Database Copy is probably what you want.
Supply two database connections, and we'll take objects and data and copy them from one database to another.
However, if your schema is large, this will be inefficient. The Copy routine does inserts, row by row, across the JDBC connections.
Database Export takes the objects and data and offloads them to flat files. These flat files can then be used later to load another database.
Migrate is used to take a database from SQL Server, Sybase, Teradata, Redshift, DB2, etc. to Oracle. It has an online (JDBC, row-by-row) data copy mode and an offline (flat files for SQL*Loader) data move mode. For SQL Server/Sybase, we can also translate the T-SQL stored procedures to PL/SQL.
Your solution might also lie elsewhere: Data Pump. We have a wizard for that as well, and it works great for very large schemas/databases. You'll just need access to the database OS so you can put the DMP files into a database directory.
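For a sense of scale, a typical Data Pump round trip from the command line looks roughly like this (credentials, connect strings, schema name, and file names are placeholders; DATA_PUMP_DIR is the default directory object):

    expdp system/password@source_db schemas=HR directory=DATA_PUMP_DIR dumpfile=hr.dmp logfile=hr_exp.log
    impdp system/password@target_db schemas=HR directory=DATA_PUMP_DIR dumpfile=hr.dmp logfile=hr_imp.log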

Tableau Data store migration to Redshift

Currently we have a workbook developed in Tableau using an Oracle server as the data store, where we have all our tables and views. Now we are migrating to Redshift for better performance. We have the same table structure, with the same table and field names, in Redshift as in Oracle. We already have the Tableau workbook developed and now need to point it to the Redshift tables and views. How do we point the developed workbook to Redshift?
Any other input in this regard is also welcome.
Use the Replace Data Source functionality of Tableau Desktop.
You can bypass Replace Data Source and move the data directly from Oracle to Redshift using bulk loaders.
A simple combination of SQL*Plus + Python + boto + psycopg2 will do the job.
It should:
Open a read pipe from Oracle SQL*Plus
Compress the data stream
Upload the compressed stream to S3
Bulk-append the data from S3 to the Redshift table.
A sketch of how to extract table or query data from Oracle and then load it into Redshift using the COPY command from S3 is shown below.
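A minimal sketch of that pipeline, assuming boto3 rather than the older boto, a hypothetical unload.sql script that spools the query as CSV to standard output, and placeholder table, bucket, connection, and credential names:

    # Oracle -> gzip -> S3 -> Redshift, following the four steps above.
    # unload.sql, connection strings, bucket/key names and the IAM role ARN
    # are all placeholders to adapt to your environment.
    import gzip
    import subprocess
    import boto3
    import psycopg2

    BUCKET = "my-bucket"
    KEY = "exports/mytable.csv.gz"

    # 1. Open a read pipe from SQL*Plus (unload.sql spools the query as CSV)
    proc = subprocess.Popen(
        ["sqlplus", "-S", "user/password@ORCL", "@unload.sql"],
        stdout=subprocess.PIPE,
    )

    # 2. Compress the data stream to a local gzip file
    with open("mytable.csv.gz", "wb") as out, gzip.GzipFile(fileobj=out, mode="wb") as gz:
        for chunk in iter(lambda: proc.stdout.read(1 << 20), b""):
            gz.write(chunk)
    proc.wait()

    # 3. Upload the compressed file to S3
    boto3.client("s3").upload_file("mytable.csv.gz", BUCKET, KEY)

    # 4. Bulk append from S3 into the Redshift table
    conn = psycopg2.connect(host="my-cluster.redshift.amazonaws.com", port=5439,
                            dbname="dev", user="user", password="password")
    with conn, conn.cursor() as cur:
        cur.execute(
            f"COPY mytable FROM 's3://{BUCKET}/{KEY}' "
            "CREDENTIALS 'aws_iam_role=arn:aws:iam::123456789012:role/RedshiftCopyRole' "
            "GZIP CSV;"
        )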

DB2 CLOB data to PostgreSQL

We are planning to move data from DB2 (28) to PostgreSQL (9.2).
We have already created the database schema and tables in PostgreSQL, and I am able to export data from DB2 to CSV format.
For importing the data into PostgreSQL, the docs point to the COPY command, which copies data from a file into a table.
In DB2, if a column is of CLOB type, a separate file is created to hold the CLOB data, and the main file (data.csv) contains references to the CLOBs. How can CLOB data be imported in such cases?
I searched the net but could not find any open-source tool for PostgreSQL.
This is not a ready-to-use solution, but it may be a starting point. As far as I remember, DB2 offers an ODBC interface. On the PostgreSQL side, you are able to "import" ODBC databases via a foreign data wrapper. The first steps can be found in the documentation. It may be worth a try.
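As an alternative to the foreign-data-wrapper route, here is a rough sketch of merging the exported CLOB files back into the CSV and then loading it with COPY through psycopg2. It assumes the CLOB reference sits in the third column and that the export uses DB2's LOB Location Specifier format "lobfile.offset.length/"; verify both against your actual export before relying on it:

    # Replace DB2 LOB references in the exported CSV with the CLOB text,
    # then bulk-load the merged rows into PostgreSQL with COPY.
    # Column index, file names and table name are placeholders.
    import csv
    import io
    import psycopg2

    def read_lob(lls, lob_dir="."):
        # e.g. "export.del.001.lob.0.543/" -> 543 bytes starting at offset 0
        name, offset, length = lls.rstrip("/").rsplit(".", 2)
        with open(f"{lob_dir}/{name}", "rb") as f:
            f.seek(int(offset))
            return f.read(int(length)).decode("utf-8")

    merged = io.StringIO()
    writer = csv.writer(merged, lineterminator="\n")
    with open("data.csv", newline="") as src:
        for row in csv.reader(src):
            row[2] = read_lob(row[2])   # third column holds the CLOB reference
            writer.writerow(row)
    merged.seek(0)

    conn = psycopg2.connect("dbname=target user=postgres")
    with conn, conn.cursor() as cur:
        cur.copy_expert("COPY my_table FROM STDIN WITH (FORMAT csv)", merged)

The csv writer handles quoting, so CLOB values containing commas or newlines survive the round trip intact.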