DB2 CLOB data to PostgreSQL

We are planning to move data from DB2 (28) to PostgreSQL (9.2).
We have already created the database schema and tables in PostgreSQL, and I am able to export data from DB2 to CSV format.
For importing data into PostgreSQL, the docs point to the COPY command, which copies data from a file to a table.
In DB2, if a column is of CLOB type, a separate file is created where the CLOB data is kept, and the main file (data.csv) contains references to the CLOBs. How do I import CLOB data in such cases?
I searched the net but could not find any open-source tool for PostgreSQL.
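For reference, the plain CSV import I can already run looks like the following; the table and file names are just placeholders (in psql, \copy is the client-side variant that reads a local file):

-- placeholder names; COPY reads the file on the database server
COPY mytable FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER);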

This is not a ready-to-use solution but may be a starting point. As far as I remember, DB2 offers an ODBC interface. On the PostgreSQL side, you are able to "import" ODBC databases via a foreign data wrapper. The first steps can be found in the documentation. It may be worth a try.
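A rough sketch of that idea, assuming the odbc_fdw extension and an already-configured DB2 ODBC DSN; every name below is a placeholder, and option spellings differ between odbc_fdw versions:

-- assumes a system DSN "DB2DSN" pointing at the DB2 server
CREATE EXTENSION odbc_fdw;

CREATE SERVER db2_odbc
  FOREIGN DATA WRAPPER odbc_fdw
  OPTIONS (dsn 'DB2DSN');

CREATE USER MAPPING FOR CURRENT_USER
  SERVER db2_odbc
  OPTIONS (odbc_UID 'db2user', odbc_PWD 'secret');

-- a DB2 CLOB column can be exposed as text on the PostgreSQL side
CREATE FOREIGN TABLE db2_documents (
  id   integer,
  body text
) SERVER db2_odbc
  OPTIONS (schema 'MYSCHEMA', table 'DOCUMENTS');

-- then fill the already-created local table in one statement
INSERT INTO documents SELECT id, body FROM db2_documents;

The point is that the CLOB contents travel over the ODBC connection as ordinary text, so the separate CLOB files from the CSV export never come into play.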

Related

Is there a way to link (not import!) a dbf table into a PostgreSQL database?

I need to link an external dbf table into an existing PostgreSQL database. I managed to import that table into PostgreSQL, but that creates a copy and therefore redundant data. The dbf table is the attribute table of a shapefile, but I don’t need the spatial aspect of it.
The exercise is part of a project to move data from an MS Access database to PostgreSQL, hoping that the data will then become accessible from QGIS. The dbf table is at the moment linked into the MS Access database and used in queries (views) which I want to re-build in PostgreSQL.
I found lots of posts about importing dbf tables into PostgreSQL, but nothing that would work for linking a dbf table. The closest I got was the Foreign Data Wrapper, but I didn’t manage to use it for my purpose. I’m using PostgreSQL with pgAdmin 4.24.
Many thanks
If you must use PostgreSQL in order to provide access to your spatial data from QGIS, I see no other option than importing the shapefile into PostgreSQL (PostGIS). If for whatever reason you do not need the geometries, you can drop the geometry column after importing the shapefile into the database:
ALTER TABLE table_name DROP COLUMN column_name;
Alternative scenario:
If we're talking about static shapefiles and you don't really need to use PostgreSQL, you can use GeoServer to publish this shapefile via Web Feature Service (WFS); that is at least what I do in small projects. The easiest option would be to copy the shapefiles into a so-called GeoServer Data Directory and publish them afterwards. After that you'd be able to access the data from QGIS using its WFS client.

Difference between copy/migrate/export in SQL Developer

I am using Oracle SQL Developer, which has the following tools:
Database Copy, Database Export, and Migrate.
I want to move one schema and all the data in it from one server to another.
What is the difference between these options? Does any of them serve what I am looking for?
Database Copy is probably what you want.
Supply two database connections, and we'll take objects and data and copy them from one database to another.
However, if your schema is large, this will be inefficient. The Copy routine does inserts, row-by-row, across the JDBC connections.
Database Export takes the objects and data and offloads them to flat files. These flat files could then be used later to load into another database.
Migrate is used to take a database from SQL Server, Sybase, Teradata, Redshift, DB2, etc. to Oracle. It has an online (JDBC, row-by-row) data copy mode and an offline (flat files for SQL*Loader) data move mode. For SQL Server/Sybase, we can also translate the T-SQL stored procedures to PL/SQL.
Your solution might also lie elsewhere - Data Pump. We have a wizard for that as well, and it works great for very large schemas/databases. You'll just need access to the database OS so you can put the DMP files into a database directory.
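If you go the Data Pump route, the directory setup itself is plain SQL; a minimal sketch with hypothetical names and paths, run as a privileged user:

-- hypothetical directory name, OS path, and grantee
CREATE OR REPLACE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';
GRANT READ, WRITE ON DIRECTORY dpump_dir TO scott;

The Data Pump wizard then reads and writes its DMP files through that directory object.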

Tableau Data store migration to Redshift

Currently we have a workbook developed in Tableau using an Oracle server as the data store, where we have all our tables and views. Now we are migrating to Redshift for better performance. We have the same table structure in Redshift as in Oracle, with the same table and field names. We already have the Tableau workbook developed, and we now need to point it to the Redshift tables and views. How do we point the developed workbook to Redshift? Kindly help.
Also, let me know of any other inputs in this regard.
Thanks,
Raj
Use the Replace Data Source functionality of Tableau Desktop
You can bypass Replace Data Source and move data directly from Oracle to Redshift using bulk loaders.
A simple combo of SQL*Plus + Python + boto + psycopg2 will do the job.
It should:
Open a read pipe from Oracle SQL*Plus
Compress the data stream
Upload the compressed stream to S3
Bulk append the data from S3 to the Redshift table.
You can find examples of how to extract table or query data from Oracle and then load it into Redshift using the COPY command from S3.
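As a minimal sketch of that last step, the bulk append on the Redshift side could look like this; table, bucket, prefix, and IAM role are all placeholders:

-- placeholder table, S3 prefix, and IAM role
COPY mytable
FROM 's3://my-bucket/oracle-export/mytable_'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
GZIP
DELIMITER '|';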

Is it possible to convert a query on a PostgreSQL database to a GeoPackage file

I am working on a PostgreSQL database with PostGIS.
I have been generating CSV files from the database which include geo information, but now I need to generate a GeoPackage file instead.
I searched for this but did not find any tool that does something like that directly.
I know I can use GDAL to convert from CSV to a GeoPackage file, but I do not want to do that. I need to generate the GeoPackage file directly from the database.
Can anyone help me in that?
Thanks
GDAL also supports generating a GeoPackage file directly from a table/view, using a command like the following:
ogr2ogr -f "GPKG" mynewfilename.gpkg \
PG:"host=localhost user=postgres dbname=postgres password=mypassword" "mytablename"
In theory: yes, it could be done. But in reality: it's almost impossible.
The GeoPackage file format is based on SQLite's. In PostgreSQL you can connect to different databases (in the case of SQLite: to files) through foreign data wrappers. There is an SQLite foreign data wrapper, but it is read-only. There is also a JDBC wrapper, which can support SQLite. But even if you manage to write a new SQLite database file from PostgreSQL, you will need to study GeoPackage's internal format -- with just a "Coming soon" Implementation Guide (as of writing).
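To illustrate the read-only direction, a rough sketch with the sqlite_fdw extension against an existing GeoPackage file; the file path is a placeholder, and option names may differ between sqlite_fdw versions:

-- a GeoPackage is an SQLite file, so its tables can be read (not written) this way
CREATE EXTENSION sqlite_fdw;

CREATE SERVER gpkg_server
  FOREIGN DATA WRAPPER sqlite_fdw
  OPTIONS (database '/path/to/existing.gpkg');

-- gpkg_contents is a mandatory GeoPackage metadata table
CREATE FOREIGN TABLE gpkg_contents_ft (
  table_name text,
  data_type  text
) SERVER gpkg_server
  OPTIONS (table 'gpkg_contents');

SELECT * FROM gpkg_contents_ft;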

Copying Data from Oracle to Postgres

We have a daily process that pulls all data out of a number of tables in an Oracle database and imports them into a Postgres (EnterpriseDB) database, version 8.4.
We are currently using a Java application to select * from each table, change the keywords (date, timestamp, etc.), and then import the rows into the Postgres database.
Are there any tools available in Postgres that would provide a more efficient manner of doing this? I should note that there are CLOBs that are being transported over.
There is Ora2Pg, which is intended as a one-time-migration tool, but it might work in your case as well. I think of it as an Oracle-to-PostgreSQL pg_dump.