I'm rewriting an old application running on SCO Unix that connects to an Informix SE 7.24 database. The target OS is RHEL 6.3 and the DBMS is PostgreSQL 9.4.
I've already adapted the DDL script and created the empty database, but now I'm looking for a way to migrate the data. Informix and PostgreSQL use two different character sets: CP437 and UTF-8.
I've tried exporting the database with the dbexport utility, converting the *.unl files to the new charset, and then loading with COPY table_name FROM 'table.unl' (DELIMITER '|', ENCODING 'UTF-8', NULL ''). This worked for most of the tables, but when the .unl file grows large (over 1 GB), the import process crashes. What can I do?
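For reference, a client-side streaming load is the kind of thing I could switch to. This is only a sketch using psycopg2; the DSN, table name and file name are placeholders:

import io
import psycopg2

conn = psycopg2.connect("dbname=target")      # placeholder DSN
conn.set_client_encoding('UTF8')

def load_unl(conn, table, unl_path):
    # open() decodes CP437; psycopg2 re-encodes each chunk to the
    # UTF-8 client encoding, so no separate conversion pass is needed
    with io.open(unl_path, 'r', encoding='cp437') as src:
        with conn.cursor() as cur:
            cur.copy_expert(
                "COPY " + table + " FROM STDIN (DELIMITER '|', NULL '')",
                src)
    conn.commit()

load_unl(conn, 'customer', 'customer.unl')

Unlike a server-side COPY ... FROM 'file', this streams the file in chunks, so the 1 GB size should not matter.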
You have not shown us the COPY error message.
I have migrated some databases, and one of the easiest ways is to use JDBC, especially with Jython (Python that runs on the JVM). You can see such a migration in my response: Blob's migration data from Informix to Postgres. Of course you must change it to use your schema, but with JDBC it is easy to read table names and other schema info, as the sketch below shows.
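A minimal Jython sketch of that approach (it assumes the Informix and PostgreSQL JDBC drivers are on the classpath; the URLs and credentials are placeholders, and a real script needs batching and error handling):

from java.sql import DriverManager

src = DriverManager.getConnection(
    "jdbc:informix-sqli://host:9088/mydb:INFORMIXSERVER=myserver", "user", "pw")
dst = DriverManager.getConnection(
    "jdbc:postgresql://localhost:5432/mydb", "user", "pw")

# read the table names from the source schema via JDBC metadata
meta = src.getMetaData()
rs = meta.getTables(None, None, "%", ["TABLE"])
tables = []
while rs.next():
    tables.append(rs.getString("TABLE_NAME"))
rs.close()

# copy every row of every table, letting JDBC handle the types
for table in tables:
    rows = src.createStatement().executeQuery("SELECT * FROM " + table)
    cols = rows.getMetaData().getColumnCount()
    insert = dst.prepareStatement(
        "INSERT INTO " + table + " VALUES (" + ",".join(["?"] * cols) + ")")
    while rows.next():
        for i in range(1, cols + 1):
            insert.setObject(i, rows.getObject(i))
        insert.executeUpdate()
    rows.close()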
I am trying to dump the schema and data from an existing Oracle DB and import it into another Oracle DB.
I have tried using the "Export Wizard" provided by SQL Developer.
I found answers using Oracle Data Pump; however, I do not have access to the filesystem of the DB server.
I expect to get a file that I can copy and import into another DB.
Without Data Pump, you have to make some concessions.
The biggest concession is you're going to ask a Client application, running somewhere on your network, to deal with a potentially HUGE amount of data/IO.
Within reasonable limits, you can use the Tools > Database Export wizard to build a series of SQLPlus-style scripts, both DDL (CREATEs) and DATA (INSERTs).
Once you have those scripts, you can use SQLPlus, SQLcl, or SQL Developer to run them on your new/target database.
Is there any way, from Oracle, to establish a connection to a DB2 database so that I can query the DB2 database and generate reports from Oracle Apex?
OR
Is it possible to create a View in Oracle from a remote DB2 database?
OR
What options do I have to develop reports in Oracle Apex from the data I have in a DB2 database?
(I know, this is an old question and you've already found a workaround. Anyway,) the keyword you might be interested in is gateway. This is the Oracle 10g Database Gateway for DB2/400 Installation and User's Guide. I don't know which database version you use, but if 10g is not the one, I hope you'll manage to find the right documentation.
In short: after installing the gateway between Oracle and DB2, you'd create a database link. Then, in your Oracle schema, create a view that selects data over that database link from the DB2 database. Finally, fetch the data in Apex from the view.
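A rough sketch of those two steps, here issued from Python with cx_Oracle (the link name, gateway TNS alias, credentials and table names are all hypothetical):

import cx_Oracle

conn = cx_Oracle.connect("scott", "tiger", "orcl")
cur = conn.cursor()

# database link that goes through the DB2 gateway's listener entry
cur.execute("""CREATE DATABASE LINK db2link
               CONNECT TO "db2user" IDENTIFIED BY "db2pw"
               USING 'dg4db2'""")

# view over the link; Apex reports can then simply select from it
cur.execute("""CREATE VIEW db2_orders_v AS
               SELECT * FROM orders@db2link""")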
As I didn't find a way to connect directly to DB2 from Oracle PL/SQL, I used a workaround. Since this is a reporting tool, we were OK with running it on data that is one day old, so we did the following:
1) Extract the required data from the DB2 database to CSV files. We used a DB2 command that can be run at the command line to extract the data into a CSV.
2) Then we imported the data into Oracle tables using sqlldr. Both steps are sketched below.
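A sketch of how such a nightly job could be scripted; the DB2 EXPORT command and the sqlldr invocation below use placeholder names and credentials, and the DB2 command-line processor is assumed to be set up and connected:

import subprocess

# 1) extract from DB2 into a CSV using the DB2 command-line processor
subprocess.check_call([
    "db2",
    "EXPORT TO /tmp/orders.csv OF DEL SELECT * FROM myschema.orders"])

# 2) load the CSV into Oracle with sqlldr (orders.ctl maps CSV fields to columns)
subprocess.check_call([
    "sqlldr", "scott/tiger@orcl",
    "control=orders.ctl", "data=/tmp/orders.csv"])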
I need to create reports/summary tables on Redshift using SAS. My client's data is on Amazon Redshift and he provided me all the credentials to access the database. I have SAS 9.2 (32-bit) and downloaded the PostgreSQL 32-bit driver to my system (as Redshift is based on PostgreSQL). I set up the ODBC data source successfully and now I am connecting SAS using the command below:
LIBNAME RdSft ODBC DSN='Redshift server' user='xxxxxxx' pw='xxxxxx';
data Rdsft.new_table;
set Rdsft.old_table(obs=10);
run;
I am able to connect and can see the contents of tables on Redshift, but I am not able to create any table there. Sometimes I could, but it takes hours to create a table with just 10 observations. Someone suggested I use DbVisualizer for this task, but I am comfortable with SAS only.
Please suggest.
If you have SAS/ACCESS, try using the postgres engine for the library instead of going via ODBC, e.g.:
libname RdSft postgres server="<server-address>" database=<db-name> port=5432 user='xxxxxxx' pw='xxxxxx';
Also, try adding conopts="UseServerSidePrepare=1" to the libname statement, as suggested by this article: http://support.sas.com/kb/52/585.html
The simple fact of the matter is that when you're connecting to Redshift via ODBC, even your simple data step query:
data Rdsft.new_table;
set Rdsft.old_table(obs=10);
run;
is essentially translated to select * from rdsft.old_table before the obs subset is applied.
The SAS/ACCESS postgres solution is solid. You may also want to use proc sql, select only the columns you want, and subset as much as possible. PROC SQL will translate into Redshift's query language through ODBC more easily than the data step will.
SAS will hopefully be issuing a SAS/ACCESS for REDSHIFT option sometime soon! :)
I need to export all tables from SQL Server to PostgreSQL.
What I tried: exporting from the SQL Server IDE, but at some stage it gives an error about the data types being different.
Question: How can I export data from SQL Server to PostgreSQL? Will COPY do the job? If yes, then how can I export all tables, including records?
You can't export data from MS SQL and then import it into PostgreSQL directly, because the syntax and data types are not the same, but you can use a tool to migrate the data from MS SQL to PostgreSQL.
See more in this topic:
migrate data from MS SQL to PostgreSQL?
Use https://dbeaver.io/
Create MS SQL and PostgreSQL database connections (login)
Create target tables in PostgreSQL (same structures in MS SQL)
F5 to see new tables
Right-click on new tables -> 'Import Data' -> You will see 'Data Transfer' window
Choose 'Table' type then click 'Next' -> You will see 'Select input object', where you can choose tables from MS SQL connection
Just click 'Next', check the settings you need, and you're done :D
First export the schema into a file and run it against PostgreSQL until you've removed all incompatibilities.
You could try to do the same with the data you want to export, but you may be better off writing a Python script to migrate it, along these lines:
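A bare-bones sketch of such a script, assuming pyodbc for SQL Server and psycopg2 for PostgreSQL; the connection strings, table and column names are placeholders:

import pyodbc
import psycopg2

src = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                     "SERVER=mssql-host;DATABASE=mydb;UID=user;PWD=pw")
dst = psycopg2.connect("host=pg-host dbname=mydb user=user password=pw")

scur = src.cursor()
dcur = dst.cursor()

# pull rows from SQL Server and re-insert them into PostgreSQL;
# doing the mapping per column is where you fix type incompatibilities
scur.execute("SELECT id, name, created_at FROM dbo.customers")
for row in scur:
    dcur.execute(
        "INSERT INTO customers (id, name, created_at) VALUES (%s, %s, %s)",
        tuple(row))
dst.commit()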
There is an absolutely simple way using the built-in SSIS tool via Management Studio. You can find the detailed answer here.
Use https://dbeaver.io/, as An Le mentioned.
After 40 years of DB development, migrating DB data is still a challenge. DBeaver is a free tool to use for data migration. But you still have to migrate the schema.
Exporting data from DBeaver
From the contextual menu of your SQL Server database or schema, select Tools > Create new Task > Common > Data Export.
You will generate SQL insert files or CSV files. For migration between different database types, use CSV files.
Cons of SQL Server Migration Tool
Unable to migrate rows containing booleans.
The export ended up in errors when migrating data with bool columns, complaining that a value is not boolean, although both the source and destination columns were of boolean type.
Unable to continue with the next tables after one table's migration fails.
SQL Server - a single error stops the entire migration, even for tables that are not related to the initial error.
Configuring the tool over and over again while trying to export your data is a waste of time. The SQL Server migration task does not save the configuration of the source and destination connections, and the wizard is not user-friendly; spending your time on it is frustrating. I assume the migration project has been abandoned for at least 10 years.
Currently, I have an application that uses Firebird in embedded mode to connect to a relatively simple database stored as a file on my hard drive. I want to switch to using PostgreSQL to do the same thing (Yes, I know it's overkill). I know that PostgreSQL cannot operate in embedded mode and that is fine - I can leave the server process running and that's OK with me.
I'm trying to figure out a connection string that will achieve this, but have been unsuccessful. I've tried variations on the following:
jdbc:postgresql:C:\myDB.fdb
jdbc:postgresql://C:\myDB.fdb
jdbc:postgresql://localhost:[port]/C:\myDB.fdb
but nothing seems to work. PostgreSQL's directions don't include an example for this case. Is this even possible?
You can trick it. If you are running PostgreSQL on a UNIX-like system, you should be able to create a RAM disk and use that for the database storage. Here's a pretty good step-by-step guide for RAM disks on Linux.
In general, though, I would suggest using SQLite for an SQL-db-in-RAM type of application.
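For example, an in-memory SQLite database in Python needs nothing beyond the standard library:

import sqlite3

conn = sqlite3.connect(":memory:")   # exists only as long as the connection
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO t (name) VALUES (?)", ("example",))
print(conn.execute("SELECT * FROM t").fetchall())
conn.close()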
Postgres databases are not stored as a single file. There is one file for each table and each index in the data directory, inside a directory for the database. All files are named by the object ID (OID) of the db / table / index.
The JDBC URLs point to the database name, not to any specific file:
jdbc:postgresql:foodb (host localhost and the default port are implied; the full form is jdbc:postgresql://host:port/dbname)
If by "disk that behaves like memory" you mean that the db only exists for the lifetime of your program, there's no reason why you can't create a db at program start and drop it at program exit. Note that this is just the DDL to create the database, not creating the data directory via the initdb program. You could connect to the default 'postgres' db, create your db, then connect to it.
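The same idea sketched with psycopg2 (the flow is identical over JDBC; database names and credentials are placeholders):

import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

# connect to the default 'postgres' db to issue the CREATE/DROP DDL
admin = psycopg2.connect("dbname=postgres user=postgres")
admin.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)  # CREATE DATABASE can't run inside a transaction
admin.cursor().execute("CREATE DATABASE scratchdb")

work = psycopg2.connect("dbname=scratchdb user=postgres")
# ... create tables and use the database for the program's lifetime ...
work.close()

admin.cursor().execute("DROP DATABASE scratchdb")
admin.close()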
Firebird 2.1 onwards supports global temporary tables, which only exist for the duration of the database connection.
Syntax goes something like CREATE GLOBAL TEMPORARY TABLE ... ON COMMIT PRESERVE ROWS