DataStage Oracle Importing Table

I have an error when I use DataStage to connect to Oracle and import a table definition. Below is the detailed situation.
Environment:
OS: AIX 6.1, 64-bit, POWER6 processor, LANG=en_US
DataStage version: 8.5
Installation profile:
All three tiers are installed on the same machine; the repository uses DB2 (the default).
Oracle Client 11.2 (64-bit) is also installed on this machine, and I can use SQL*Plus to connect to the Oracle server (11.2, 64-bit, AL32UTF8) on another machine.
dsenv settings:
added "/oracle/product/11.2.0-64/lib" to LIBPATH
added "export TNS_ADMIN=/oracle/product/11.2.0-64/network/admin"
Problem
1. I used the Oracle Connector (parallel) to create a link, then used this link to import metadata. When I press Test connection, a dialog pops up with "The OCI function OraOCIEnvNlsCreate:OCI_UTF16ID returned status -1. Error code: NULL, Error message: NULL" and the connection fails.
2. I used the Oracle Enterprise stage (parallel) to create a link, then used it to import metadata. When I click the ellipsis button to list all the tables in the target database, a dialog pops up with "cannot get list of table names from database"; after I click OK on this dialog, the detailed error message pops up:
12:37:21(002) Unable to access database oracle
Library orchoracle could not be loaded; Could not load "orchoracle":
0509-022 Cannot load module /opt/IBM/InformationServer/Server/DSComponents/bin/orchoracle.o.
0509-150 Dependent module /opt/IBM/InformationServer/Server/DSComponents/bin/libclntsh.so could not be loaded.
0509-103 The module has an invalid magic number.
0509-022 Cannot load module /opt/IBM/InformationServer/Server/DSComponents/bin/orchoracle.o.
0509-150 Dependent module /opt/IBM/InformationServer/Server/DSComponents/bin/orchoracle.o could not be loaded.
From the messages I see that DataStage looks for some files in DSComponents/bin, but these files are in the Oracle bin directory. I couldn't find anything wrong in the dsenv file, so I copied these files into DSComponents/bin; this time the error message changed to "OCI_ERROR: Bad Oracle environment".
I am not sure which environment variable I am missing; please tell me.
3. When I use the Oracle OCI stage (server) to create a link and import a table, it works fine.
So, my question is: why can't I use the Oracle Connector and Oracle Enterprise stages to connect to Oracle? Thanks.

Yes, the PATH variable needs to include $ORACLE_HOME/bin. Adding this to the dsenv file and recycling all services fixed the Oracle Connector issue for us: it must be added to the dsenv file, and recycling ASBNode and DataStage is also required. Here are the directives needed in the dsenv file to use the Oracle Connector (the example is from our system: AIX 6.1, DataStage 8.5, connecting to Oracle 11g Enterprise).
We also added the following:
TNS_ADMIN=/opt/oracle/product/11.1.0/client_1/network/admin; export TNS_ADMIN
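The full list of directives was not reproduced above; as a rough sketch (the paths follow the Oracle client location mentioned in the question and are assumptions, so adjust them to your installation), the dsenv entries typically look like:
ORACLE_HOME=/oracle/product/11.2.0-64; export ORACLE_HOME
PATH=$PATH:$ORACLE_HOME/bin; export PATH
LIBPATH=$LIBPATH:$ORACLE_HOME/lib; export LIBPATH
TNS_ADMIN=$ORACLE_HOME/network/admin; export TNS_ADMIN
After editing dsenv, restart ASBNode and the DataStage engine so the new environment is picked up.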


Is there a way to use Flyway on AS400?

I need to implement a migration tool like Flyway in order to use Jenkins to deploy DB changes.
I tried to add the jt400.jar file and added the configuration as follows:
flyway.url=jdbc:as400://192.168.171.251:446/DBDEV
flyway.driver=com.ibm.as400.access.AS400JDBCDriver
as the driver, and it would not connect, giving this message:
ERROR: No database found to handle jdbc:as400://192.168.171.251:446/DBDEV
I also tried using the IBM DB2 driver with this configuration:
flyway.url=jdbc:db2://192.168.171.251:50000/DBDEV
flyway.driver=com.ibm.db2.jcc.DB2Driver
This time I get this kind of refusal message:
ERROR:
Unable to obtain connection from database (jdbc:db2://192.168.171.251:50000/DBDEV) for user 'DEVUSER':
[jcc][t4][2043][11550][4.26.14] Exception java.net.ConnectException: Error opening socket to server
/192.168.171.251 on port 50,000 with message: Connection refused (Connection refused).
ERRORCODE=-4499, SQLSTATE=08001
With this test migration I am trying to create a simple table by executing this SQL:
CREATE TABLE PERSON (
ID INT NOT NULL,
NAME VARCHAR(100) NOT NULL
);
Has anyone had this situation and solved it?
I believe that at present there is no support for flyway to work with IBM i (as/400) regardless of whether you use jt400.jar or an IBM jdbc driver.
You can either use a different database-schema versioning tool, or find a fork of flyway that supports i-series (or pay someone to create and support such a fork; it is open source...).
It seems that flyway (currently 7.7.2) does not recognize a URL that starts with "jdbc:as400:" as a Db2 URL, so it throws an exception; this is why the jt400.jar-style URL is rejected with:
"No database found to handle ..."
The github history tells a story (see: https://github.com/flyway/flyway/issues/105).
It looks like the devs did not succeed in getting AS400 support added, due to the lack of a suitable i-series testing/dev environment (one also available to Travis CI). There may have been at least one PR for such support in the past, although it seems to have been removed.
If you try to use the IBM db2jcc4.jar driver to connect to i-series (as400) with a URL similar to jdbc:db2://hostname/dbname, explicitly use an IBM JRE, and have the relevant license file (e.g. db2jcc_license_cisuz.jar) on the CLASSPATH, then flyway will connect and then report an exception similar to:
Unsupported Database: AS 7.4
The flyway source code shows that flyway does not recognize this database product name and version, as of the current flyway version 7.7.2.
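For reference, the setup described above would look roughly like the following (the port is an assumption: 446 is the usual DRDA port on IBM i, unlike the LUW default 50000; jar locations depend on your flyway installation), and flyway 7.7.2 still rejects it with the "Unsupported Database" error:
flyway.url=jdbc:db2://192.168.171.251:446/DBDEV
flyway.driver=com.ibm.db2.jcc.DB2Driver
flyway.user=DEVUSER
with db2jcc4.jar and db2jcc_license_cisuz.jar copied into the flyway command-line tool's drivers/ directory (or placed on the CLASSPATH).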
Are you sure DBDEV is the name of your Db2 database on the IBM i?
Use the Work with RDB Directory Entry (WRKRDBDIRE) command from the green screen and look for the *LOCAL entry.
Or use the Access Client Solutions (ACS) "Schemas" tool to see a list of the databases on your system.
For example, on my system this shows two databases, UT29p63 and Dbtest.
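As an additional check (a sketch, not part of the original answers): if you can open an SQL session on the IBM i, for example with ACS "Run SQL Scripts", the statement
VALUES CURRENT SERVER
should return the *LOCAL RDB name to use in the connection URL. Note also that Db2 for i normally accepts DRDA connections on port 446 rather than 50000, which would be consistent with the "Connection refused" seen on port 50,000.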

Connecting to DB2 HammerDB

I am using a Windows machine to connect to a remote DB2 instance and ran into this issue:
SQL1531N The connection failed because the name specified with the DSN connection string keyword could not be found in either the db2dsdriver.cfg configuration file or the db2.cli.ini configuration file. Data source name specified in the connection string: <DSN>
I have configured an ODBC data source using the ODBC Data Source Administrator, and it connects successfully.
Upon further investigation, I am unable to locate db2dsdriver.cfg in the IBM DATA SERVER DRIVER folder. I am able to find db2dsdriver.lvl and db2dsdriver.xds, just not the .cfg file. I am also unsure where HammerDB looks for the config file.
I have looked at the DB2 configuration page on the HammerDB website, but I am unable to get any useful information from there: https://www.hammerdb.com/docs/ch04s02.html
For the tiny-footprint ODBC and CLI driver (known as clidriver) from IBM, you are responsible for creating and editing the db2dsdriver.cfg configuration file. It is a small XML file documented here and in related linked pages. The hammerdb documentation also gives a minimal example, and you linked to this page in your question.
You can create and edit this file either with command lines to the db2cli tool, or by editing it directly with a text editor (or XML editor). It may be easier to use an editor than to learn the command lines, although the command lines have the advantage that they lend themselves to scripting this activity for larger installations.
On Microsoft-Windows you can also use Notepad to create and edit the file db2dsdriver.cfg.
An important step: after editing the file, you must validate its contents before trying any database connections. Validation checks that the syntax of the XML in the file is correct. To validate, use the db2cli validate command described here; it must show a successful result before you try to connect to any database. Once validation completes without errors, you can also use db2cli validate -connect -dsn XXX -user YYY -passwd ZZZ to test the connection independently of your application (in this case hammerdb). Once you get a successful connection with db2cli validate -connect -dsn ..., your application (hammerdb) will connect correctly.
There are many examples of db2dsdriver.cfg contents online, but your first source should be the Db2 Knowledge Centre, which details the command-line options of the db2cli command and gives examples of db2dsdriver.cfg.
If you already have a working Db2 configuration with local and remote databases (but no db2dsdriver.cfg file), you can also use the db2dsdcfgfill tool to populate db2dsdriver.cfg from your existing Db2 configuration. See the docs here.
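As a rough illustration (the DSN alias, database name, hostname and port below are placeholders, not values from your system), a minimal db2dsdriver.cfg can look like this:
<configuration>
  <dsncollection>
    <dsn alias="MYDSN" name="MYDB" host="db2host.example.com" port="50000"/>
  </dsncollection>
  <databases>
    <database name="MYDB" host="db2host.example.com" port="50000"/>
  </databases>
</configuration>
The equivalent entry can be created from the command line with something like db2cli writecfg add -dsn MYDSN -database MYDB -host db2host.example.com -port 50000, and then checked with db2cli validate -dsn MYDSN. The DSN alias ("MYDSN" here) is what you would give HammerDB as the DSN.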

Why does running a query return 'table_oid' under messages and nothing under Data Output

Running any sort of query in pgAdmin 4 just returns 'table_oid' under Messages.
I am able to get the necessary data when running the query from the command line, for example:
SELECT ST_MakePolygon(ST_GeomFromText('LINESTRING(75.15 29.53,77 29,77.6 29.5, 75.15 29.53)'));
I understand that table_oid refers to the object ID of the table, but I have no idea how to access it.
Pardon me if this is a simple question, but I am unable to find any resources online.
Expected:
010300000001000000040000009A99999999C9524048E17A14AE873D4000000000004053400000000000003D4066666666666653400000000000803D409A99999999C9524048E17A14AE873D40
Actual:
table_oid
From the pgAdmin 4 project tracker:
Temporary solution until the next release, tested on my Ubuntu 18.04 machine:
Replace these two files:
/usr/share/pgadmin4/web/pgadmin/tools/sqleditor/__init__.py
/usr/share/pgadmin4/web/pgadmin/tools/sqleditor/command.py
Link to files:
__init__.py
command.py
Note:
The first file is the sqleditor module's __init__.py.
Try SQuirreL SQL, a universal SQL client. It's extremely useful; I use it to access SQL Server, PostgreSQL, MySQL, and Access. It's not as good-looking as pgAdmin 4.
1. Install Java first, if not already installed.
2. Install SQuirreL SQL.
3. Download the latest PostgreSQL JDBC driver, e.g. postgresql-42.2.6.jar, and put it in a convenient location.
4. Open/start SQuirreL.
5. Click the Drivers tab and scroll down to PostgreSQL. Double-click PostgreSQL. A "Change Driver: PostgreSQL" dialog box/window will open.
6. Click the Extra Class Path tab and click the Add button. Navigate to and choose the PostgreSQL JDBC driver that was downloaded in step 3 above.
7. Click the List Drivers button; "org.postgresql.Driver" should appear in the Class Name drop-down box.
8. Click OK.
Setup PostgreSQL JDBC Driver
The driver should now be set up.
Click the Aliases tab to set up a connection to your database. See my example screenshots.
Setup Database Connection
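For the alias, the PostgreSQL JDBC driver expects a URL of the form jdbc:postgresql://<host>:<port>/<database>, for example (host, port and database name here are placeholders):
jdbc:postgresql://localhost:5432/mydb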

Can db2 import or load be used to populate DashDB?

I'm looking to bulk load millions of rows into a DashDB database. After connecting using the DB2 CLI, I enter a command like:
db2 import from rowsToImport.csv of del insert into MY_TABLE
with results:
SQL0551N "DASHXXX" does not have the required authorization or privilege to
perform operation "BIND" on object "NULLID.SQLUAJ19". SQLSTATE=42501
Is this an inherent limitation with DashDB, or is something configured incorrectly on my client? I get a similar message when trying db2 load:
SQL2019N An error occurred while utilities were being bound to the database.
P.S. I'm aware of the REST client API for DashDB for loading data; I'm asking specifically how/if bulk loads can be done with the DB2 command line as an alternative option.
As per the dashDB documentation, you can use the Command Line Processor Plus (CLPPlus). It is included in the dashDB driver package and provides a command-line user interface that you can use to connect to the dashDB database, BLUDB. You can use CLPPlus to define, edit, and run statements, scripts, and commands. Please also take a look at Connecting CLPPlus to the dashDB database to see how to connect and use the CLI.
Please note that in CLPPlus the IMPORT, EXPORT and LOAD commands have the restriction that the processed files must be on the server (see here), so you would have to copy the input load file onto the remote server first with SCP. However, the SSH/SCP protocol should be blocked (not accessible) for a normal dashDB user.
Only geospatial data can be loaded from your local machine to dashDB, using the IDA LOADGEOSPATIALDATA command in CLPPlus.
The file to be loaded into dashDB using that command can be in the local file system, accessible to the CLPPlus user.
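As a rough sketch of what this looks like (the user name and host below are placeholders; BLUDB is the dashDB database name and 50000 a typical non-SSL port):
clpplus myuser@dashdb-host.example.com:50000/BLUDB
Once connected, you would run IMPORT/EXPORT/LOAD (subject to the server-side file restriction above) or IDA LOADGEOSPATIALDATA at the SQL> prompt.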
Alternative ways to do that are:
dashDB REST API (as you already mentioned). See Load delimited data using the REST API and cURL.
load the csv directly from the dashDB dashboard on Bluemix. See Loading data from the desktop into IBM dashDB.
load the csv using IBM Data Studio. See dashDB large file load using IBM Data Studio.
According to this technote, the package NULLID.SQLUAJ19 belongs to one of the early DB2 10.1 fix packs, so I suspect your client version is 10.1. When attempting to execute the IMPORT command it needs to bind some packages of that older version, since dashDB is obviously DB2 10.5.
You may want to try installing the latest DB2 client fix pack, as the necessary packages may already be bound in the database.
To verify that, you could run:
select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%'
You should see "SQLUAK20", which seems to be the corresponding package in DB2 10.5.
If that doesn't work, your other option might be to move to a dedicated dashDB instance, as you won't have sufficient privileges to bind missing packages in the entry-level shared dashDB service.

Replic-Action log error Cannot open Link: TNS Could not resolve the service name

Hi, I am working with the Replic-Action tool to transfer data from a Lotus Notes view to an Oracle database.
When I create the link document for the Oracle DB, it is created successfully without any error.
When I create the Include Table for the Oracle DB, it is created successfully and all columns are listed.
When I create the Replication, it is also created successfully.
But when the job executes, it gives this error in the log:
05/08/2012 01:37:16 AM Starting Replication: BADtoProductPortal
05/08/2012 01:37:19 AM Error: <ODBC Error> [DataDirect][ODBC Oracle driver][Oracle]ORA-12154: TNS:could not resolve service name
05/08/2012 01:37:19 AM Error: Information: Unable to open Link: PPLink
05/08/2012 01:37:19 AM Error: Replication to Link <PPLink> did not complete
05/08/2012 01:37:20 AM End of Replication: BADtoProductPortal
If the error were with the service name, then I think we should not be able to create the link document either.
When I use an ODBC connection for the link, I am unable to create the Replication job; it gives an error like: Notes data field "ID" does not match the source data field.
But I know it was working before.
I suggest checking that the TASK running the job uses the same TNS entry as you do when running it "manually".
I also suggest checking that the TASK has access to your Oracle driver. Does this task have the rights to run it?
The ORA-12154 error is thrown during the logon process to a database. It indicates that the communication software (TNS) in Oracle (SQL*Net or Net8) did not recognize the host/service name specified in the connection parameters.
So the issue is clearly a kind of "environment difference" between your configuration when you run the replication manually and when the job runs it.
Hope this helps.
I'm assuming here that when you successfully replicate, you're doing it manually from your local machine, and when the job fails, it's running scheduled on a server. If that's the case, I agree with Emmanuel. Remember that running the job locally uses the local tnsnames.ora file, while running it scheduled uses the tnsnames.ora file on the server. You may not be aware of anything changing, but are you responsible for maintenance on the server?
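For reference, a TNS service name is resolved from an entry like the one below in tnsnames.ora; the alias, host and service name are placeholders, and the alias is what the Replic-Action link / ODBC DSN must reference. If the entry exists in your local tnsnames.ora but not in the one used by the server/scheduled task, you get exactly ORA-12154.
PPLINK =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = oradb.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = proddb.example.com))
  )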