Error on loading data to remote DB2 server

I'm new to Db2. I'm trying to send data from a remote Db2 server A to a remote Db2 server B using a Java-based application. I was able to fetch the data from server A and store it in the control/data files, but when I try to send the data to server B, I get the following exception.
com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=EXTERNAL;T_DATA SELECT * FROM;<table_expr>, DRIVER=4.26.14
The control file has the command:
INSERT INTO <TABLE_NAME> SELECT * FROM EXTERNAL '<PATH_TO_DATAFILE>'
USING (DELIMITER '\t' FORMAT TEXT SOCKETBUFSIZE 100 REMOTESOURCE 'JDBC')
The data file contains records whose values are separated by tabs.
Both servers A and B are running Db2 v9.5.

The failure was caused by the target server B running an out-of-support version of Db2 (v9.5) that has no ability to understand external tables. Hence it reported (correctly) SQLCODE -104 on the token EXTERNAL, which it did not understand.
So the design is incorrect for the Db2 versions available at your site. You can only use external tables in recent Db2-LUW versions (v11.5 and later).
Depending on the tools available, you can use commands (external tools, not SQL) to export data from the source and load it into the target. Additionally, if there is direct network connectivity between server A and server B, then an administrator can arrange federation between them, allowing direct inserts.
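For illustration, a minimal sketch of that export-then-load flow with the Db2 CLP, assuming both databases are cataloged on the machine running the commands; the database, schema, table, and file names here are placeholders:
# export from the source database to a delimited file (placeholder names)
db2 CONNECT TO SRCDB
db2 "EXPORT TO /tmp/t_data.del OF DEL SELECT * FROM MYSCHEMA.T_DATA"
# load the file into the target database
db2 CONNECT TO TGTDB
db2 "LOAD FROM /tmp/t_data.del OF DEL INSERT INTO MYSCHEMA.T_DATA"
db2 TERMINATE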
Db2 v9.5 also supported load from cursor, and load from remote cursor (although there were problems, long since fixed in newer versions).
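A sketch of load-from-remote-cursor run against the target database, with the database, credential, schema, and table names all placeholders:
# connected to the target database; SRCDB is the cataloged remote source
db2 CONNECT TO TGTDB
db2 "DECLARE C1 CURSOR DATABASE SRCDB USER user1 USING passwd FOR SELECT * FROM MYSCHEMA.T_DATA"
db2 "LOAD FROM C1 OF CURSOR INSERT INTO MYSCHEMA.T_DATA"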

Related

dBeaver (CE): DB2 LUW Connection with SQL ERROR 42704. Table Schema won't open but able to write SQL queries

After about a year and a half, I am finally able to connect to the DB2 database we have through DBeaver. The connection is successful as LUW (our Db2 is z/OS). I was able to get the required drivers after installing IBM Data Studio.
Once I am connected, I go down the schema, get to Tables, and on clicking that, I get the error below.
SQL Error [42704]: SYSCAT.SCHEMATA IS AN UNDEFINED NAME. SQLCODE=-204, SQLSTATE=42704, DRIVER=3.69.56
SYSCAT.SCHEMATA IS AN UNDEFINED NAME. SQLCODE=-204, SQLSTATE=42704, DRIVER=3.69.56
THE DESCRIBE STATEMENT DOES NOT SPECIFY A PREPARED STATEMENT. SQLCODE=-516, SQLSTATE=26501, DRIVER=3.69.56
THE CURSOR SQL_CURLH200C1 IS NOT IN A PREPARED STATE. SQLCODE=-514, SQLSTATE=26501, DRIVER=3.69.56
However, if I ignore the error, go to a new SQL query, and write a simple
Select * from schema.table
it works fine and I get the results I want.
Considering the time I have spent to get this far, this is sufficient, but to deploy this as a solution in my department, I need to be able to look at a table list (schema).
Any help would be awesome.
EDIT1: The issue here is that there is no schema with the name SYSCAT and no table named SCHEMATA.
The Db2 for z/OS catalog has different names than the ones used on Db2 for distributed platforms (Linux, UNIX, Windows, aka LUW). Here is a list of objects on Db2 for z/OS that you can review.
It looks like you are using DBeaver to navigate, through a UI, the objects on Db2 for z/OS. You will need to ensure you have a Db2 JCC driver that is for Db2 for z/OS. It looks like you may be using one for LUW, as SYSCAT.SCHEMATA is an LUW object, not a z/OS object.
Your other query works because you are specifying a known table name; other queries should be fine. The issue is that the DBeaver interface is looking at Db2 system objects for LUW and not z/OS. This will continue until you are able to resolve the driver issue.
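To illustrate the mismatch, here is the kind of LUW catalog query DBeaver issues, next to a rough z/OS equivalent (a sketch; the exact columns you want may differ):
-- Db2 LUW catalog view (fails on z/OS with SQLCODE -204):
SELECT SCHEMANAME FROM SYSCAT.SCHEMATA;
-- rough Db2 for z/OS equivalent, using the SYSIBM catalog tables:
SELECT DISTINCT CREATOR FROM SYSIBM.SYSTABLES;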
The IBM Data Server drivers also require server-side setup. Please see this information: https://www.ibm.com/support/knowledgecenter/SSEPEK_12.0.0/java/src/tpc/imjcc_jccenablespsandtables.html
In DBeaver, when connecting to DB2 for z/OS, choose the "DB2 z/OS driver" option under the Db2 drop-down.
BTW, DBeaver can shell-share with Data Studio, so you can (if you wish) use both products in one install. No guarantees that they share happily in all cases, but it appears to work reasonably well.

Oracle SQL Developer database diff doesn’t list my connections

I'm trying to perform a diff on two DB2 schemas, and when I try to select my source and destination connections, it doesn't list my NEW connection. I'm using the latest version. I can connect to DB2 manually and query as well, but I just can't select that connection during a database diff.
Where are the connections saved in SQL Developer?
That feature is reserved for Oracle Database connections.
DB2 connectivity is provided for migrations to Oracle only.
We have limited support for DB2 in the Data Modeler (which is part of SQL Developer), and you can compare models, but the generation of DDL synch scripts is reserved for Oracle data models only.

Export ixf in db2

EXPORT TO myFile.ixf OF ixf SELECT * FROM TABLE_NAME WHERE SSN='DATA' AND EMPLOYER_ID=DATA AND CREATED_TS='DATA'
I am using this statement to export a couple of rows. For privacy purposes, DATA has been inserted where necessary. However, the following error is produced. I have followed IBM's guide on export and feel this should be correct, but I am unsure exactly what is wrong. The error log is as follows:
Error: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=myFile;EXPORT TO ;JOIN, DRIVER=3.53.70
SQLState: 42601
ErrorCode: -104
As already remarked, you cannot directly run Db2 commands (such as IMPORT, EXPORT, LOAD, etc.) from plain SQL, as you are trying to do via JDBC.
Instead, if your Db2-server runs on Linux/Unix/Windows, you can either use a stored procedure, or (for any Db2-server operating system) you can use the command line.
However, when you use the stored procedure SYSPROC.ADMIN_CMD for Db2-LUW, all file names in stored-procedure parameters are relative to the Db2-server (and not to your remote JDBC client, if you are running remotely).
That means that after a successful export via the stored procedure, if you really need the exported IXF file to be on your workstation, you must transfer it there using whatever tools you have for that purpose.
For example, this shows an export on Unix to an IXF file in /tmp on the Db2-server:
call sysproc.admin_cmd('EXPORT TO /tmp/myFile.ixf OF ixf SELECT * FROM user1.stk1 with ur') ;
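To then pull the exported file down to your workstation, any file-transfer tool you have will do; for example (hypothetical host and account):
scp user1@db2server.example.com:/tmp/myFile.ixf .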
If you don't want to use a stored procedure, you must use the command-line shell (for example, on Windows use db2ntcmd.bat, or on Unix use bash or ksh), connect to the database in the shell, and perform the export there. This requires the workstation to have a Db2 client, and also that the relevant database and node be catalogued first.
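A minimal sketch of that command-line route, with the node, host, port, and database names as placeholders:
# catalog the remote node and database once, then connect and export
db2 CATALOG TCPIP NODE SRVNODE REMOTE db2server.example.com SERVER 50000
db2 CATALOG DATABASE MYDB AT NODE SRVNODE
db2 CONNECT TO MYDB USER user1
db2 "EXPORT TO myFile.ixf OF IXF SELECT * FROM user1.stk1 WITH UR"
db2 TERMINATE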
If you specify your Db2-version and the operating-system on which your Db2-server runs, then you will get more details.

dashDB and DB2 Load operation

I am currently trying to use a dashDB database with the db2cli utility and ODBC (values are from Connect/Connection Information on the dashDB web console). At the moment I can do SELECT or INSERT statements and fetch data from custom tables which I have created without any issue, thanks to the command:
db2cli execsql -connstring "DRIVER={IBM DB2 ODBC DRIVER - IBMDBCL1}; DATABASE=BLUDB; HOSTNAME=yp-dashdb-small-01-lon02.services.eu-gb.bluemix.net; PORT=50000; PROTOCOL=TCPIP; UID=xxxxxx; PWD=xxxxxx" -inputsql /tmp/input.sql
Now I am trying to do a DB2 LOAD operation through the db2cli utility, but I don't know how to proceed or even if it is possible to do so.
The aim is to import data from a file without cataloging the DB2 dashDB database on my side, but only through ODBC. Does someone know if this kind of operation is possible (with db2cli or another utility)?
The latest API version referenced from the Db2 on Cloud (formerly dashDB) dashboard is available here. It requires you to first call the /auth/tokens endpoint to generate an auth token, based on your Bluemix credentials, which is then used to authorize the API calls.
I've recently published an npm module - db2-rest-client - to simplify the usage of these operations. For example, to load data from a .csv file you can use the following commands:
# install the module globally
npm i db2-rest-client -g
# call the load job
export DB_USERID='<USERID>'
export DB_PASSWORD='<PASSWORD>'
export DB_URI='https://<HOSTNAME>/dbapi/v3'
export DEBUG=db2-rest-client:cli
db2-rest-client load --file=mydata.csv --table='MY_TABLE' --schema='MY_SCHEMA'
For the load job, a test on Bluemix dedicated with a 70 MB source file and about 4 million rows took about 4 minutes. There are also other CLI options, such as executing an export statement, running comma-separated statements, and uploading files.
This is not possible. LOAD is not an SQL statement, therefore it cannot be executed via an SQL interface such as ODBC, only via the DB2 CLP, which in turn requires a cataloged database.
ADMIN_CMD() can be invoked via an SQL interface; however, it requires that the input file be on the server -- it won't work with a file stored on your workstation.
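For instance, a sketch of the ADMIN_CMD route, assuming the delimited file already sits on the dashDB server (the path and table name are placeholders):
CALL SYSPROC.ADMIN_CMD('LOAD FROM /path/on/server/data.del OF DEL INSERT INTO MY_TABLE');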
If JDBC is an option, you could use the CLPPlus IMPORT command.
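A sketch of that CLPPlus route, with the connection values and file/table names as placeholders, and assuming CLPPlus accepts CLP-style IMPORT syntax here:
# start a CLPPlus session from the client
clpplus <USERID>@<HOSTNAME>:50000/BLUDB
-- then, at the CLPPlus prompt:
IMPORT FROM myData.del OF DEL INSERT INTO MY_TABLE;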
You can try loading data using REST API.
Example:
curl --user dashXXX:XXXXXX -H "Content-Type: multipart/form-data" -X POST -F loadFile1=@"/home/yogesh/Downloads/datasets/order_details_0.csv" "https://yp-dashdb-small-01-lon02.services.eu-gb.bluemix.net:8443/dashdb-api/load/local/del/dashXXX.ORDER_DETAILS?hasHeaderRow=true&timestampFormat=YYYY-MM-DD%20HH:MM:SS.U"
I have used the REST API and have not seen any size limitations. In version 1.11 of dashDB Local (the warehouse DB), external tables have been included. As long as the file is on the container, it can be loaded. Also, a DB2 LOAD locks the table until the load is finished, whereas an external-table load won't.
There are a number of ways to get data into Db2 Warehouse on Cloud. From a command line you can use the Lift CLI (https://lift.ng.bluemix.net/), which provides the best performance for large data sets.
You can also use EXTERNAL TABLEs (https://www.ibm.com/support/knowledgecenter/ean/SS6NHC/com.ibm.swg.im.dashdb.sql.ref.doc/doc/r_create_ext_table.html), which are also high performance and have lots of options.
This is a quick example using a local file (not on the server), hence the REMOTESOURCE YES option:
db2 "create table foo(i int)"
echo "1" > /tmp/foo.csv
db2 "insert into foo select * from external '/tmp/foo.csv' using (REMOTESOURCE YES)"
db2 "select * from foo"
I
-----------
1
1 record(s) selected.
For large files, you can use gzip, either on the fly:
db2 "insert into foo select * from external '/tmp/foo.csv' using (REMOTESOURCE GZIP)"
or from gzip'ed files:
gzip /tmp/foo.csv
db2 "insert into foo select * from external '/tmp/foo2.csv.gz' using (REMOTESOURCE YES)"

Can db2 import or load be used to populate DashDB?

I'm looking to bulk load millions of rows into a dashDB database. After connecting using the DB2 CLI, I enter a command like:
db2 import from rowsToImport.csv of del insert into MY_TABLE
with results:
SQL0551N "DASHXXX" does not have the required authorization or privilege to
perform operation "BIND" on object "NULLID.SQLUAJ19". SQLSTATE=42501
Is this an inherent limitation with DashDB, or is something configured incorrectly on my client? I get a similar message when trying db2 load:
SQL2019N An error occurred while utilities were being bound to the database.
P.S. I'm aware of the REST client API for dashDB for loading data - I'm asking specifically how/if bulk loads can be done with the DB2 command line as an alternative option.
As per the dashDB documentation, you can use the Command Line Processor Plus (CLPPlus). It is included in the dashDB driver package and provides a command-line user interface that you can use to connect to the dashDB database, BLUDB. You can use CLPPlus to define, edit, and run statements, scripts, and commands. Please also take a look at Connecting CLPPlus to the dashDB database to see how to connect and use the CLI.
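For example, a connection sketch with placeholder values (the port and database name come from the dashDB connection page):
clpplus <USERID>@<HOSTNAME>:50000/BLUDB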
Please note that in CLPPlus the IMPORT, EXPORT, and LOAD commands have a restriction that processed files must be on the server: see here. So you would need to copy the input load file onto the remote server first with SCP. However, the SSH/SCP protocol should be blocked (not accessible) for a normal dashDB user.
Only geospatial data can be loaded from your local machine to dashDB, using the IDA LOADGEOSPATIALDATA command in CLPPlus. The file to be loaded with that command can be in the local file system, accessible to the CLPPlus user.
Alternative ways to do that are:
dashDB REST API (as you already mentioned). See Load delimited data using the REST API and cURL.
load the csv directly from the dashDB dashboard on Bluemix. See Loading data from the desktop into IBM dashDB.
load the csv using IBM Data Studio. See dashDB large file load using IBM Data Studio.
According to this technote, the package NULLID.SQLUAJ19 belongs to one of the early DB2 10.1 fix packs, so I suspect your client version is 10.1. When attempting to execute the IMPORT command, the client needs to bind some packages of that older version, since dashDB is DB2 10.5, obviously.
You may want to try installing the latest DB2 client fix pack, as the necessary packages may be already bound in the database.
To verify that, you could run the following query; you should see "SQLUAK20", which seems to be the corresponding package in DB2 10.5.
select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%'
If that doesn't work, your other option might be to move to a dedicated dashDB instance, as you won't have sufficient privileges to bind missing packages in the entry-level shared dashDB service.