Export ixf in Db2

EXPORT TO myFile.ixf OF ixf SELECT * FROM TABLE_NAME WHERE SSN='DATA' AND EMPLOYER_ID=DATA AND CREATED_TS='DATA'
I am using this statement to export a couple of rows; for privacy purposes, DATA has been substituted where necessary. However, the following error is produced. I have followed IBM's guide on EXPORT and feel this should be correct, but I am unsure what is wrong. The error log is as follows:
Error: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=myFile;EXPORT TO ;JOIN, DRIVER=3.53.70
SQLState: 42601
ErrorCode: -104

As already remarked, you cannot directly run Db2-commands (such as import, export, load, etc.) from plain SQL, as you are trying to do via JDBC.
Instead, if your Db2-server runs on Linux/Unix/Windows, you can either use a stored procedure, or (for any Db2-server operating system) you can use the command-line.
However, when you use stored-procedure SYSPROC.ADMIN_CMD for Db2-LUW, all file-names in stored-procedure parameters are relative to the Db2-server (and not your remote jdbc-client, if you are running remotely).
That means that after a successful export via the stored procedure, if you need the exported IXF file on your workstation, you must transfer it there using whatever tools you have for that purpose.
For example, this shows an export on Unix to an IXF file in /tmp on the Db2-server:
call sysproc.admin_cmd('EXPORT TO /tmp/myFile.ixf OF ixf SELECT * FROM user1.stk1 with ur') ;
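If you then need that file on your workstation, any file-transfer tool will do; a minimal sketch using scp (hostname and account are placeholders):
scp db2inst1@db2server.example.com:/tmp/myFile.ixf .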
If you don't want to use a stored procedure, you must use the command-line shell (for example, db2ntcmd.bat on Windows, or bash or ksh on Unix), connect to the database in the shell, and perform the export there. This requires the workstation to have a Db2-client and the relevant database and node to be catalogued first.
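As an illustration, the command-line route might look like this (hostname, port, and object names are placeholders, assuming a Db2-client is installed on the workstation):
db2 catalog tcpip node mynode remote db2server.example.com server 50000
db2 catalog database sample as sample at node mynode
db2 connect to sample user user1
db2 "EXPORT TO myFile.ixf OF ixf SELECT * FROM user1.stk1 WITH UR"
In this case myFile.ixf is written on the workstation running the commands, not on the Db2-server.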
If you specify your Db2-version and the operating-system on which your Db2-server runs, then you will get more details.

Related

Error on loading data to remote DB2 server

I'm new to Db2. I'm trying to send data from remote Db2 server A to remote Db2 server B using a Java-based application. I was able to fetch the data from server A and store it in the control/data files, but when I try to send the data to server B, I get the following exception.
com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=EXTERNAL;T_DATA SELECT * FROM;<table_expr>, DRIVER=4.26.14
The control file has the command:
INSERT INTO <TABLE_NAME> SELECT * FROM EXTERNAL '<PATH_TO_DATAFILE>'
USING (DELIMITER '\t' FORMAT TEXT SOCKETBUFSIZE 100 REMOTESOURCE 'JDBC')
The data file contains records whose values are separated by tabs.
Both servers A and B are running Db2 v9.5.
The failure was caused by the target server B running an out-of-support version of Db2 (v9.5) that has no ability to understand external tables. Hence it reported (correctly) sqlcode -104 on the token EXTERNAL, which it did not understand.
So the design is incorrect for the Db2-versions available at your site. You can only use external tables in recent Db2-LUW versions (v11.5 or later).
Depending on the tools available, you can use commands (external tools, not SQL) to export data from the source and load it into the target. Additionally, if there is direct network connectivity between server A and server B, an administrator can arrange federation between them, allowing direct inserts.
Db2 v9.5 also supported load from cursor and load from remote cursor (although there were problems, long since fixed in newer versions).
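For reference, a load from a remote cursor looks roughly like this in the CLP (database names, credentials, and tables are placeholders):
db2 connect to targetdb
db2 "DECLARE mycurs CURSOR DATABASE sourcedb USER user1 USING passwd FOR SELECT * FROM schema1.src_table"
db2 "LOAD FROM mycurs OF CURSOR INSERT INTO schema1.tgt_table"
Both statements must run in the same CLP session, since the LOAD references the declared cursor.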

How to make Oracle SQL Developer export NLS-safe SQL dumps?

I used Tools -> Database Export in Oracle SQL Developer 18.2.0 to generate a full schema and data dump.
Then I attempted to use that dump in a shell script calling sqlplus and got the following error:
Insert into CONN.ACCOUNT (ID,CUSTOMER_ID,LAST_MODIFIED) values ('1','1',
to_timestamp('2018.09.06 17:45:29,000000000','RRRR.MM.DD HH24:MI:SSXFF'))
*
ERROR at line 1:
ORA-01830: date format picture ends before converting entire input string
Most probably, that was caused by NLS settings in Oracle SQL Developer 18.2.0. I did not touch them; SQL Developer seems to have picked them up from the default Windows settings.
Is there any way to make Oracle SQL Developer generate safe export dumps that can later be imported through sqlplus without manually hunting down all the required NLS settings and adding them at the beginning of the SQL dump file?
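One common workaround, assuming the failure really is the fractional-seconds separator (the ',000000000' in the generated dump), is to pin the session NLS setting at the top of the dump before feeding it to sqlplus:
-- prepended to the dump; the ',.' value is an assumption matching SQL Developer's comma decimal separator
ALTER SESSION SET NLS_NUMERIC_CHARACTERS = ',.';
This does not make SQL Developer itself emit NLS-safe dumps; it only makes the importing session match the settings the dump was generated with.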

Import data fails in DB2

I'm using Data Studio to connect to a Db2 server. When I try to use the 'import utility' in Data Studio, it succeeds with a warning, and the result shows that no record has been inserted into the database. The import wizard generates the following SQL command:
CALL SYSPROC.ADMIN_CMD( 'IMPORT FROM "/home/xyz/backup/TRANSACTION" OF DEL MODIFIED BY coldel| delprioritychar INSERT INTO S.TRANSACTION' );
If I copy this command, paste it into an SQL script in DB2, and run it, it gives another error:
An I/O error (reason = "sqlofopn -2029060079") occurred while opening the input file.. SQLCODE=-3030, SQLSTATE=
If I use the db2 shell to execute the IMPORT part of the command (without CALL SYSPROC.ADMIN_CMD), it succeeds without any issue. What is wrong here?
When you (or Data Studio) run SYSPROC.ADMIN_CMD (which is the default method used by Data Studio for import), the action happens on the Db2-server using the account of the Db2-instance-owner (for Db2-LUW).
That account (for example db2inst1) requires read access to the specified filename. In your case, the Db2-instance owner did not have access to the file (and/or the path containing the file), so the exception got thrown.
You may see additional detail in the Db2-server diagnostic file (db2diag.log) for the failed action, depending on the diagnostics level that is active on the Db2-server.
ADMIN_CMD expects the input file to be on the server, because it (like any other stored procedure) runs on the server; it has no access to your local file system.
Commands you run in the Db2 command line processor execute on the client and therefore can access the file locally.
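A typical fix, assuming the instance owner is db2inst1 and the path from the question, is to grant it read access on the Db2-server:
# run on the Db2-server; user name and path are examples
chmod o+r /home/xyz/backup/TRANSACTION
chmod o+x /home/xyz /home/xyz/backup
After that, the ADMIN_CMD-based import should be able to open the file.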

SQL DATABASE (PostgreSQL)

ERROR: could not open file "C:\Users\lenovo\Downloads\Owners.csv" for reading: Permission denied
HINT: COPY FROM instructs the PostgreSQL server process to read a file. You may want a client-side facility such as psql's \copy.
SQL state: 42501
I am trying to import a CSV file into PostgreSQL, but this error pops up. I have searched everywhere but couldn't find the answer. Please help me.
Thanks in advance!
COPY mytable FROM '/path/thefile.csv' WITH CSV HEADER; is executed by the DBMS server, so the .csv file is read by the server. The server (typically) runs as user postgres, which cannot access arbitrary users' files. (Also, the client and server don't have to be running on the same machine.) There are two possible solutions to this:
copy the csv-file to a place where the server can access it, such as /tmp/ or somewhere under its home-directory.
use psql's \copy mytable(col1,col2,...) FROM '/path/file.csv'... which has slightly different syntax but reads the file on the client side, as sketched below.
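Applied to the file from the question (the table name Owners is an assumption), the client-side variant would be run inside psql:
\copy Owners FROM 'C:\Users\lenovo\Downloads\Owners.csv' WITH CSV HEADER
Because \copy reads the file with the psql client's own permissions, the server-side "Permission denied" error goes away.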

dashDB and DB2 Load operation

I am currently trying to use a dashDB database with the db2cli utility and ODBC (the values are from Connect/Connection Information on the dashDB web console). At the moment I can run SELECT or INSERT statements perfectly well and fetch data from custom tables that I have created, thanks to the command:
db2cli execsql -connstring "DRIVER={IBM DB2 ODBC DRIVER - IBMDBCL1}; DATABASE=BLUDB; HOSTNAME=yp-dashdb-small-01-lon02.services.eu-gb.bluemix.net; PORT=50000; PROTOCOL=TCPIP; UID=xxxxxx; PWD=xxxxxx" -inputsql /tmp/input.sql
Now I am trying to do a DB2 LOAD operation through the db2cli utility, but I don't know how to proceed or even if it is possible to do so.
The aim is to import data from a file without cataloging the DB2 dashDB database on my side, but only through ODBC. Does someone know if this kind of operation is possible (with db2cli or another utility)?
The latest API version referenced from the DB2 on Cloud (formerly dashDB) dashboard is available here. It requires first calling the /auth/tokens endpoint to generate an auth token from your Bluemix credentials, which is then used to authorize the API calls.
I've recently published an npm module - db2-rest-client - to simplify the usage of these operations. For example, to load data from a .csv file you can use the following commands:
# install the module globally
npm i db2-rest-client -g
# call the load job
export DB_USERID='<USERID>'
export DB_PASSWORD='<PASSWORD>'
export DB_URI='https://<HOSTNAME>/dbapi/v3'
export DEBUG=db2-rest-client:cli
db2-rest-client load --file=mydata.csv --table='MY_TABLE' --schema='MY_SCHEMA'
For the load job, a test on Bluemix dedicated with a 70 MB source file and about 4 million rows took about 4 minutes to load. There are also other CLI options, such as executing export statements, running comma-separated statements, and uploading files.
This is not possible. LOAD is not an SQL statement, therefore it cannot be executed via an SQL interface such as ODBC; it can only be run through the DB2 CLP, which in turn requires a cataloged database.
ADMIN_CMD() can be invoked via an SQL interface; however, it requires that the input file be on the server -- it won't work with a file stored on your workstation.
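For illustration, a LOAD routed through ADMIN_CMD over the same SQL interface would look like this (the server-side path and table name are placeholders):
call sysproc.admin_cmd('LOAD FROM /serverpath/data.del OF DEL INSERT INTO MYSCHEMA.MYTABLE')
Again, /serverpath/data.del must exist and be readable on the Db2-server itself.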
If JDBC is an option, you could use the CLPPlus IMPORT command.
You can try loading data using the REST API.
Example:
curl --user dashXXX:XXXXXX -H "Content-Type: multipart/form-data" -X POST -F loadFile1=@"/home/yogesh/Downloads/datasets/order_details_0.csv" "https://yp-dashdb-small-01-lon02.services.eu-gb.bluemix.net:8443/dashdb-api/load/local/del/dashXXX.ORDER_DETAILS?hasHeaderRow=true&timestampFormat=YYYY-MM-DD%20HH:MM:SS.U"
I have used the REST API and have not seen any size limitations. In version 1.11 of dashDB Local (warehouse db), external tables were included. As long as the file is on the container, it can be loaded. Also, DB2 LOAD locks the table until the load is finished, whereas an external-table load won't.
There are a number of ways to get data into Db2 Warehouse on Cloud. From a command line you can use the Lift CLI ( https://lift.ng.bluemix.net/ ), which provides the best performance for large data sets.
You can also use EXTERNAL TABLEs ( https://www.ibm.com/support/knowledgecenter/ean/SS6NHC/com.ibm.swg.im.dashdb.sql.ref.doc/doc/r_create_ext_table.html ), which are also high-performance and have lots of options.
This is a quick example using a local file (not on the server), hence the REMOTESOURCE YES option:
db2 "create table foo(i int)"
echo "1" > /tmp/foo.csv
db2 "insert into foo select * from external '/tmp/foo.csv' using (REMOTESOURCE YES)"
db2 "select * from foo"
I
-----------
1
1 record(s) selected.
For large files, you can use gzip, either on the fly:
db2 "insert into foo select * from external '/tmp/foo.csv' using (REMOTESOURCE GZIP)"
or from gzip'ed files:
gzip /tmp/foo.csv
db2 "insert into foo select * from external '/tmp/foo.csv.gz' using (REMOTESOURCE YES)"