Import data fails in DB2 - import

I'm using Data Studio to connect to a DB2 server. When I try to use the import utility in Data Studio, it succeeds with a warning, but the results show that no records have been inserted into the database. The Import wizard generates the following SQL command:
CALL SYSPROC.ADMIN_CMD( 'IMPORT FROM "/home/xyz/backup/TRANSACTION" OF DEL MODIFIED BY coldel| delprioritychar INSERT INTO S.TRANSACTION' );
If I copy this command and paste it into a SQL script in DB2 and then run it, it gives another error:
An I/O error (reason = "sqlofopn -2029060079") occurred while opening the input file.. SQLCODE=-3030, SQLSTATE=
If I use the db2 shell to execute the IMPORT part of the command (without CALL SYSPROC.ADMIN_CMD), it succeeds without any issue. What is wrong here?

When you (or Data Studio) run SYSPROC.ADMIN_CMD (which is the default method used by Data Studio for import), the action happens on the Db2-server using the account of the Db2-instance-owner (for Db2-LUW).
That account (for example db2inst1) requires read access to the specified filename. In your case, the Db2-instance owner did not have access to the file (and/or the path containing the file), so the exception got thrown.
You may see additional detail in the Db2-server diagnostic file (db2diag.log) for the failed action, depending on the diagnostics level that is active on the Db2-server.
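For example, a minimal fix (assuming the path from the question and db2inst1 as the instance owner) is to make the file, and every directory leading to it, readable by that account on the Db2-server:
chmod o+rx /home/xyz /home/xyz/backup
chmod o+r /home/xyz/backup/TRANSACTION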

ADMIN_CMD expects the input file to be on the server, because it (like any other stored procedure) runs on the server; it has no access to your local file system.
Commands you run in the Db2 command line processor execute on the client and therefore can access the file locally.
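As an illustration (reusing the file path and table from the question), the same IMPORT resolves the path on the client in the first form and on the server in the second:
db2 "IMPORT FROM /home/xyz/backup/TRANSACTION OF DEL MODIFIED BY coldel| delprioritychar INSERT INTO S.TRANSACTION"
db2 "CALL SYSPROC.ADMIN_CMD('IMPORT FROM /home/xyz/backup/TRANSACTION OF DEL MODIFIED BY coldel| delprioritychar INSERT INTO S.TRANSACTION')"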

Related

Datastage Oracle Importing Table

I have an error when I use DataStage to connect to Oracle and import a table definition. Below is the detailed situation.
Environment:
OS: AIX 6.1, 64-bit, POWER6 processor, LANG=en_US
DataStage version: 8.5
Installation profile
All three tiers are installed on the same machine; the repository uses DB2 (the default).
Oracle Client 11.2 (64-bit) is also installed on this machine; I can use SQL*Plus to connect to the Oracle server (11.2, 64-bit, AL32UTF8) on another machine.
"dsenv" setting
add "/oracle/product/11.2.0-64/lib" to the "LIBPATH"
add "export TNS_ADMIN=/oracle/product/11.2.0-64/network/admin"
Problem
1. I use Oracle Connector (parallel) to create a link, then use this link to import metadata. When I press Test connection, a dialog with "The OCI function OraOCIEnvNlsCreate:OCI_UTF16ID returned status -1. Error code: NULL, Error message: NULL" pops up, and the connection fails.
2. I use Oracle Enterprise (parallel) to create a link, then use it to import metadata. When I click the ellipsis button to list all the tables in the target database, a dialog with "cannot get list of table names from database" pops up; after I click OK on this dialog, the detailed error message pops up:
12:37:21(002) Unable to access database oracleLibrary orchoracle could not be loaded; Could not load "orchoracle": 
0509-022 Cannot load module /opt/IBM/InformationServer/Server/DSComponents/bin/orchoracle.o.
0509-150 Dependent module /opt/IBM/InformationServer/Server/DSComponents/bin/libclntsh.so could not be loaded.
0509-103 The module has an invalid magic number.
0509-022 Cannot load module /opt/IBM/InformationServer/Server/DSComponents/bin/orchoracle.o.
0509-150 Dependent module /opt/IBM/InformationServer/Server/DSComponents/bin/orchoracle.o could not be loaded.
From the message I found that DataStage searches for some files in DSComponents/bin, but these files are in the Oracle bin directory. I can't find the error in the dsenv file, so I copied these files into DSComponents/bin; this time the error message changed to "OCI_ERROR: Bad Oracle environment".
I am not sure which environment variable I missed; please tell me.
3. When I use Oracle OCI (Server) to create a link and import a table, it works fine.
So my question is: why can't I use the Oracle Connector and Oracle Enterprise stages to connect to Oracle? Thanks.
Yes, the PATH variable needs to include $ORACLE_HOME/bin. Adding this variable to the dsenv file and recycling all services fixed the Oracle Connector issue for us; the entry must be added to the dsenv file, and recycling ASBNode and DataStage is also required. Here are the directives needed in the dsenv file to use the Oracle Connector (the example is from our system: AIX 6.1, DataStage 8.5 connecting to Oracle 11g Enterprise); see the sketch below.
We also added the following:
TNS_ADMIN=/opt/oracle/product/11.1.0/client_1/network/admin; export TNS_ADMIN
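As a sketch, the full set of dsenv directives described above usually looks like the following, here using the Oracle client path from the question (your ORACLE_HOME and paths will differ):
ORACLE_HOME=/oracle/product/11.2.0-64; export ORACLE_HOME
PATH=$ORACLE_HOME/bin:$PATH; export PATH
LIBPATH=$ORACLE_HOME/lib:$LIBPATH; export LIBPATH
TNS_ADMIN=$ORACLE_HOME/network/admin; export TNS_ADMIN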

SQL DATABASE(postgresql)

ERROR: could not open file "C:\Users\lenovo\Downloads\Owners.csv" for reading: Permission denied
HINT: COPY FROM instructs the PostgreSQL server process to read a file. You may want a client-side facility such as psql's \copy.
SQL state: 42501
I am trying to import a CSV file into PostgreSQL, but this error pops up. I have searched everywhere but couldn't find the answer. Please help me.
Thanks in advance!
COPY mytable FROM '/path/thefile.csv' WITH (FORMAT csv, HEADER); is executed by the DBMS server, so the .csv file is read by the server. The server (typically) runs as the user postgres, which cannot access arbitrary users' files. (Also, the client and server don't have to be running on the same machine.) There are two possible solutions to this:
copy the .csv file to a place where the server can access it, e.g. /tmp/ or somewhere under its home directory.
use psql's \copy mytable(col1,col2,...) FROM '/path/file.csv'..., which reads the file on the client (slightly different syntax; see the sketch below).
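A minimal sketch of both options, assuming a target table named owners whose columns match the CSV (the table name is only an example):
-- server-side: the file must be readable by the PostgreSQL server process
COPY owners FROM '/tmp/Owners.csv' WITH (FORMAT csv, HEADER);
-- client-side: psql reads the file on the client and streams it to the server
\copy owners FROM 'C:\Users\lenovo\Downloads\Owners.csv' WITH (FORMAT csv, HEADER)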

Export ixf in db2

EXPORT TO myFile.ixf OF ixf SELECT * FROM TABLE_NAME WHERE SSN='DATA' AND EMPLOYER_ID=DATA AND CREATED_TS='DATA'
I am using this statement to export a couple of rows; for privacy purposes, DATA has been inserted where necessary. However, the following error is produced. I have followed IBM's guide on export and feel this should be correct, but I am unsure exactly what is wrong. The error log is as follows:
Error: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=myFile;EXPORT TO ;JOIN, DRIVER=3.53.70
SQLState: 42601
ErrorCode: -104
As already remarked, you cannot directly run Db2 commands (such as import, export, load, etc.) from plain SQL, as you are trying to do via JDBC.
Instead, if your Db2-server runs on Linux/Unix/Windows, you can either use a stored procedure, or (for any Db2-server operating system) you can use the command-line.
However, when you use stored-procedure SYSPROC.ADMIN_CMD for Db2-LUW, all file-names in stored-procedure parameters are relative to the Db2-server (and not your remote jdbc-client, if you are running remotely).
That means after a successful export via stored-procedure, if you really need the exported IXF file to be on your workstation then you must do file-transfer to your workstation using whatever tools you have for that purpose.
For example, this shows an export on Unix to an IXF file in /tmp on the Db2-server:
call sysproc.admin_cmd('EXPORT TO /tmp/myFile.ixf OF ixf SELECT * FROM user1.stk1 with ur') ;
If you don't want to use a stored procedure, you must use the command-line shell (for example, on Windows use db2ntcmd.bat, or on Unix use bash or ksh), connect to the database in the shell, and perform the export. This requires the workstation to have a Db2-client and also that the relevant database and node be first catalogued.
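For example, a rough sketch of that command-line approach from a workstation (node name, hostname, port, database name, and user below are placeholders):
db2 catalog tcpip node mynode remote db2server.example.com server 50000
db2 catalog database mydb at node mynode
db2 connect to mydb user myuser
db2 "EXPORT TO myFile.ixf OF ixf SELECT * FROM TABLE_NAME WHERE SSN='DATA'"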
If you specify your Db2-version and the operating-system on which your Db2-server runs, then you will get more details.

Can db2 import or load be used to populate DashDB?

I'm looking to bulk load millions of rows into a DashDB database. After connecting using the DB2 CLI, I enter a command like:
db2 import from rowsToImport.csv of del insert into MY_TABLE
with results:
SQL0551N "DASHXXX" does not have the required authorization or privilege to
perform operation "BIND" on object "NULLID.SQLUAJ19". SQLSTATE=42501
Is this an inherent limitation with DashDB, or is something configured incorrectly on my client? I get a similar message when trying db2 load:
SQL2019N An error occurred while utilities were being bound to the database.
P.S. I'm aware of the REST client API for DashDB for loading data; I'm asking specifically how/whether bulk loads can be done with the DB2 command line as an alternative option.
As per the dashDB documentation, you can use the Command Line Processor Plus (CLPPlus). It is included in the dashDB driver package and provides a command-line user interface that you can use to connect to the dashDB database, BLUDB. You can use CLPPlus to define, edit, and run statements, scripts, and commands. Please also take a look at Connecting CLPPlus to the dashDB database to see how to connect and use the CLI.
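For example, a CLPPlus connection typically looks like the following (user, hostname, and port are placeholders; BLUDB is the dashDB database name mentioned above):
clpplus myuser@dashdb.example.com:50000/BLUDB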
Please note that in CLPPlus the IMPORT, EXPORT and LOAD commands have a restriction that processed files must be on the server (see here). So you would have to copy the input load file onto the remote server first with SCP; however, the SSH/SCP protocol is typically blocked (not accessible) for a normal dashDB user.
Only geospatial data can be loaded from your local machine to dashDB, using IDA LOADGEOSPATIALDATA command in CLPPlus.
The file to be loaded in dashDB using the above command can be in the local file system, accessible to the CLPPlus user.
Alternative ways to do that are:
dashDB REST API (as you already mentioned). See Load delimited data using the REST API and cURL.
load the csv directly from the dashDB dashboard on Bluemix. See Loading data from the desktop into IBM dashDB.
load the csv using IBM Data Studio. See dashDB large file load using IBM Data Studio.
According to this technote, the package NULLID.SQLUAJ19 belongs to one of the early DB2 10.1 fix packs, so I suspect your client version is 10.1. When attempting to execute the IMPORT command, the client needs to bind some packages of that older version, since dashDB is obviously DB2 10.5.
You may want to try installing the latest DB2 client fix pack, as the necessary packages may be already bound in the database.
To verify that, you could run:
select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%'
You should see "SQLUAK20", which seems to be the corresponding package in DB2 10.5.
If that doesn't work, your other option might be to move to a dedicated dashDB instance, as you won't have sufficient privileges to bind missing packages in the entry-level shared dashDB service.

Replace Existing File with Temp File:I/O Error

I have an Access 2007 database from which I call a Windows batch file, via a ribbon menu, to retrieve files from an external server. When executing the batch file manually, everything works just fine. When executing it via the Access ribbon menu, the following error appears within the command line:
Opening data connection for ...
> Replace Existing File with Temp File:I/O Error
Binary transfer complete.
I've read something about this error in relation to (admin) rights, but since the batch file actually runs when called by Access, that does not seem to be the issue.