I have a shared library that contains implementations (written in C) of some utility DB2 procedures and functions. Now I want to call these utility functions from the db2 command line as:
select epiProcLibVer() from sysibm.sysdummy1
To install these utility functions into the DB2 database server, I have placed the shared library (libDB2CLIWrapper.so) into <DB2InstallFolder>/sqllib/function.
I have restarted the DB2 instance so that it picks up and loads this library into the database server's shared memory.
I am still getting the error below at the db2 command prompt:
db2 => select epiproclibver() from sysibm.sysdummy1
SQL0440N No authorized routine named "EPIPROCLIBVER" of type "FUNCTION"
having compatible arguments was found. SQLSTATE=42884
Now I would like to know: is this the correct procedure for getting a shared library loaded into the DB2 database server, and how do I access the functions in this shared library from an SQL query?
This may be a starting point:
CREATE FUNCTION epiproclibver () RETURNS INT
EXTERNAL NAME 'libDB2CLIWrapper!epiProcLibVer'
NOT FENCED
SCRATCHPAD
VARIANT
NO EXTERNAL ACTION
LANGUAGE C PARAMETER STYLE DB2SQL NO SQL;
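Once that CREATE FUNCTION statement succeeds (assuming the entry point epiProcLibVer in libDB2CLIWrapper.so matches the declared signature and return type), the original query should resolve the routine:
select epiproclibver() from sysibm.sysdummy1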
On DB2 AIX I can use the SYSPROC.ADMIN_REVALIDATE_DB_OBJECTS stored procedure to revalidate all Stored Procedures and Functions defined in my schema.
How can I do the same thing on DB2 z/OS (v.12)?
Thanks
REGENERATE automatically rebinds, at the local server, the package for the SQL control statements for the procedure and rebinds the package for the SQL statements that are included in the procedure body. If a remote bind is also needed, the BIND PACKAGE COPY command must be explicitly done for all of the remote servers.
ALTER PROCEDURE SCHEMA.NAME_SP REGENERATE ACTIVE VERSION;
For the moment I have not found anything that automatically revalidates/regenerates all stored procedures/UDFs of a schema.
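As a partial workaround, here is a minimal sketch that generates one REGENERATE statement per native SQL procedure in a schema, assuming the SCHEMA, NAME and ROUTINETYPE columns of SYSIBM.SYSROUTINES (MYSCHEMA is a placeholder); you would then run the generated statements:
SELECT 'ALTER PROCEDURE "' || RTRIM(SCHEMA) || '"."' || RTRIM(NAME) || '" REGENERATE ACTIVE VERSION;'
FROM SYSIBM.SYSROUTINES
WHERE SCHEMA = 'MYSCHEMA'
AND ROUTINETYPE = 'P';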
EXPORT TO myFile.ixf OF ixf SELECT * FROM TABLE_NAME WHERE SSN='DATA' AND EMPLOYER_ID=DATA AND CREATED_TS='DATA'
I am using this statement to export a couple of rows. For privacy purposes, DATA has been inserted where necessary. However, the following error is produced. I have followed IBM's guide on export and feel like this should be correct, but I am unsure exactly what is wrong. The error log is as follows:
Error: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=myFile;EXPORT TO ;JOIN, DRIVER=3.53.70
SQLState: 42601
ErrorCode: -104
As already remarked, you cannot directly run Db2 commands (such as import, export, load, etc.) from plain SQL, as you are trying to do via JDBC.
Instead, if your Db2-server runs on Linux/Unix/Windows, you can either use a stored procedure, or (for any Db2-server operating system) you can use the command-line.
However, when you use stored-procedure SYSPROC.ADMIN_CMD for Db2-LUW, all file-names in stored-procedure parameters are relative to the Db2-server (and not your remote jdbc-client, if you are running remotely).
That means after a successful export via stored-procedure, if you really need the exported IXF file to be on your workstation then you must do file-transfer to your workstation using whatever tools you have for that purpose.
For example, this shows an export on Unix to an IXF file in /tmp on the Db2-server:
call sysproc.admin_cmd('EXPORT TO /tmp/myFile.ixf OF ixf SELECT * FROM user1.stk1 with ur') ;
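If the exported file then needs to end up on your workstation, the transfer is a separate step; a minimal sketch assuming SSH access to the Db2 server (hostname and paths are placeholders):
scp db2inst1@db2host.example.com:/tmp/myFile.ixf ./myFile.ixf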
If you don't want to use a stored procedure, you must use the command-line shell (for example, on Windows use db2ntcmd.bat, or on Unix use bash or ksh), connect to the database in the shell, and perform the export. This requires the workstation to have a Db2-client and also that the relevant database and node be first catalogued.
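A minimal sketch of that command-line approach from a Db2-client shell (the hostname, port, database name, credentials and output path below are placeholders):
db2 catalog tcpip node mynode remote db2host.example.com server 50000
db2 catalog database mydb at node mynode
db2 connect to mydb user myuser using mypassword
db2 "EXPORT TO /local/path/myFile.ixf OF ixf SELECT * FROM user1.stk1 WITH UR"
db2 terminate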
If you specify your Db2-version and the operating-system on which your Db2-server runs, then you will get more details.
I am currently trying to use a dashDB database with the db2cli utility and ODBC (values are from Connect/Connection Information on the dashDB web console). At this moment I can perfectly do SELECT or INSERT statements and fetch data from custom tables which I have created, thanks to the command:
db2cli execsql -connstring "DRIVER={IBM DB2 ODBC DRIVER - IBMDBCL1}; DATABASE=BLUDB; HOSTNAME=yp-dashdb-small-01-lon02.services.eu-gb.bluemix.net; PORT=50000; PROTOCOL=TCPIP; UID=xxxxxx; PWD=xxxxxx" -inputsql /tmp/input.sql
Now I am trying to do a DB2 LOAD operation through the db2cli utility, but I don't know how to proceed or even if it is possible to do so.
The aim is to import data from a file without cataloging the DB2 dashDB database on my side, but only through ODBC. Does someone know if this kind of operation is possible (with db2cli or another utility)?
The latest API version referenced from the Db2 on Cloud (formerly dashDB) dashboard is available here. It requires you first to call the /auth/tokens endpoint to generate an auth token, based on your Bluemix credentials, which is then used to authorize the API calls.
I've recently published an npm module, db2-rest-client, to simplify the use of these operations. For example, to load data from a .csv file you can use the following commands:
# install the module globally
npm i db2-rest-client -g
# call the load job
export DB_USERID='<USERID>'
export DB_PASSWORD='<PASSWORD>'
export DB_URI='https://<HOSTNAME>/dbapi/v3'
export DEBUG=db2-rest-client:cli
db2-rest-client load --file=mydata.csv --table='MY_TABLE' --schema='MY_SCHEMA'
For the load job, a test on Bluemix dedicated with a 70MB source file and about 4 million rows took about 4 minutes to load. There are also other CLI options, such as executing an export statement, running comma-separated statements, and uploading files.
This is not possible. LOAD is not an SQL statement; therefore it cannot be executed via an SQL interface such as ODBC, only using the DB2 CLP, which in turn requires a cataloged database.
ADMIN_CMD() can be invoked via an SQL interface, however, it requires that the input file be on the server -- it won't work with a file stored on your workstation.
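For completeness, a server-side load through ADMIN_CMD() would look roughly like the sketch below; the path and table name are placeholders, and the .del file must already exist on the Db2 server:
call sysproc.admin_cmd('LOAD FROM /tmp/mydata.del OF del INSERT INTO MY_SCHEMA.MY_TABLE')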
If JDBC is an option, you could use the CLPPlus IMPORT command.
You can try loading data using REST API.
Example:
curl --user dashXXX:XXXXXX -H "Content-Type: multipart/form-data" -X POST -F loadFile1=#"/home/yogesh/Downloads/datasets/order_details_0.csv" "https://yp-dashdb-small-01-lon02.services.eu-gb.bluemix.net:8443/dashdb-api/load/local/del/dashXXX.ORDER_DETAILS?hasHeaderRow=true&timestampFormat=YYYY-MM-DD%20HH:MM:SS.U"
I have used the REST API and have not seen any size limitations. In version 1.11 of dashDB Local (the warehouse db), external tables have been included. As long as the file is on the container, it can be loaded. Also, DB2 LOAD locks the table until the load is finished, whereas an external table load won't.
There are a number of ways to get data into Db2 Warehouse on Cloud. From a command line you can use the Lift CLI https://lift.ng.bluemix.net/ which provides the best performance for large data sets.
You can also use EXTERNAL TABLEs https://www.ibm.com/support/knowledgecenter/ean/SS6NHC/com.ibm.swg.im.dashdb.sql.ref.doc/doc/r_create_ext_table.html which are also high performance and have lots of options.
This is a quick example using a local file (not on the server), hence the REMOTESOURCE YES option:
db2 "create table foo(i int)"
echo "1" > /tmp/foo.csv
db2 "insert into foo select * from external '/tmp/foo.csv' using (REMOTESOURCE YES)"
db2 "select * from foo"
I
-----------
1
1 record(s) selected.
For large files, you can use gzip, either on the fly:
db2 "insert into foo select * from external '/tmp/foo.csv' using (REMOTESOURCE GZIP)"
or from gzip'ed files
gzip /tmp/foo.csv
db2 "insert into foo select * from external '/tmp/foo2.csv.gz' using (REMOTESOURCE YES)"
I am attempting to migrate a legacy client-server GIS application from Microsoft SQL Server to PostgreSQL. In this system the SQL Server database contains a large set of T-SQL stored procedures, each being an SQL Select query with one or more parameters.
In response to a client request, the server program – classic ASP VBScript – uses Microsoft ADODB to fill in the parameters values, execute the requested query and return the result set as an XML document to the client.
In the preliminary phase of the migration I have successfully:
installed PostgreSQL 9.3 for Windows, PgOleDB 1.0.0.20, and psqlODBC on a 32-bit Windows 7 Pro development PC
migrated the SQL Server database table definitions and constraints to a new PostgreSQL database and populated the tables using psqlODBC
using pgAdmin III, created and executed test queries against the PostgreSQL database
using pgAdmin III, created two test PostgreSQL table-valued Select functions, test1() with no parameters and test2(text) with a single IN parameter, and verified that both execute correctly
written a test ASP/VBscript program that uses ADODB.Connection and ADODB.Command to connect to the PostgreSQL database, execute test1() - the no-parameter stored function - and create an ADODB.Recordset. This test works correctly.
However when I change my test ASP/VBscript program to use test2() and use objCmd.Parameters.Append like this:
objCmd.CommandText = "test2"
objCmd.Parameters.Append objCmd.CreateParameter("p1", adVarChar, adParamInput, 15, "000007-01012013")
to specify the required IN parameter, I get the ASP runtime error:
PgOleDB error '80004005': "Procedure name for automatic arguments is not unique"
Q1. What does the diagnostic “Procedure name for automatic arguments is not unique” mean?
Q2. Is there any public documentation available for PgOleDB other than the README.TXT and RELEASENOTES.TXT files installed along with PgOleDB.dll?
I would like to use a shared library I produced from db2 queries.
My shared library depends on Boost, which on my machine is located in /usr/local/lib.
When I try to run queries using my functions I get errors: they don't work because DB2 cannot find the Boost libraries, i.e. it cannot resolve the location of the libraries.
How do I tell DB2 where to locate the library path, and which environment variables should I use?
I tried to use userprofile and profile.env, but without success.
# userprofile
LIBPATH=/usr/local/lib:LIBPATH
#profile.env
DB2ENVLIST='LIBPATH ..other stuf'
Let me see if I understand your question correctly. You have C or C++ external UDFs that depend on some other libraries. If so, I think you should set the DB2 registry variables, not your environment variables:
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH # this must be in the global profile
db2set DB2LIBPATH=$LD_LIBRARY_PATH
db2set DB2ENVLIST="LD_LIBRARY_PATH otherstuff"
The LD_LIBRARY_PATH variable must be set in the environment for the instance owner user and the DB2 fenced user, because the external routines run under one of these two. Probably the best way to do this is to set it in /etc/profile. This should be done before executing the db2set commands.
After setting the registry variables using db2set you must restart the DB2 instance (db2stop force then db2start).
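After the restart you can double-check that the registry picked up the values set above:
db2set DB2LIBPATH
db2set DB2ENVLIST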