Get the calling stored procedure's name

IBM DB2 LUW. I have stored procedures calling stored procedures (nesting). Is it possible in a nested stored procedure (called) to get the name of its parent stored procedure (caller) at run time?

This is not possible, at least not without extra programming. If you really need to know the caller's name, you could pass it to the callee as an extra parameter.
In recent versions of DB2 there is a special global variable, ROUTINE_SPECIFIC_NAME, that contains the specific name of the currently executing routine, so in the caller routine you might call the other SP like so: CALL SP2(ROUTINE_SPECIFIC_NAME, <other parameters>...).
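A minimal sketch of that pattern, with hypothetical procedure names SP1 and SP2 (and assuming the statement terminator has been set to @):
CREATE OR REPLACE PROCEDURE SP2 (IN CALLER_NAME VARCHAR(128), IN P1 INTEGER)
BEGIN
  DECLARE MSG VARCHAR(200);
  -- CALLER_NAME holds the specific name of whichever routine called SP2
  SET MSG = 'Called by ' || CALLER_NAME;
END @
CREATE OR REPLACE PROCEDURE SP1 ()
BEGIN
  -- Inside SP1, ROUTINE_SPECIFIC_NAME resolves to SP1's own specific name,
  -- so passing it along tells SP2 who its caller is
  CALL SP2(ROUTINE_SPECIFIC_NAME, 42);
END @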

Related

call program in interactive sql as400

Is there a way to call a program from DB2 interactive SQL on the AS/400 (STRSQL)? This program receives an argument by reference and modifies its content. In CL, you simply call it like this:
call myprogram 12345
I need to be able to call it in interactive SQL. Is there any way or workaround to do this, such as launching an OS command? For example, in C you can do system("your system command"). I couldn't find anything related to it.
STRSQL supports the SQL CALL statement.
The best option is to define the program as an external SQL stored procedure:
-- note:
--   numeric --> zoned decimal
--   decimal --> packed decimal
CREATE PROCEDURE MYLIB.MYPROGRAM_SP
(IN number numeric(5,0))
LANGUAGE RPGLE
EXTERNAL NAME 'MYLIB/MYPROGRAM'
PARAMETER STYLE GENERAL;
Then you can
CALL MYLIB.MYPROGRAM_SP(12345)
Technically, every *PGM object on the IBM i is a stored procedure. You can call it without explicitly defining it as shown above. But assumptions are made about the parms in that case. It's much better to provide the DB with the interface definition.
Note that STRSQL is a 20-year-old tool; it has various limitations, including not supporting OUT or INOUT parameters of stored procedures.
A much better choice is to use the Run SQL Scripts component of IBM's Access Client Solutions (ACS).
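Since the original program modifies the parameter it receives, the definition could also declare the parameter as INOUT instead of IN. A sketch only, reusing the hypothetical MYLIB/MYPROGRAM names from above with a new procedure name to avoid clashing with the earlier definition; a caller that supports INOUT parameters, such as Run SQL Scripts, would then see the value the program writes back:
CREATE PROCEDURE MYLIB.MYPROGRAM_SP2
(INOUT number numeric(5,0))
LANGUAGE RPGLE
EXTERNAL NAME 'MYLIB/MYPROGRAM'
PARAMETER STYLE GENERAL;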

Can I make a trigger in Caché for a mapped global?

I need to create a trigger function that would be called whenever I insert or delete data in my table.
Internally, Caché keeps the data in a global.
Going the other way, I can add data directly to the global and view it in the table.
The trigger function works fine when I insert data using an SQL statement (INSERT INTO).
But it is not called when I add data directly to the global.
So how can I make triggers fire when I add data to the global directly, instead of adding it with a query (INSERT INTO table)?
If you use the class to add data to the global, then you can use the callback methods. For example, %OnAfterSave does what you want.
On the other hand, if you put data directly into the global, then you will need some way to track when data is added. You can do this by writing your own agent or by doing what is advised in this post: How can I make a copy of a global automatically in my local system?
(this is the link referenced in that answer) http://docs.intersystems.com/cache20141/csp/docbook/DocBook.UI.Page.cls?KEY=GCDI_journal#GCDI_journal_util_ZJRNFILT

SSIS Access DB data into TSQL with a stored procedure

I want to use a Data Flow Task that has two sources: one from an Access DB and one from a SQL Server DB. I then want to manipulate the data and finally call a stored procedure on the same SQL Server that was my second source.
So I can't use an Execute SQL Task since I want to manipulate the data and call the stored procedure at the end.
Which toolbox component should I use, and what format should the stored procedure call take?
I tried to use an OLE DB Destination with a stored procedure called something like this:
Exec sppUpdateAATable1 ?,?
SSIS uses the concept of a pipeline to organise the Data Flow task. Data flows from source to destination, and processing of this data happens in between. Since you want to use the results of processing as parameters to your stored procedure, it cannot be done under the pipeline concept; the SP is not really a destination for your data, as the SP will do something with it too.
You can populate an in-memory recordset (Recordset Destination) and use a ForEach Loop container to execute your SP for each row of the recordset.
Update
Your package should look something like this:
Data Flow task:
OLE DB connection to Access
OLE DB connection to SQL Server
To combine the two data streams, use a Union All task.
Recordset Destination; in its properties you name a variable of type Object (MyRecordsetVar). It will hold the recordset data.
ForEach Loop Container. In its properties, select the type of enumerator - ADO Recordset - and specify the MyRecordsetVar variable in the loop properties.
Assign two more variables (or as many as needed) to hold the data from each column of the recordset. Data from each row of the recordset will be passed to these variables, one row at a time.
Inside the loop, put an Execute SQL task. In the Parameter Mapping page of the task, specify your input variables - those that hold the data from the columns of the recordset. I would assume that you know how to do it.
Put your query into the task as execute sp_MyProc ?,?.
This should be it.
You can save yourself the trouble of the recordset destination and foreach loop route and instead use an OLE DB Command in your Data Flow. It's going to fire that stored proc for each row that flows through it.
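Whichever route is taken - Execute SQL task in a ForEach loop, or an OLE DB Command in the Data Flow - the procedure itself just needs one parameter per ? placeholder. A minimal T-SQL sketch of what sppUpdateAATable1 might look like (the table, column and parameter types here are assumptions, not taken from the question):
CREATE PROCEDURE dbo.sppUpdateAATable1
    @Param1 INT,
    @Param2 NVARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;
    -- Illustrative body only: apply the two incoming values to a hypothetical table
    UPDATE dbo.AATable1
    SET SomeColumn = @Param2
    WHERE Id = @Param1;
END;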

How do you specify a local database instance in TSQL with the USE keyword?

I have several database names which exist on local, dev and live servers.
I want to ensure a potentially dangerous T-SQL script will always use the local db and not any other db by accident.
I can't seem to use the [USE] keyword with the local instance name followed by the db name.
It seems pretty trivial but I can't seem to get it to work.
I've tried this but no luck:
USE [MYMACHINE/SQLEXPRESS].[DBNAME]
The instance is going to be determined through your connection/connection string. You connect to a specific instance and then all subsequent T-SQL will be executed against that instance and that instance alone.
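If the concern is simply that a dangerous script might be run against the wrong connection by accident, one defensive option is to check @@SERVERNAME at the top of the script and stop when it does not match the local instance. A sketch, assuming the local instance is named MYMACHINE\SQLEXPRESS (the name from the question):
IF @@SERVERNAME <> N'MYMACHINE\SQLEXPRESS'
BEGIN
    RAISERROR(N'Not connected to the local instance; aborting.', 16, 1);
    -- Compile but do not execute anything that follows in this session
    SET NOEXEC ON;
END
-- ... dangerous statements here ...
-- SET NOEXEC OFF;  -- re-enable execution at the end of the script if needed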
The current answer is not correct for the question asked, as you can specify a specific LocalDB file via the USE command in T-SQL. You just have to specify the fully qualified path name, which is also what you will see in the dropdown for the database list.
USE [C:\MyPath\MyData.mdf]
GO

Eclipse BIRT and Oracle: Need to set role before running report

Is it possible to set a database role before running a report? I have a number of databases each containing a number of schemas with the same set of tables, where each schema has a number of roles to control read, write, data management and so on. None of these are default roles.
In SQL*Plus or TOAD I can do SET ROLE before running a select statement. I would like to do the same in BIRT.
It may be possible to do this using the afterOpen event for the ODA Data Source, but I have not found any examples on how to get and use the native connection in JavaScript.
I am not allowed to add or change anything on the server end.
You can make an additional call to the database in the afterOpen event of the Data Source. You can use JavaScript or a Java event handler to execute the SET ROLE statement, or to call a stored procedure that will execute it for you. This happens after the initial DB connection is made, but before the Data Set query runs. It will be a little tricky to use the data source connection to make that call, however, and I don't have the code right now to provide as an example.
Another way is to create a stored proc Data Set that will execute the desired command, and have that execute first. Drag and drop the Data Set into the report design and make it invisible. It will run before any other queries. Not the cleanest solution, but easy to do.
You can write a logon trigger and do a SET ROLE in this trigger (PL/SQL: DBMS_SESSION.SET_ROLE). You can determine the username, OS user, program and machine of the user who wants to log in.
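A hedged sketch of what such a logon trigger could look like (the trigger, user and role names here are placeholders, not taken from the question):
CREATE OR REPLACE TRIGGER set_report_role_trg
AFTER LOGON ON DATABASE
BEGIN
  -- Only touch sessions opened by the (hypothetical) reporting user
  IF SYS_CONTEXT('USERENV', 'SESSION_USER') = 'BIRT_REPORT_USER' THEN
    DBMS_SESSION.SET_ROLE('REPORT_READ_ROLE');
  END IF;
END;
/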
The approach of using a stored procedure to set the role won't work - at least not on Apache Derby. Reason: the lifetime of the SET ROLE is limited to the execution of the procedure itself; after returning from the procedure, the role will be the same as before the procedure was called, i.e. when the report executes it is as if no role had ever been set.