Write from SAS Table to DB2 Temp Table - db2

I have a local table in SAS that I am trying to create as a temporary table on a remote DB2 server. Is there any way to do this other than building an insert statement elsewhere and streaming it?
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
Proc SQL;
Connect to db2 (user=blagh pw=blagh dsn=blagh connection=global);
Execute (
Declare Global Temporary Table Session.Test
( foo char(10))
On Commit Preserve Rows
Not Logged
) by db2;
Execute (Commit) by db2;
Insert Into Session.Test
Select Distinct A.foo From Work.fooSource A;
I have tried several variations on this theme, each resulting in errors. The above code produces:
ERROR: Column foo could not be found in the table/view identified with the correlation name A.
ERROR: Unresolved reference to table/correlation name A.
Removing the alias gives me:
ERROR: INSERT statement does not permit correlation with the table being inserted into.

A pass-through statement like below should work.
proc sql;
connect to db2 (user=blagh pw=blagh dsn=blagh connection=global);
execute (
    create view sasdemo.tableA as
    select VarA, VarB, VarC
    from sasdemo.orders
) by db2;
execute (
    grant select on sasdemo.tableA to testuser
) by db2;
disconnect from db2;
quit;
The code below is what I routinely use to upload to DB2
rsubmit YourServer;
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
data temp.Uploaded_table(bulkload = yes bl_method = cliload);
set work.SAS_Local_table;
run;
endrsubmit;
libname temp remote server=YourServer;
More options for DB2 are available from SAS support... http://support.sas.com/documentation/onlinedoc/91pdf/sasdoc_913/access_dbspc_9420.pdf
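If the bulk loader is not available on your client, a plain data step through the same libname can also work. The sketch below reuses the names from the example above and adds the INSERTBUFF= and DBCOMMIT= data set options; check the linked SAS/ACCESS documentation to confirm they apply to your engine and version.
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
/* non-bulkload load: INSERTBUFF= batches rows per insert call,
   DBCOMMIT= controls how many rows go between commits */
data temp.Uploaded_table (insertbuff=1000 dbcommit=10000);
    set work.SAS_Local_table;
run;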

I don't know DB2, so I don't know for sure this works, but the 'normal' way to do this in SAS is with PROC COPY (although a data step should also work). My guess is that in your code above DB2 doesn't allow inserts written that way (it's fairly common for that not to be supported across SQL flavors).
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
proc copy in=work out=temp;
select foosource; /* member name only - the IN= library supplies the libref */
run;
If you need the name to be different (in PROC COPY it won't be), you can do a simple data step.
data temp.yourname;
set work.foosource;
run;
You shouldn't need to do the inserts in SQL. If you want to declare the table in DB2 first (in a connect to ... session), you probably can do that and still use either of these options (though again this varies somewhat by RDBMS, so test it).
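Putting the two ideas together for the original temp-table question, one sketch that should work (untested here, and dependent on your DB2 client setup) is to declare the global temporary table through pass-through on a global connection and then load it through the libname that points at the SESSION schema:
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
proc sql;
    connect to db2 (user=blagh pw=blagh dsn=blagh connection=global);
    /* declare the temp table on the shared (global) connection */
    execute (
        declare global temporary table Session.Test
        ( foo char(10) )
        on commit preserve rows
        not logged
    ) by db2;
quit;
/* load it through the libname engine, which reuses the same global connection */
data temp.Test;
    set work.foosource;
run;
Because connection=global appears on both the libname and the connect statement, the pass-through and the data step should share one DB2 connection, which is what makes the declared temporary table visible to the load step.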

Related

How to join tables in two Firebird databases?

Currently I'm working on a simple library project using Embarcadero C++Builder 10.3 Community Edition, and Firebird and FlameRobin to create databases.
So far I have only needed simple queries connected to a single database. I used TFDConnection and TFDPhysFbDriverLink to connect to a .fdb file, then TFDQuery to create the SQL commands and a TDataSource. It works great.
Unfortunately, now I must join two tables. How do I write this command? I tried this:
SELECT * FROM users_books
join books on
users_books.id_book = books.id
where users_books and books are databases.
I got an error:
SQL error code = -204
Table unknown
BOOKS.
So I think I must somehow connect to these two databases simultaneously. How do I do that?
Firebird databases are isolated and don't know about other databases. As a result, it is not possible to join tables across databases with a normal select statement.
What you can do is use PSQL (Procedural SQL), for example in an EXECUTE BLOCK. You can then use FOR EXECUTE STATEMENT ... ON EXTERNAL to loop over the table in the other database, and then 'manually' join the local table using FOR SELECT (or vice versa).
For example (assuming a table user_books in the remote database, and a table books in the current database):
execute block
  returns (book_id integer, book_title varchar(100), username varchar(50))
as
begin
  for execute statement 'select book_id, username from user_books'
      on external 'users_books' /* may need AS USER and PASSWORD clause as well */
      into book_id, username do
  begin
    for select book_title from books where id = :book_id
        into book_title do
    begin
      suspend;
    end
  end
end

How to use a subquery as a database name in a DDL command?

I am wondering if it's possible to use the result of a subquery as the database name in a PostgreSQL (9.5.1) DDL statement.
For example, I wanted to alter the current database with something like:
ALTER DATABASE (SELECT current_database()) SET a_var TO 'a_value';
If I run this, an error occurs:
ERROR: syntax error at or near "("
LINE 1: ALTER DATABASE (SELECT current_database()) SET ...
What's the correct way to use the sub-query (if possible)?
You need dynamic SQL for that:
DO
$do$
BEGIN
EXECUTE format($f$ALTER DATABASE %I SET x.a_var TO 'a_value'$f$, current_database());
END
$do$;
format() is used to escape the database name safely while we're at it.
BTW, to unset:
ALTER DATABASE your_db RESET x.a_var;
To see the current setting:
SELECT current_setting('x.a_var');
(The DB default is not active before you start a new session.)
Related:
Table name as a PostgreSQL function parameter
Error when setting n_distinct using a plpgsql variable

Using SAS to insert records into DB2 database

To give some background, I am using
- Base SAS on the mainframe (executed by JCL) and
- DB2 as the database.
I have the list of keys for reading the DB in a mainframe dataset. I understand that we can join a SAS dataset with a DB2 table to read it as follows:
%LET DSN=DSN;
%LET QLF=QUALIFIER;
PROC SQL;
CONNECT TO DB2(SSID=&DSN);
CREATE TABLE STAFFTBL AS
(SELECT * FROM SASDSET FLE,
CONNECTION TO DB2
(SELECT COL1, COL2, COL3
FROM &QLF..TABLE_NAME)
AS DB2 (COL1, COL2, COL3)
WHERE DB2.COL1 = FLE.COL1);
DISCONNECT FROM DB2;
%PUT &SQLXMSG;
QUIT;
Can someone suggest how to proceed if I have a mainframe dataset with the list of values to be inserted?
We can read the mainframe dataset and get the values into a SAS dataset, but I cannot work out how to use that SAS dataset to insert the values into DB2.
I know we can do it using COBOL, but I am willing to learn if it is possible using SAS.
Thanks!
Solution:
You have to assign a library to write to the DB. Please refer to the SAS manual here.
Your query above creates a local SAS dataset in the Work library, or wherever your default library is declared. This table is not connected to your backend DB2 database; it is simply a copy imported into SAS.
Consider establishing a live connection using an ODBC SAS library. If not ODBC, use the DB2 interface SAS has installed. Once connected, all tables in the specified database appear as SAS datasets in that library, and these are not imported copies but live tables. Then run a proc sql insert or use proc append to insert records into the table from SAS.
Below are generic examples, with and without a DSN, which you can modify according to your credentials and database driver type.
* WITH DSN;
libname DBdata odbc datasrc="DSN Name" user="username" password="password";
* WITH DRIVER (NON-DSN) - CHECK DRIVER INSTALLATION;
libname DBdata odbc complete="driver=DB2 Driver; Server=servername;
user=username; pwd=password; database=databasename;";
Append procedures:
* WITH SQL;
proc sql;
INSERT INTO DBdata.tableName (col1, col2, col3)
SELECT col1, col2, col3 FROM SASDATASET;
quit;
* WITH APPEND (ASSUMING COLUMNS MATCH TOGETHER);
proc datasets;
append base = DBdata.tableName
data = SASDATASET
force;
quit;
NOTE: Be very careful not to unintentionally add, modify, or delete any table in the SAS ODBC library; these datasets are live tables, so such changes will be reflected in the backend DB2 database. When finished with your work, do not delete the library (or all tables will be cleaned out); simply unassign it from the environment:
libname DBdata clear;
Provided that you have the necessary write access, you should be able to do this via a proc sql insert into statement. Alternatively, if you can access the DB2 table via a library, it may be possible to use a data step with both a modify and an output / replace statement.
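As a rough sketch of the proc sql route (db2lib, table_name, the columns, and the work dataset are placeholders; it assumes a DB2 libname has already been assigned with write access):
proc sql;
    insert into db2lib.table_name (col1, col2, col3)
    select col1, col2, col3
    from work.keys_to_insert;
quit;
/* alternative, assuming the column names and types line up */
proc append base=db2lib.table_name data=work.keys_to_insert force;
run;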

Stored procedure get column information does not return anything?

I am using Entity Framework with a stored procedure in which I generate a query dynamically and execute it. The stored procedure looks like:
Begin
    DECLARE @Query nvarchar(MAX)
    SET @Query = 'SELECT e.id, e.name, e.add, e.phno from employee e'
    EXEC sp_executesql @Query
End
In the SQL code above you can see that I am executing the '@Query' variable, and that variable's value can be changed dynamically.
I am able to add my stored proc to my edmx file. Then I go to the Model Browser, choose Add Function Import, and try Get Column Information, but it does not show anything. However, when I execute my stored proc on the server it returns all columns with values. Why am I not getting column information in the Model Browser?
The model browser isn't running the stored procedure to then gather the column information from its result - it's trying to grab the column information from the underlying procedure definition using the sys tables.
This procedure, because it's dynamic, will not have an underlying definition and therefore won't be importable into the EDMX like this.
Temporarily change your stored proc to
SELECT TOP 1 e.id, e.name, e.add, e.phno from employee e /* ... rest of your proc commented out */
Then add it to the EF and create the function import (it will see the columns).
Then change the proc back to how you had it above.
Try adding SET NOCOUNT ON after BEGIN... that suppresses messages that might cause it to be "confused".
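For example (same dynamic-query pattern as above, with the column list shortened to a placeholder):
Begin
    SET NOCOUNT ON;   -- suppresses the extra "rows affected" messages

    DECLARE @Query nvarchar(MAX)
    SET @Query = 'SELECT e.id, e.name FROM employee e'
    EXEC sp_executesql @Query
End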

DB2 compound statement using ADO.NET

I want to execute multiple statements from my data access layer using C# and IBM's DB2 data provider.
(Environment: DB2 on AS/400, OS version V5R4.)
For example, in T-SQL:
declare @varA integer;
select @varA = count(*) from tableA;
select * from tableB where col1 <= @varA;
With SQL Server, I can concatenate those three statements into a string
and assign the text to DbCommand.CommandText.
How do I execute multiple statements (a compound statement) against a DB2 database via DbCommand (using the IBM DB2 data provider)?
I tried using a BEGIN ... END block but it still failed:
BEGIN
statement1;
statement2;
statement3;
END
Thank you
I do not think it's possible.
I tried something similar some time ago, and the only solution I found was to dynamically create a stored procedure, call it, and finally delete it.
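To illustrate that workaround with the example from the question (the library and procedure names are placeholders, and the three steps would be sent as separate commands from ADO.NET):
-- 1) create a throwaway SQL procedure that wraps the compound logic
CREATE PROCEDURE MYLIB.TMP_COMPOUND ()
    RESULT SETS 1
    LANGUAGE SQL
BEGIN
    DECLARE varA INTEGER;
    DECLARE c1 CURSOR WITH RETURN FOR
        SELECT * FROM tableB WHERE col1 <= varA;
    SELECT COUNT(*) INTO varA FROM tableA;
    OPEN c1;
END;
-- 2) call it; the open cursor comes back to the client as a result set
CALL MYLIB.TMP_COMPOUND();
-- 3) clean up
DROP PROCEDURE MYLIB.TMP_COMPOUND;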