Using SAS to insert records into a DB2 database

To give some background, I am using
- Base SAS on the mainframe (executed via JCL) and
- DB2 as the database.
I have the list of keys to read from DB2 in a mainframe dataset. I understand that we can join a SAS dataset with a DB2 table to read data as follows.
%LET DSN=DSN;
%LET QLF=QUALIFIER;
PROC SQL;
  CONNECT TO DB2(SSID=&DSN);
  /* join the local SAS dataset with the DB2 pass-through result */
  CREATE TABLE STAFFTBL AS
  (SELECT * FROM SASDSET FLE,
          CONNECTION TO DB2
          (SELECT COL1, COL2, COL3
             FROM &QLF..TABLE_NAME)
          AS DB2 (COL1, COL2, COL3)
    WHERE DB2.COL1 = FLE.COL1);
  DISCONNECT FROM DB2;
  %PUT &SQLXMSG;
QUIT;
Can someone suggest how to proceed if the list of values to be inserted sits in a mainframe dataset?
We can read the mainframe dataset and get the values into a SAS dataset, but I cannot work out how to use that SAS dataset to insert the values into DB2.
I know we can do it using COBOL, but I would like to learn whether it is possible using SAS.
Thanks!
Solution:
You have to assign a library to write to the database; please refer to the SAS manual.

Your query above creates a local SAS dataset in the Work library (or wherever your default library is declared). That table is not connected to your backend DB2 database; it is simply a copy imported into SAS.
Consider establishing a live connection using an ODBC SAS library or, if not ODBC, the DB2 engine installed with SAS. Once connected, all tables in the specified database appear as datasets in the SAS library, and these are not imported copies but live tables. You can then run a PROC SQL INSERT or use PROC APPEND to insert records into the table from SAS.
Below are generic examples, with DSN and non-DSN variants, which you can modify according to your credentials and database driver type.
* WITH DSN;
libname DBdata odbc datasrc="DSN Name" user="username" password="password";

* WITH DRIVER (NON-DSN) - CHECK DRIVER INSTALLATION;
libname DBdata odbc
    complete="driver=DB2 Driver; Server=servername; user=username; pwd=password; database=databasename;";
Append procedures:
* WITH SQL;
proc sql;
    insert into DBdata.tableName (col1, col2, col3)
    select col1, col2, col3 from SASDATASET;
quit;
* WITH APPEND (ASSUMING COLUMNS MATCH);
proc datasets;
    append base = DBdata.tableName
           data = SASDATASET
           force;
quit;
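The standalone PROC APPEND mentioned above is equivalent (a minimal sketch using the same hypothetical names):
* WITH PROC APPEND;
proc append base = DBdata.tableName
            data = SASDATASET
            force;
run;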
NOTE: Be very careful not to unintentionally add, modify, or delete any table in the SAS ODBC library: these datasets are live tables, so such changes will be reflected in the backend DB2 database. When you are finished, do not delete the library (or all its tables will be dropped); simply unassign it from the environment:
libname DBdata clear;

Provided that you have the necessary write access, you should be able to do this via a PROC SQL INSERT INTO statement. Alternatively, if you can access the DB2 table via a library, it may be possible to use a data step with a MODIFY statement together with OUTPUT / REPLACE statements.
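For illustration, a minimal sketch of that MODIFY route, assuming a libref DB2LIB assigned to the DB2 schema, a transaction dataset WORK.NEWROWS, and a shared key COL1 (all names hypothetical, and whether MODIFY works against a SAS/ACCESS libref depends on the engine):
data db2lib.target_table;
    modify db2lib.target_table work.newrows; /* master + transaction datasets */
    by col1;                                 /* shared key */
    if _iorc_ = %sysrc(_dsenmr) then do;     /* key not found in master... */
        output;                              /* ...so append the new row */
        _error_ = 0;                         /* clear the no-match condition */
    end;
    else replace;                            /* key found: update in place */
run;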

Related

Querying a PostgreSQL database from Snowflake

PostgreSQL offers a way to query a remote database through dblink.
Similarly (sort-of), Exasol provides a way to connect to a remote Postgres database via the following syntax:
CREATE CONNECTION JDBC_PG
TO 'jdbc:postgresql://...'
IDENTIFIED BY '...';
SELECT * FROM (
IMPORT FROM JDBC AT JDBC_PG
STATEMENT 'SELECT * FROM MY_POSTGRES_TABLE;'
)
-- one can even write direct joins such as
SELECT
t.COLUMN,
r.other_column
FROM MY_EXASOL_TABLE t
LEFT JOIN (
IMPORT FROM JDBC AT JDBC_PG
STATEMENT 'SELECT key, other_column FROM MY_POSTGRES_TABLE'
) r ON r.key = t.KEY
This is very convenient to import data from PostgreSQL directly into Exasol without having to use a temporary file (csv, pg_dump...).
Is it possible to achieve the same thing from Snowflake (querying a remote PostgreSQL database from Snowflake with a direct live connection)? I couldn't find any mention of it in the documentation.
Have you looked into using external functions? It's not exactly what you're looking for (Snowflake doesn't have that capability yet), but it can serve as a workaround in some use cases. For instance, you could create a Python function on AWS Lambda that queries PostgreSQL for small amounts of data (due to Lambda limits), or have it trigger a PostgreSQL process that dumps to S3 and fires Snowpipe for the bulk-import use case.
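To sketch the shape of that workaround in Snowflake SQL (the integration name, role ARN, and endpoint URL below are hypothetical placeholders, and the Lambda behind them is assumed to already exist):
-- register the API Gateway endpoint that fronts the Lambda
create or replace api integration pg_lambda_int
    api_provider = aws_api_gateway
    api_aws_role_arn = 'arn:aws:iam::123456789012:role/snowflake-ext-fn'
    api_allowed_prefixes = ('https://abc123.execute-api.us-east-1.amazonaws.com/prod')
    enabled = true;
-- expose it as a callable function
create or replace external function query_pg(stmt varchar)
    returns variant
    api_integration = pg_lambda_int
    as 'https://abc123.execute-api.us-east-1.amazonaws.com/prod/query';
-- each call ships the statement text to the Lambda, which runs it on PostgreSQL
select query_pg('select count(*) from my_postgres_table');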

Anybody have sample code to import a SAS table into PostgreSQL?

I am looking to import a SAS table (sas7bdat format) into PostgreSQL. Or do I need to convert the table into CSV and then import that?
Thank you!
If you already have SAS and Postgres at your organization, then you probably have the SAS/ACCESS Interface to PostgreSQL. You can use proc setinit; run; to check whether you have it. If you do, you can use the LIBNAME method as shown in the example below.
/* define libname for postgres */
libname A1 postgres server=mysrv1 port=5432 user=myusr1 password='mypwd1' database=mydb1;
/* define libname for SAS table */
libname mydata '/folders/myfolders/';
/* then use a data step or SQL to create your postgres table */
data A1.yourtable;
    set mydata.yourtable;
run;
If you do not have SAS/ACCESS to Postgres, you may have to do it in two steps (but check whether you have any ETL tools available in your company).
First, use PROC EXPORT to write a CSV; see the link below:
Efficiently convert a SAS dataset into a CSV
Then move the CSV data into Postgres:
How to import CSV file data into a PostgreSQL table?
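As a rough sketch of the two steps (paths and names are hypothetical):
/* step 1: export the SAS table to CSV */
proc export data=mydata.yourtable
    outfile='/folders/myfolders/yourtable.csv'
    dbms=csv
    replace;
run;
/* step 2 (outside SAS): load the CSV on the Postgres side, e.g. with psql:
   \copy yourtable FROM '/folders/myfolders/yourtable.csv' WITH (FORMAT csv, HEADER) */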
I found another solution, which requires Python, to import a sas7bdat file into Postgres.
import pandas as pd
from sqlalchemy import create_engine

# connection string: postgresql://user:password@host:port/databasename
engine = create_engine("postgresql://user:password@localhost:5432/databasename", echo=False)
df = pd.read_sas('sas7bdat file location')               # read the SAS file into a DataFrame
df.to_sql('tablename', con=engine, if_exists='replace')  # write it out to Postgres
Another option might be to import the SAS file into R, then write the data from R to a SQL database using dbWriteTable.
Personally, I have found it easier to get data from R into a SQL database than from a text file into a SQL database.
See https://www.statology.org/import-sas-into-r/
https://dbi.r-dbi.org/reference/dbwritetable

Merging two data sets by ID in SAS (Data sets inside Library)

Okay, so I have two data sets: one is called Customer and the other CustomerOrder. They are linked by CustomerID. I have both data sets in a SAS library referenced as NewData. How would you write the code to merge these two tables using the library reference and the CustomerID in both sets?
Thank you!
This would be the merge in PROC SQL (creating a temporary table called tempCust):
proc sql noprint;
    create table work.tempCust as
    select *
    from NewData.CustomerOrder co, NewData.Customer cust
    where co.CustomerID = cust.CustomerID;
quit;
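If you prefer a data step MERGE instead, a minimal sketch (note that both inputs must be sorted by the key first):
proc sort data=NewData.Customer out=work.cust; by CustomerID; run;
proc sort data=NewData.CustomerOrder out=work.co; by CustomerID; run;

data work.tempCust;
    merge work.cust (in=a) work.co (in=b);
    by CustomerID;
    if a and b;   /* keep only IDs present in both, like the SQL inner join */
run;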

Write from SAS Table to DB2 Temp Table

I have a local table in SAS that I am trying to create as a temporary table on a remote DB2 server. Is there any way to do this other than building an insert statement elsewhere and streaming it?
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
Proc SQL;
Connect to db2 (user=blagh pw=blagh dsn=blagh connection=global);
Execute (
Declare Global Temporary Table Session.Test
( foo char(10))
On Commit Preserve Rows
Not Logged
) by db2;
Execute (Commit) by db2;
Insert Into Session.Test
Select Distinct A.foo From Work.fooSource A;
I have tried several variations on this theme, each resulting in errors. The above code produces:
ERROR: Column foo could not be found in the table/view identified with the correlation name A.
ERROR: Unresolved reference to table/correlation name A.
Removing the alias gives me:
ERROR: INSERT statement does not permit correlation with the table being inserted into.
A pass-through statement like the one below should work.
proc sql;
    connect to db2 (user=blagh pw=blagh dsn=blagh connection=global);
    execute (
        create view sasdemo.tableA as
        select VarA, VarB, VarC
        from sasdemo.orders
    ) by db2;
    execute (
        grant select on sasdemo.tableA to testuser
    ) by db2;
    disconnect from db2;
quit;
The code below is what I routinely use to upload to DB2
rsubmit YourServer;
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
data temp.Uploaded_table(bulkload = yes bl_method = cliload);
set work.SAS_Local_table;
run;
endrsubmit;
libname temp remote server=YourServer;
More options for DB2 are available from SAS support... http://support.sas.com/documentation/onlinedoc/91pdf/sasdoc_913/access_dbspc_9420.pdf
I don't know DB2, so I can't say for sure this works, but the 'normal' way to do this is with PROC COPY (although the data step should also work). I would guess from your code above that DB2 doesn't allow inserts done that way (I think it's fairly common for SQL flavors not to support it).
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=Session;
proc copy in=work out=temp;
    select foosource;   /* SELECT takes member names, not libref.member */
run;
If you need the name to be different (with PROC COPY it won't be), you can use a simple data step:
data temp.yourname;
set work.foosource;
run;
You shouldn't need to do the inserts in SQL. If you want to first declare the table in DB2 (in a CONNECT TO ... session), you can probably do that and still use either of these options (though again, this varies by RDBMS, so test it).
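For instance, a sketch of that combined approach, reusing the credentials from the question (untested, and SESSION-schema behavior varies between DB2 setups):
libname temp db2 uid=blagh pwd=blagh dsn=blagh connection=global schema=SESSION;

proc sql;
    connect to db2 (user=blagh pw=blagh dsn=blagh connection=global);
    execute (
        declare global temporary table Session.Test
        (foo char(10)) on commit preserve rows not logged
    ) by db2;
    disconnect from db2;
quit;

/* load the temp table through the libref that shares the global connection */
proc append base=temp.Test data=work.fooSource force;
run;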

DB2 compound statement using ADO.NET

I want to execute multiple statements from my data access layer using C# and IBM's DB2 data provider.
(Environment: DB2 on AS/400, OS version V5R4.)
e.g. in T-SQL:
declare @varA int;
select @varA = count(*) from tableA;
select * from tableB where col1 <= @varA;
With SQL Server, I can concatenate those three statements into a single string
and assign the text to DbCommand.CommandText.
How do I execute multiple statements (a compound statement) against a DB2 database via DbCommand, using IBM's DB2 data provider?
I tried using a BEGIN ... END block, but it still failed:
BEGIN
statement1;
statement2;
statement3;
END
Thank you
I do not think it's possible.
I tried something similar some time ago, and the only solution I found was to dynamically create a stored procedure, call it, and finally delete it.
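To illustrate that workaround, a rough DB2 SQL sketch (procedure and library names are hypothetical, and this is untested on V5R4):
-- 1) create a procedure that wraps the compound logic
CREATE PROCEDURE MYLIB.TMP_BATCH ()
    LANGUAGE SQL
    DYNAMIC RESULT SETS 1
BEGIN
    DECLARE varA INTEGER;
    -- declared WITH RETURN so the result set flows back to the caller
    DECLARE c1 CURSOR WITH RETURN FOR
        SELECT * FROM tableB WHERE col1 <= varA;
    SELECT COUNT(*) INTO varA FROM tableA;
    OPEN c1;
END
-- 2) run CALL MYLIB.TMP_BATCH() as a normal DbCommand
-- 3) DROP PROCEDURE MYLIB.TMP_BATCH when finished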