Trigger to PostgreSQL JDBC writes successfully the first time, fails after - postgresql

I have a project for sending data from DB2 on an IBM system over to a PostgreSQL server on a RHEL system. I am using a trigger that sends information to a data queue, which is then read and sent over to the PostgreSQL server using an SQL statement through a JDBC connection in RPGLE.
The code is (more or less) as follows (I had to remove actual column and table names for security reasons):
dcl-proc doPostgreConn export;
dcl-pi doPostgreConn like(connection) end-pi;
//Code to change and set CLASSPATH variable, if necessary is here
//...
prop = JDBC_Properties();
JDBC_setProp(prop: 'user' : 'user');
JDBC_setProp(prop: 'password' : 'password');
JDBC_setProp(prop: 'databaseName' : 'database');
JDBC_setProp(prop: 'loggerLevel' : 'TRACE' );
JDBC_setProp(prop: 'loggerFile' : '/home/PostgreSQL/log');
pgconn = JDBC_ConnProp('org.postgresql.Driver'
:'jdbc:postgresql://[some_IP]:5432/database'
: prop );
JDBC_freeProp(prop);
return pgconn;
end-proc;
dcl-proc doPGWriteMyTable export;
dcl-pi doPGWriteMyTable like(success);
i#schm char(10);
i#rec char(334);
end-pi;
dcl-ds record extname('MYTABLE') end-ds;
dcl-s prepStmtTxt varchar(10000);
record = i#rec;
pgconn = doPostgreConn;
if pgconn = *NULL;
//Custom Error Handling
endif;
prepStmtTxt = 'INSERT INTO ' + %trim(i#schm) + '.MYTABLE ' +
' VALUES (?, ?, ?) ';
if PGWriteMYTABLEPrep = *NULL;
PGWriteMYTABLEPrep = JDBC_PrepStmt(pgconn:prepStmtTxt);
if PGWriteMYTABLEPrep = *NULL;
endif;
endif;
JDBC_setString (PGWriteMYTABLEPrep: 1: StrCol);
JDBC_setDecimal (PGWriteMYTABLEPrep: 2: DecCol);
JDBC_setDate (PGWriteMYTABLEPrep: 3: DateCol);
if JDBC_execPrepUpd( PGWriteMYTABLEPrep ) < 0;
//Custom Error Handling
endif;
JDBC_Close(pgconn);
return *on;
end-proc;
dcl-proc doPGDeleteMYTABLE export;
dcl-pi doPGDeleteMYTABLE like(success);
i#schm char(10);
i#rec char(334);
end-pi;
dcl-ds record extname('MYTABLE') end-ds;
dcl-s sqlstmt varchar(32000);
dcl-s deleteSuccess ind;
record = i#rec;
sqlstmt = 'DELETE FROM ' + %trim(i#schm) + '.MYTABLE WHERE '; //Basically the key
pgconn = doPostgreConn;
if JDBC_ExecUpd(pgconn:sqlstmt) < 0;
//Custom error handling
endif;
DoPostgreClose(pgconn);
return *on;
end-proc;
The data queue read program essentially calls DoPGDeleteMYTABLE and then DoPGWriteMYTABLE, in that order (There is no unique key, so we simply delete all of the matching records on the PostgreSQL server and then re-add them).
The problem is, while the data queue read program is running, the first loop works perfectly fine, and then it fails. The order goes like this:
1. Record updated
2. Delete any existing PG records: successful
3. Prepare the write statement: successful
4. Write any existing DB2 records to PG: successful
5. Record updated
6. Delete any existing PG records: successful
7. Prepare the statement: successful
8. Write any existing DB2 records to PG: unsuccessful
Steps 5 through 8 repeat until the data queue job is restarted.
The errors I receive are not very helpful. The job log on the AS400 simply tells me
org.postgresql.util.PSQLException: This connection has been closed.
even though I can see the open connection on the PostgreSQL server, and closing it from RPGLE does still work.
The JDBC log does not show any information from around the time the write happens. It just says that the prepare was successful, and then nothing.
Version information:
IBM i 7.4
PostgreSQL 13.7 on x86_64-redhat-linux-gnu, compiled by gcc (GCC) 11.2.1 20220127 (Red Hat 11.2.1-9), 64-bit
PostgreSQL JDBC Driver postgresql-42.2.19
RPGLE is utilizing Scott Klement's JDBCR4
Nothing I have found online has helped with the issue so far. If there is anything else I can provide or try in order to get more information, please let me know.

I don't see anything that jumps out in the code you've posted, but given that it works the first time and fails the second, I'd guess something is reset (or not reset) between loops.
Personally, I'd recommend opening the connection once, outside the DELETE/WRITE procs; but I don't think that alone is the fix.
The "connection closed" error is interesting...it might be worthwhile to run a comm trace to see if the connection is in fact being closed and, if so, from which side.
Note, while I love RPG, I'm not a fan of calling Java from RPG. I did some benchmarking long, long ago and it was much faster to have a small Java app handle JDBC rather than using it from RPG.
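To picture that suggestion in plain JDBC terms (the layer JDBCR4 wraps), here is a minimal sketch of the "open once, fresh statements per pass" shape. The SQL, key column, and readDataQueue() helper are hypothetical stand-ins for the data queue loop:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Properties;

public class DataQueueBridge {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "user");
        props.setProperty("password", "password");
        // One connection for the life of the data queue job...
        try (Connection pg = DriverManager.getConnection(
                "jdbc:postgresql://some_host:5432/database", props)) {
            while (readDataQueue()) { // hypothetical: blocks until an entry arrives
                // ...but statements created and freed on every pass,
                // so nothing stale survives into the next iteration
                try (PreparedStatement del = pg.prepareStatement(
                         "DELETE FROM myschema.MYTABLE WHERE keycol = ?");
                     PreparedStatement ins = pg.prepareStatement(
                         "INSERT INTO myschema.MYTABLE VALUES (?, ?, ?)")) {
                    // set parameters from the queue entry, then
                    // del.executeUpdate() followed by ins.executeUpdate()
                }
            }
        }
    }

    private static boolean readDataQueue() { return false; } // placeholder
}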
You might also consider an Open Source alternative to calling Java directly from RPG.
AppServer4RPG
Application Server to make Java Components available for IBM i RPG programs, runs on IBM i or any other Java platform. Packaged with ArdGate to access any JDBC database using all native SQL interfaces from IBM i.
ArdGate basically registers itself as a DRDA Application Requester Driver (ARD) and allows you to talk to any JDBC database like you would any other remote DRDA (aka Db2) database.
Which means, you could read/write to PostgreSQL from the green screen STRSQL.

I finally got it figured out. It was a dumb thing that I didn't realize I needed to do - turns out you have to free the prepared statement after using it the first time.
Using JDBCR4, you just call (using my example)
JDBC_FreePrepStmt(PGWriteMYTABLEPrep);
That procedure looks like this, in case anybody who isn't using JDBCR4 needs the details:
*+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
* JDBC_FreePrepStmt(): Free prepared statement
*
* prep = (input) Prepared Statement to Free
*+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
P JDBC_FreePrepStmt...
P B export
D JDBC_FreePrepStmt...
D PI
D prep like(PreparedStatement)
/free
stmt_close(prep);
DeleteLocalRef(JNIENV_P: prep);
prep = *NULL;
/end-free
P E
In the end, a very poorly worded error, with a very simple solution.
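For anyone not using JDBCR4, the same rule holds in plain JDBC: a PreparedStatement belongs to the connection it was prepared on. Given that doPGWriteMyTable closes its connection at the end of every call while the prepared statement is cached globally, a plausible reading of the error is that the second pass reused a statement still bound to the first, already-closed connection. A minimal Java sketch of that failure mode and the fix (URL, credentials, and table are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class StalePrepDemo {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://some_host:5432/database"; // placeholder
        Connection c1 = DriverManager.getConnection(url, "user", "password");
        PreparedStatement ps = c1.prepareStatement("INSERT INTO t VALUES (?)");
        ps.setString(1, "first");
        ps.executeUpdate(); // works
        c1.close();         // what JDBC_Close(pgconn) does at the end of each call

        Connection c2 = DriverManager.getConnection(url, "user", "password");
        try {
            ps.executeUpdate(); // throws: "This connection has been closed."
        } catch (SQLException e) {
            System.out.println(e.getMessage());
        }
        // The equivalent of JDBC_FreePrepStmt(): close the stale statement...
        ps.close();
        // ...and prepare a fresh one on the live connection
        ps = c2.prepareStatement("INSERT INTO t VALUES (?)");
        ps.setString(1, "second");
        ps.executeUpdate(); // works again
        ps.close();
        c2.close();
    }
}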

Related

Encoding problem when migrating a legacy tool to a new Windows version with OleDB

We have an odd problem after migrating an old application (>15 years old) written in C# to a new Windows Server.
The application uses OleDB to connect to the database which is an Informix database. This database has a table with texts in multiple languages. The application running in a Windows 2003 server works just fine, however in the new Windows 2016 it raises the error:
"The data value could not be converted for reasons other than sign mismatch or data overflow. For example, the data was corrupted in the data store but the row was still retrievable."
After some investigation we have found the problem to be in a string that has some unicode characters in it.
This is the part of the text that is generating the problem (only part of the text, to illustrate the problem):
"17"-Leichtmetallräder ...... Ziffern - Schaltknauf"
This is German text and seems OK; the problem is actually with the "-". Looking at the DB record in hex, the first "-" is coded as "3F", while the second dash is coded as "C296", which corresponds to U+0096 (a dash in Unicode).
The DB setting is en_US.819 (which corresponds to ISO-8859-1, to support all the languages that need to be supported).
Now, the problem is that when running the program in Windows 2003 the result is written in a file correctly like:
"17"-Leichtmetallräder ...... Ziffern - Schaltknauf"
However in Windows 2016 the exception above is raised and nothing gets written.
I worked on some code changes; the first thing I did was to switch from OleDB to an Odbc connection, and the exception disappeared. However, the text in the output is incorrect:
"17"-Leichtmetallräder ...... Ziffern ? Schaltknauf"
Notice how the same code with the ODBC connection is unable to understand the Unicode dash.
This is the OleDB code that works in Windows 2003:
OleDbConnection ConnOleDbIDD = new OleDbConnection("Provider=Ifxoledbc.2;Data Source=db;INFORMIXSERVER=localhost;IFMX_UNDOC_B168163=1;");
string sConnectTemplateDB = "Data Source=SQLServerDB;Initial Catalog=DB1; Connect Timeout = 28800; Integrated Security=True";
ConnOleDbIDD.Open();
sExportSQL = "SELECT * From MyTable";
OleDbCommand cmdIDD = new OleDbCommand(sExportSQL, ConnOleDbIDD);
cmdIDD.CommandTimeout = 28800;
SqlDataAdapter da;
ConnSchemaIDD = new SqlConnection (sConnectTemplateDB);
ConnSchemaIDD.Open();
SqlCommand cmdSQLServerTemplate = new SqlCommand(sExportSQL.Replace("TRIM","LTRIM"), ConnSchemaIDD);
cmdSQLServerTemplate.CommandTimeout = 28800;
da = new SqlDataAdapter(cmdSQLServerTemplate);
OleDbDataReader dr;
DataSet ds = new DataSet();
da.MissingSchemaAction = MissingSchemaAction.AddWithKey;
da.Fill(ds, sSourceTable);
DataTable dt = ds.Tables[sSourceTable];
dr = cmdIDD.ExecuteReader();
iEnCodingFrom = 1252;
iEnCodingTo = 1252;
while (dr.Read())
{
sValue = "";
sCurrentValue = "";
bDelimiterPosition = false;
foreach (DataColumn cCol in dt.Columns)
{
object oval = dr.GetValue(dr.GetOrdinal(cCol.ColumnName));
string val = Convert.ToString(dr[cCol.ColumnName]);
sCurrentValue = System.Text.Encoding.GetEncoding(iEnCodingTo).GetString(System.Text.Encoding.Convert(System.Text.Encoding.GetEncoding(iEnCodingFrom), System.Text.Encoding.GetEncoding(iEnCodingTo), System.Text.Encoding.GetEncoding(iEnCodingFrom).GetBytes(val)));
if (bDelimiterPosition == true)
{
sValue = sValue + sDelimiter + sCurrentValue.Trim();
}
else
{
sValue = sValue + sCurrentValue.Trim();
}
bDelimiterPosition = true;
}
w.WriteLine(sValue);
w.Flush();
}
dr.Close();
Assume for this example that "Mytable" has 2 columns, the first is an integer ID and the second is a char(3100).
As you can see, the code does some weird things, like getting the column description from a schema of the table in a SQL Server database, and converting the DB output from CP1252 to CP1252. I am not sure why it was coded that way.
My workaround for this problem has been doing these changes to the code (using odbc connection instead of oledb):
iEnCodingFrom = 28591;
...
sCurrentValue = Encoding.GetEncoding(iEnCodingTo).GetString(Encoding.GetEncoding(iEnCodingFrom).GetBytes(val.ToCharArray()));
...
So changing the connection to an ODBC connection to the Informix DB prevents the exception from being raised, and doing a conversion from codepage 28591 (8859-1) to 1252 (CP1252) produces in Windows 2016 the same result as the old code in Windows 2003.
So I have a workaround and can use it, however I would like to understand why this happens, why can't I keep using OleDB and if there is a way I can make it work in a new Windows environment (fails also in windows 10) without having to change the code.
Any help would be greatly appreciated.
Thank you
Thanks @LuísMarques and @jsagrera. This is exactly the explanation I was looking for; now I can understand the problem. The article says:
"Since CSDK version 2.80, the ODBC driver is Unicode enabled, this means all the data the driver handles has to be in Unicode format. This means that a extra conversion has to be done".
The CSDK version on the old server is 2.71. The version on the new server is 4.10.
Now, the "UNDOC" variable was there for that reason: the DB was created using en_us.819, but that was ignored via the "undoc" variable for my client app, which assumes the data comes in CP1252 and prints it out in CP1252. Without any internal conversion, the program worked.
But the data in the DB is corrupted anyway. After upgrading the driver, the internal conversion that is now performed produces the error.
I can still work around it: I don't use "UNDOC" in the ODBC connection; instead I get the stream of bytes from the DB and do a conversion from 8859-1 to CP1252 in my C# code. That way I get exactly the same output as on the old server.
However, this is not a correct solution but a mitigation of the problem; the final solution will be to change the DB to UTF8 to avoid any more problems. And that is what we will eventually do.
Thank you @jsagrera. I'd like to mark your answer as the correct one. I'm new to the platform, so I don't know how it works very well. If you post your comment as an answer, I'd gladly upvote it or mark it as correct if possible.
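As a footnote to the explanation above, the disagreement between the two codepages over byte 0x96 is easy to verify. A minimal sketch (in Java rather than the question's C#, but the mappings themselves are standard):

import java.nio.charset.Charset;

public class DashBytes {
    public static void main(String[] args) {
        byte[] raw = { (byte) 0x96 }; // the byte stored in the column
        String asCp1252 = new String(raw, Charset.forName("windows-1252"));
        String asLatin1 = new String(raw, Charset.forName("ISO-8859-1"));
        // windows-1252 maps 0x96 to U+2013 (EN DASH): prints "U+2013"
        System.out.printf("windows-1252 -> U+%04X%n", (int) asCp1252.charAt(0));
        // ISO-8859-1 maps 0x96 to U+0096 (a C1 control code): prints "U+0096"
        System.out.printf("ISO-8859-1   -> U+%04X%n", (int) asLatin1.charAt(0));
    }
}

So data written by a CP1252 client into an 8859-1 database keeps the raw byte, and a driver that later performs an honest 8859-1 conversion yields the invisible control character U+0096 instead of the intended dash.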

How to connect to DB2 from Lotus Notes Client via LotusScript?

Simply using the JDBC driver in a Java agent works fine. Now I need to connect to DB2 from LotusScript. There are many articles like these:
http://www.proudprogrammer.no/web/ppblog.nsf/d6plinks/GANI-9DFMRB
https://openntf.org/XSnippets.nsf/snippet.xsp?id=db2-run-from-lotusscript-into-notes-form
but they use an ODBC connection or something else. Anyway, I don't see where I can define the DB2 host and port in my LotusScript agent. Users won't be able to configure an ODBC connection on each workstation. I need some Domino-native method to connect to DB2. Or, where do I define the DB2 host/IP and port in this example:
https://openntf.org/XSnippets.nsf/snippet.xsp?id=db2-run-from-lotusscript-into-notes-form
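For reference, the Java-agent approach described above as working defines the host and port directly in the JDBC URL, so nothing has to be configured on individual workstations. A minimal sketch with hypothetical host, port, database, and table names (the URL format is the standard one for IBM's JCC driver, which must be on the agent's classpath; in a real Notes Java agent the body would live in NotesMain() rather than main()):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Db2Query {
    public static void main(String[] args) throws Exception {
        // Host and port are part of the URL itself
        String url = "jdbc:db2://db2host.example.com:50000/MYDB"; // hypothetical
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT FIRSTNAME, LASTNAME FROM MYSCHEMA.MYTABLE")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " " + rs.getString(2));
            }
        }
    }
}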
You could use the LSXODBC library, but that is deprecated, so you probably shouldn't.
The currently supported method is to use the LSXLC library. Be warned that it provides a very OO-centric approach to sending/consuming data, but it is very quick and, if you use it as designed, it can make moving data from one data provider (say Notes) to another (say DB2) somewhat easy.
If you want to stick with standard SQL strings you can still do that with LSXLC, via the "Execute" method of the LCConnection object.
As far as connecting goes, you just need to make sure the appropriate driver is installed on the machine and then use the appropriate connector name when creating a new LCConnection object (e.g., "odbc2" for ODBC, "db2" for the CLI DB2 driver, "oledb" for a SQL OLE driver, etc.).
If you stick with ODBC or OLEDB you can control the connection string via code. If you use the CLI DB2 driver (which is very, very fast) you need to configure the connection on each machine the driver is installed on.
All this is documented in the Designer help but it is, in my opinion, not organized in the best fashion. But it is all there.
So, here is some example code, largely copied from code I have sitting around; note that it is not tested:
Option Declare
UseLSX "*lsxlc"
Sub Initialize
Dim LCSession As LCSession
Dim lcRDBMS As LCConnection
dim lcFieldList as new LCFieldList()
dim lcField_FirstName as LCField
dim lcField_LastName as LCField
dim strFirstName as string
dim strLastName as string
dim strConnectionType as string
dim lngQueryStatus as long
' Hard-coding this here just for this example
' I think you will either want an ODBC (odbc2) or a CLI DB2 (db2) connection
strConnectionType = "odbc2"
Set lcRDBMS = New LCConnection (strConnectionType)
' Set some standard properties on the LCConnection object
lcRDBMS.Userid="<userid>"
lcRDBMS.Password="<password>"
lcRDBMS.MapByName=True
' Properties and property values that are different
' depending on the connection type
select case strConnectionType
case "odbc2" :
' Use the DSN name as configured in the ODBC Control Panel (if on Windows)
lcRDBMS.Database = "<SYSTEMDSN>"
case "oledb" :
lcRDBMS.Server = "<myserver.company.net>"
lcRDBMS.Provider = "sqloledb"
lcRDBMS.Database = "<my_database_name>"
' Not sure this actually changes anything or is even setting the correct property
' But the intent is to make sure the connection to the server is encrypted
lcRDBMS.INIT_ProviderString = "ENCRYPT=TRUE"
case "db2" :
' I am afraid I have lost the connection properties we used to use
' to form up a DB2 CLI connection so the following is just a best guess
' But if you are not going to be using the advance features of LSX to
' connect to DB2 you might as well just a standard ODBC driver/connection
lcRDBMS.Database = "<connection_name>"
End Select
Call lcRDBMS.Connect()
' This call returns a status code and populate the lcFieldList object with our results
lngQueryStatus = LcRDBMS.Execute("<Select FirstName, LastName from SCHEMA.Table WHERE blah>", lcFieldList)
If lngQueryStatus <> 0 Then
If lcFieldList.Recordcount > 0 Then
' Get our fields out of the lcFieldList object before going into the loop.
' Much more performant
Set lcField_FirstName = lcFieldList.Lookup("FirstName")
Set lcField_LastName = lcFieldList.Lookup("LastName")
While (lcRDBMS.Fetch(lcFieldList) > 0)
strFirstName = lcField_FirstName.Text(0)
strLastName = lcField_LastName.Text(0)
' Do something here with values
Wend
End If
End If
End Sub

Retrieve data from PostgreSQL Database using ADO.Net "System.Data.Odbc" (VB.Net)

Though I have been using SQL Server and Oracle for the last decade, I have been asked to do some research on PostgreSQL, and after some initial investigation it is evident that I am now stuck on retrieving data from the PostgreSQL database using a function.
I am using the following piece of code to retrieve the data and getting this error:
('ERROR [26000] ERROR: prepared statement "mytabletest" does not exist;
'Error while executing the query)
Code snippets:
Dim oDBCommand As DbCommand = GetDBCommand(oConnectionType, "mytabletest", CommandType.StoredProcedure)
Dim dstResults As DataSet = GetDataSet(ConnectionTypes.ODBC, oDBCommand)
Public Function GetDataReader(dbType As ConnectionTypes, command As DbCommand) As DbDataReader
Try
Dim oConnection As DbConnection = GetDBConnection(dbType)
Dim oDBTransaction As DbTransaction = oConnection.BeginTransaction
command.Connection = oConnection
command.Transaction = oDBTransaction
'GETTING ERROR ON FOLLOWING LINE
'ERROR [26000] ERROR: prepared statement "mytabletest" does not exist;
'Error while executing the query
return command.ExecuteReader()
Catch ex As Exception
Throw ex
Finally
End Try
Return Nothing
End Function
The environment I am currently working on is the following:
32 Bit Machine.
Visual Studio 2010 + SP1
ODBC Provider: PostgreSQL Unicode 9.01.02.00
ADO.Net (System.Data.Odbc)
Please note that I am open to any suggestions, i.e. whether I am doing it completely or partially wrong, etc. Please feel free to write.
In order to make it easier for you to create the same environment, please use the following table/function definitions.
-- Simple table to make things easier to understand.
CREATE TABLE mytable
(
messagetypeid integer NOT NULL,
messagetype character varying(100) NOT NULL
)
-- Function to retrieve data.
CREATE OR REPLACE FUNCTION mytabletest()
RETURNS SETOF refcursor AS $$
DECLARE
ref1 refcursor;
BEGIN
OPEN ref1 FOR SELECT * FROM mytable;
RETURN NEXT ref1;
END;
$$ LANGUAGE plpgsql;
Please Note:
If I use
Dim oDBCommand As DbCommand = GetDBCommand(oConnectionType, "SELECT * FROM mytable", CommandType.Text)
then the system manages to retrieve information from the database without any issue; however, as I mentioned, as soon as we use the function it throws an exception.
During my failed efforts to find a solution on the internet, someone mentioned that the table should be created in lower case, so just for the sake of it I recreated it in lower case; however, the problem persists.
I am unfamiliar with .net but I suspect you meant something more like:
GetDBCommand(oConnectionType, "SELECT myfunc()", CommandType.Text)
Or in the case of SETOF functions etc..
GetDBCommand(oConnectionType, "SELECT * FROM myfunc()", CommandType.Text)
PostgreSQL does not have 'stored procedures' per se. It does have functions, and I believe the client/server protocol has a method for preparing statements that can then be executed multiple times with different variables (to save on the cost of parsing the SQL), but this should be exposed via your client library.
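One detail worth adding for the question's exact function: mytabletest() is declared RETURNS SETOF refcursor, so even SELECT * FROM mytabletest() returns cursor names rather than rows. To get the data you have to stay inside a single transaction and FETCH from each returned cursor. Here is a sketch of those mechanics in Java/JDBC terms (a different client stack than the question's ADO.Net, but the SQL sequence is the same; connection details are hypothetical):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RefCursorFetch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/database"; // hypothetical
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            conn.setAutoCommit(false); // refcursors only live inside a transaction
            try (Statement call = conn.createStatement();
                 ResultSet names = call.executeQuery("SELECT mytabletest()")) {
                while (names.next()) {
                    // e.g. "<unnamed portal 1>"; quoted because it may contain spaces
                    String cursorName = names.getString(1);
                    try (Statement fetch = conn.createStatement();
                         ResultSet rows = fetch.executeQuery(
                                 "FETCH ALL IN \"" + cursorName + "\"")) {
                        while (rows.next()) {
                            System.out.println(rows.getInt("messagetypeid")
                                    + " | " + rows.getString("messagetype"));
                        }
                    }
                }
            }
            conn.commit();
        }
    }
}

An ODBC client needs to issue the same sequence (begin a transaction, SELECT mytabletest(), then FETCH ALL from the returned cursor name). Alternatively, declaring the function as RETURNS SETOF mytable or RETURNS TABLE (...) lets it be consumed with a plain SELECT * FROM mytabletest().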

SQLAlchemy, Psycopg2 and Postgresql COPY

It looks like Psycopg has a custom command for executing a COPY:
psycopg2 COPY using cursor.copy_from() freezes with large inputs
Is there a way to access this functionality from within SQLAlchemy?
The accepted answer is correct, but if you want more than just EoghanM's comment to go on, the following worked for me for COPYing a table out to CSV...
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
eng = create_engine("postgresql://user:pwd@host:5432/db")
ses = sessionmaker(bind=eng)
dbcopy_f = open('/tmp/some_table_copy.csv','wb')
copy_sql = 'COPY some_table TO STDOUT WITH CSV HEADER'
fake_conn = eng.raw_connection()
fake_cur = fake_conn.cursor()
fake_cur.copy_expert(copy_sql, dbcopy_f)
The sessionmaker isn't necessary, but if you're in the habit of creating the engine and the session at the same time, to use raw_connection you'll need to separate them (unless there is some way to access the engine through the session object that I don't know about). The SQL string provided to copy_expert is also not the only way to do it; there is a basic copy_to function that you can use with a subset of the parameters that you could pass to a normal COPY TO query. Overall performance of the command seems fast to me, copying out a table of ~20000 rows.
http://initd.org/psycopg/docs/cursor.html#cursor.copy_to
http://docs.sqlalchemy.org/en/latest/core/connections.html#sqlalchemy.engine.Engine.raw_connection
If your engine is configured with a psycopg2 connection string (which is the default, so either "postgresql://..." or "postgresql+psycopg2://..."), you can create a psycopg2 cursor from an SQLAlchemy session using
cursor = session.connection().connection.cursor()
which you can use to execute
cursor.copy_from(...)
The cursor will be active in the same transaction as your session currently is. If a commit or rollback happens, any further use of the cursor will throw a psycopg2.InterfaceError; you would have to create a new one.
You can use:
import io  # the original snippet used Python 2's cStringIO

def to_sql(engine, df, table, if_exists='fail', sep='\t', encoding='utf8'):
    # Create Table
    df[:0].to_sql(table, engine, if_exists=if_exists)
    # Prepare data
    output = io.StringIO()
    df.to_csv(output, sep=sep, header=False, encoding=encoding)
    output.seek(0)
    # Insert data
    connection = engine.raw_connection()
    cursor = connection.cursor()
    cursor.copy_from(output, table, sep=sep, null='')
    connection.commit()
    cursor.close()
With this I insert 200,000 lines in 5 seconds instead of 4 minutes.
It doesn't look like it.
You may have to just use psycopg2 to expose this functionality and forego the ORM capabilities. I guess I don't really see the benefit of ORM in such an operation anyway since it's a straight bulk insert and dealing with individual objects a la an ORM would not really make a whole lot of sense.
If you're starting from SQLAlchemy, you need to first get to the connection engine (also known by the property name bind on some SQLAlchemy objects):
engine = create_engine('postgresql+psycopg2://myuser:password@localhost/mydb')
# or
engine = session.get_bind()
# or any other way you know to get to the engine
From the engine you can isolate a psycopg2 connection:
# get a psycopg2 connection
connection = engine.connect().connection
# get a cursor on that connection
cursor = connection.cursor()
Here are some templates for the COPY statement to use with cursor.copy_expert(), a more complete and flexible option than copy_from() or copy_to(), as indicated here: https://www.psycopg.org/docs/cursor.html#cursor.copy_expert.
# to dump to a file
dump_to = """
COPY mytable
TO STDOUT
WITH (
FORMAT CSV,
DELIMITER ',',
HEADER
);
"""
# to copy from a file:
copy_from = """
COPY mytable
FROM STDIN
WITH (
FORMAT CSV,
DELIMITER ',',
HEADER
);
"""
Check out what the options above mean, and others that may be of interest to your specific situation, at https://www.postgresql.org/docs/current/static/sql-copy.html.
IMPORTANT NOTE: The documentation of cursor.copy_expert() linked above indicates using STDOUT to write out to a file and STDIN to copy from a file. But if you look at the syntax in the PostgreSQL manual, you'll notice that you can also specify the file to write to or read from directly in the COPY statement. Don't do that; you're likely just wasting your time if you're not running as root (and who runs Python as root during development?). Just do what's indicated in psycopg2's docs and specify STDIN or STDOUT in your statement with cursor.copy_expert(), and it should be fine.
# running the copy statement
with open('/path/to/your/data/file.csv') as f:
    cursor.copy_expert(copy_from, file=f)
# don't forget to commit the changes.
connection.commit()
You don't need to drop down to psycopg2, use raw_connection, or use a cursor.
Just execute the SQL as usual; you can even use bind parameters with text():
engine.execute(text('''copy some_table from :csv
                       delimiter ',' csv''').execution_options(autocommit=True),
               csv='/tmp/a.csv')
You can drop the execution_options(autocommit=True) if this PR is accepted.

Unable to execute an Oracle update statement within Perl

I have a problem in a Perl script that I'm writing. When I run the script, it hangs after prepare(). I've tried running the update statement from SQL Developer, and it works fine.
I've also tried to print out all parameters and they are correct.
What am I missing here?
my $upd = 'update ngs.pp_subscr_data set address=?, city=?, postalcode=?, kennitala=?, email=?, firstname=?, lastname=?, last_upd=systimestamp where snb=?';
my $s = $dbh->prepare ($upd) || exitError(-9802, 'Couldn\'t prepare update statement.');
$s->execute($addr, $city, $pcode, $ktala, $email, $fname, $lname, $snb) || exitError(-9803, 'Couldn\'t execute statement: ' . $s->errstr);
Thanks.
First, what version of Oracle?
OK, I see a couple of problems with your description. When you say "hang", is it really a hang? Could it be spinning?
Second, you say "... it hangs after prepare()". Does that mean it hangs after you call prepare(), or after it returns from prepare()?
Is it hanging in the database, and your client program is waiting for a database call to complete?
You need to run the program, then look at V$SESSION, identify the SID that corresponds to the database session of your program, and see what it's doing. Look at the EVENT column in V$SESSION. Also, look at the STATUS column to tell if the session is currently in a database call (ACTIVE), or waiting for the client program to call the database (INACTIVE).
Report some information back, and I may be able to provide further insight.
Hope that helps.