PostgreSQL with ODBC: "there is no parameter $1"

I have an INSERT statement:
INSERT INTO billData (
tmStart, tsDuration, eCallDir, ...
) VALUES (
$1, -- tmStart
$2, -- tsDuration
$3, -- eCallDir
...
);
I use SQLPrepare to compile it, bind the parameters with SQLBindParameter, and execute it with SQLExecute.
After all of these steps, the error code 42P02 (there is no parameter $1) was returned.
BTW: I'm using almost the same code for MS SQL Server and MySQL, and both of those databases work very well, so I believe my code is correct.
PS: PostgreSQL is v9.1; psqlODBC is v9.01.0100-1.
============================================================
UPDATE:
And the following error occurred: 42601 (syntax error at or near ",") when I use '?' as the parameter placeholder:
INSERT INTO billData (
tmStart, tsDuration, eCallDir, ...
) VALUES (
?, -- tmStart
?, -- tsDuration
?, -- eCallDir
...
);
============================================================
UPDATE:
Following the suggestion from j.w.r, it works after adding the UseServerSidePrepare=1 option to the ODBC connection string.
A lot of thanks :-)
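For reference, here is a minimal sketch of the fixed setup, shown via pyodbc rather than my actual C code (the driver name, credentials, and sample values are assumptions for illustration):

import pyodbc

# UseServerSidePrepare=1 tells psqlODBC to prepare the statement on the
# server, so the SQLPrepare/SQLBindParameter/SQLExecute sequence works
# with '?' placeholders.
con = pyodbc.connect(
    "Driver={PostgreSQL Unicode};Server=localhost;Port=5432;"
    "Database=billing;Uid=user;Pwd=secret;UseServerSidePrepare=1;"
)
cur = con.cursor()
cur.execute(
    "INSERT INTO billData (tmStart, tsDuration, eCallDir) VALUES (?, ?, ?)",
    ("2012-01-01 00:00:00", 60, 1),
)
con.commit()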

While I am sure prepared statements work well with PostgreSQL ODBC, I think something is wrong with this statement's code. First I would remove the comments to make the INSERT as simple as possible. I have just tried a multiline INSERT with comments and a semicolon at the end, and it worked well. Here is a slightly adapted version of my test Python code:
import odbc
db = odbc.odbc('odbcalias/user/password')
c = db.cursor()
r = c.execute("""INSERT INTO billData (
tmStart, tsDuration, eCallDir, ...
) VALUES (
?, -- tmStart
?, -- tsDuration
?, -- eCallDir
...
);""", (tmStart, tsDuration, eCallDir, ))
print('records inserted: %s' % (r))
Can you try it with ActivePython 2.7 (it already has the odbc module)?
If it works, enable ODBC tracing and test this program. The ODBC SQL trace file should contain entries like:
...
python.exe odbc 924-e8c ENTER SQLExecDirect
HSTMT 00A029A8
UCHAR * 0x00B5A480 [ -3] "INSERT INTO test_table(txt) VALUES (?)\0"
SDWORD -3
python.exe odbc 924-e8c EXIT SQLExecDirect with return code 0 (SQL_SUCCESS)
HSTMT 00A029A8
UCHAR * 0x00B5A480 [ -3] "INSERT INTO test_table(txt) VALUES (?)\0"
SDWORD -3
python.exe odbc 924-e8c ENTER SQLNumResultCols
HSTMT 00A029A8
SWORD * 0x0021FAA0
python.exe odbc 924-e8c EXIT SQLNumResultCols with return code 0 (SQL_SUCCESS)
HSTMT 00A029A8
SWORD * 0x0021FAA0 (0)
python.exe odbc 924-e8c ENTER SQLRowCount
HSTMT 00A029A8
SQLLEN * 0x0021FBDC
python.exe odbc 924-e8c EXIT SQLRowCount with return code 0 (SQL_SUCCESS)
HSTMT 00A029A8
SQLLEN * 0x0021FBDC (1)
Then make a similar trace for your application and try to find what is different.

============================================================
UPDATE:
Note: the ByteaAsLongVarBinary=1 option should also be added if you want to parameterize a bytea field with SQLBindParameter.
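A hedged sketch of the bytea case, again via pyodbc (the table and column names are made up for illustration):

import pyodbc

# ByteaAsLongVarBinary=1 makes psqlODBC expose bytea as SQL_LONGVARBINARY,
# which matches how a bound binary parameter is sent.
con = pyodbc.connect(
    "Driver={PostgreSQL Unicode};Server=localhost;Port=5432;"
    "Database=billing;Uid=user;Pwd=secret;"
    "UseServerSidePrepare=1;ByteaAsLongVarBinary=1;"
)
cur = con.cursor()
# A Python bytes value is bound as a binary parameter and lands in the
# bytea column without a type error.
cur.execute("INSERT INTO blobs (payload) VALUES (?)", (b"\x00\x01\x02",))
con.commit()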

Related

Converting function instr from Oracle to PostgreSQL (sql only)

I am working on converting something from Oracle to PostgreSQL. In the Oracle file there is a function:
instr(string,substring,starting point,nth location)
Example:
instr(text, '$', 1, 3)
In PostgreSQL this does not exist, so I looked for an equivalent function (the 4-parameter form is important).
I found:
The function strpos(str, sub) in Postgres is the equivalent of instr(str, sub) in Oracle. I tried options via split_part, but it didn't work out.
I need the same result using only standard Postgres functions (not my own function).
Maybe someone can offer options, even if the code is redundant.
This may be done in pure SQL using string_to_array.
with tab(val) as (
  select 'qwe$rty$123$456$78'
  union all
  select 'qwe$rty$123$'
  union all
  select '123$456$'
  union all
  select '123$456'
)
select
  val
  /* Oracle's signature: instr(string, substring [, position [, occurrence ]]) */
  , case
      when array_length(
             string_to_array(substr(val /*string*/, 1 /*position*/), '$' /*substring*/),
             1
           ) <= 3 /*occurrence*/
        then 0
      else
        length(
          array_to_string(
            (string_to_array(substr(val /*string*/, 1 /*position*/), '$' /*substring*/))[:3 /*occurrence*/],
            '$' /*substring*/
          )
        ) + 1
    end as instr
from tab
val                  instr
------------------   -----
qwe$rty$123$456$78   12
qwe$rty$123$         12
123$456$             0
123$456              0
Postgres: fiddle
Oracle: fiddle
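Not part of the original answer, but here is the same logic as a small Python function, handy for checking the expected results above (same position = 1 semantics as the SQL):

def instr(string, substring, position=1, occurrence=1):
    # Mirror the SQL above: split the string-from-position on the
    # separator; fewer than `occurrence` separators means Oracle's 0.
    parts = string[position - 1:].split(substring)
    if len(parts) <= occurrence:
        return 0
    # Length of everything before the nth separator, plus 1.
    return len(substring.join(parts[:occurrence])) + 1

# The four sample rows from the query above:
assert instr('qwe$rty$123$456$78', '$', 1, 3) == 12
assert instr('qwe$rty$123$', '$', 1, 3) == 12
assert instr('123$456$', '$', 1, 3) == 0
assert instr('123$456', '$', 1, 3) == 0

Like the SQL, this hardcodes the position = 1 behaviour; for a starting position greater than 1, Oracle reports the offset within the original string, so position - 1 would need to be added to a nonzero result.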

psycopg2 execute_values fails with > 100 rows in supabase?

Honestly, I'm not sure if this is a Supabase issue or a psycopg2 issue, and I would love some help debugging.
I have the following code:
args = [('HaurGlass','60000','2022-10-20T21:15:39.751Z','10130506261','ac76e8db-ace0-40df-b6fa-f470641805e9','ad43639e-f66e-49d5-8fe8-d1ce5cd26193','{}')]
statement = ('''
INSERT INTO %s (%s) VALUES %s ON CONFLICT (company_id, crm_id)
DO UPDATE SET (%s)=(%s) RETURNING crm_id, id''')
statement = cur.mogrify(statement,
                        (AsIs(db_table), AsIs(','.join(keys)),
                         AsIs("%s"), AsIs(','.join(update_keys)),
                         AsIs(','.join(excluded_keys))))
output = execute_values(cur, statement, args, fetch=True)
The weird thing is that if args is <=100 rows in length, this query works without any problems. As soon as I increase the length of args to 101 rows or more, my Postgres logs show:
INSERT INTO licenses (name,value,subscription_end,crm_id,company_id,csm_id,custom_data) VALUES ('HaurGlass','60000','2022-10-20T21:15:39.751Z','10130506261','ac76e8db-ace0-40df-b6fa-f470641805e9','ad43639e-f66e-49d5-8fe8-d1ce5cd26193','{}')...
which would be good, except that it's immediately followed by:
INSERT INTO licenses (name,value,subscription_end,crm_id,company_id,csm_id,custom_data) VALUES ('HaurGlass','60000','2022-10-20T21:15:39.751Z','10130506261','ac76e8db-ace0-40df-b6fa-f470641805e9',NULL,'{}'),...
I've also confirmed that the number of records in the second "NULLifying" query is exactly equal to len(args)-100.
Any idea what is going on?
OK so it turns out I was missing the page_size parameter. All I had to do was:
output = execute_values(cur, statement, args, fetch=True, page_size=len(args))
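For context, execute_values splits the argument list into pages of page_size rows (100 by default) and runs the statement once per page, which is why exactly len(args)-100 rows showed up in a second statement. A minimal sketch, reusing cur, statement, and args from above:

from psycopg2.extras import execute_values

# With the default page_size=100, 101 or more rows are sent as two or
# more separate statements; page_size=len(args) forces a single
# statement containing every row.
output = execute_values(cur, statement, args, fetch=True, page_size=len(args))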

The sql works fine but with python it doesn't insert values into table

I'm trying to use Python to run stored SQL procedures. I have tested my SQL code and it works fine, but when I execute it via Python, the values are not inserted into my table.
Note: I don't get any errors when executing.
My code below:
import psycopg2

con = psycopg2.connect(dbname='dbname', host='host',
                       port='5439', user='username', password='password')

def executeScriptsfromFile(filename):
    # Open and read the file as a single buffer
    cur = con.cursor()
    fd = open(filename, 'r')
    sqlFile = fd.read()
    fd.close()
    # All SQL commands (split on ';')
    sqlCommands = filter(None, sqlFile.split(';'))
    # Execute every command from the input file
    for command in sqlCommands:
        # This will skip and report errors
        # For example, if the tables do not yet exist, this will skip over
        # the DROP TABLE commands
        try:
            cur.execute(command)
            con.commit()
        except Exception as inst:
            print("Command skipped:", inst)
    cur.close()

executeScriptsfromFile('filepath.sql')
INSERT command in the SQL file:
INSERT INTO schema.users
SELECT
UserId
,Country
,InstallDate
,LastConnectDate
FROM #Source;
Note: As I said, the SQL works perfectly fine when I test it directly.

very large fields in As400 ISeries database

I would like to save a large XML string (possibly longer than 32K or 64K) into an AS400 file field. Either DDS or SQL files would be OK. Example of SQL file below.
CREATE TABLE MYLIB/PRODUCT
(PRODCODE DEC (5 ) NOT NULL WITH DEFAULT,
PRODDESC CHAR (30 ) NOT NULL WITH DEFAULT,
LONGDESC CLOB (70K ) ALLOCATE(1000) NOT NULL WITH DEFAULT)
We would use RPGLE to read and write to fields.
The goal is to then pull out data via ODBC connection on a client side.
AS400 character fields seem to have a 32K limit, so that is not a great option.
What options do I have? I have been reading up on CLOBs, but there appear to be restrictions on writing large strings to CLOBs and on reading CLOB fields remotely. Note that the client is (still) on V5R4 of the AS400 OS.
thanks!
Charles' answer below shows how to extract data. I would like to insert data. This code runs, but throws a '22501' SQL error.
D wLongDesc s 65531a varying
D longdesc s sqltype(CLOB:65531)
/free
//eval longdesc = *ALL'123';
eval Wlongdesc = '123';
exec SQL
INSERT INTO PRODUCT (PRODCODE, PRODDESC, LONGDESC)
VALUES (123, 'Product Description', :LongDesc );
if %subst(sqlstt:1:2) <> '00';
// an error occurred.
endif;
// get length explicitly, variables are setup by pre-processor
longdesc_len = %len(%trim(longdesc_data));
wLongDesc = %subst(longdesc_data:1:longdesc_len);
/end-free
C Eval *INLR = *on
C Return
Additional question: Is this technique suitable for storing data which I want to extract via ODBC connection later? Does ODBC read CLOB as pointer or can it pull out text?
At v5r4, RPGLE actually supports 64K character variables.
However, the DB is limited to 32K for regular char/varchar fields.
You'd need to use a CLOB for anything bigger than 32K.
If you can live with 64K (or so):
CREATE TABLE MYLIB/PRODUCT
(PRODCODE DEC (5 ) NOT NULL WITH DEFAULT,
PRODDESC CHAR (30 ) NOT NULL WITH DEFAULT,
LONGDESC CLOB (65531) ALLOCATE(1000) NOT NULL WITH DEFAULT)
You can use RPGLE SQLTYPE support:
D code S 5s 0
d wLongDesc s 65531a varying
D longdesc s sqltype(CLOB:65531)
/free
exec SQL
select prodcode, longdesc
into :code, :longdesc
from mylib/product
where prodcode = :mykey;
wLongDesc = %subst(longdesc_data:1:longdesc_len);
DoSomething(wLongDesc);
The pre-compiler will replace longdesc with a DS defined like so:
D longdesc ds
D longdesc_len 10u 0
D longdesc_data 65531a
You could simply use it directly, making sure to only use up to longdesc_len, or convert it to a VARYING as I've done above.
If you absolutely must handle larger than 64K:
Upgrade to a supported version of the OS (16MB variables supported)
Access the CLOB contents via an IFS file using a file reference
Option 2 is one I've never seen used, and I can't find any examples; I just saw it mentioned in this old article:
http://www.ibmsystemsmag.com/ibmi/developer/general/BLOBs,-CLOBs-and-RPG/?page=2
This example shows how to write to a CLOB field in a Db2 database, with help from Charles and Mr Murphy's feedback.
* ----------------------------------------------------------------------
* Create table with CLOB:
* CREATE TABLE MYLIB/PRODUCT
* (MYDEC DEC (5 ) NOT NULL WITH DEFAULT,
* MYCHAR CHAR (30 ) NOT NULL WITH DEFAULT,
* MYCLOB CLOB (65531) ALLOCATE(1000) NOT NULL WITH DEFAULT)
* ----------------------------------------------------------------------
D PRODCODE S 5i 0
D PRODDESC S 30a
D i S 10i 0
D wLongDesc s 65531a varying
D longdesc s sqltype(CLOB:65531)
D* Note that the variables longdesc_data and longdesc_len
D* are created automatically by the SQL pre-processor.
/free
eval wLongdesc = '123';
longdesc_data = wLongDesc;
longdesc_len = %len(%trim(wLongDesc));
exec SQL set option commit = *none;
exec SQL
INSERT INTO PRODUCT (MYDEC, MYCHAR, MYCLOB)
VALUES (123, 'Product Description',:longDesc);
if %subst(sqlstt:1:2)<>'00' ;
// an error occurred.
endif;
Eval *INLR = *on;
Return;
/end-free

Problems using REXX to access both Teradata output and DB2 output

I have a REXX job that needs to read from both Teradata (using BTEQ) and DB2. At present, I can get it to read from either Teradata or DB2, but not both. When I try to read from both, the Teradata read (which runs first) works fine, but the DB2 read gives an error of RC(1) when attempting to open the cursor.
Code to read from Teradata (by and large copied from http://www.teradataforum.com/teradata/20040928_131203.htm):
ADDRESS TSO "DELETE BLAH.TEMP"
"ALLOC FI(SYSPRINT) DA(BLAH.TEMP) NEW CATALOG SP(10 10) TR RELEASE",
"UNIT(SYSDA) RECFM(F B A) LRECL(133) BLKSIZE(0) REUSE"
"ATTRIB FBATTR LRECL(220)"
"ALLOC F(SYSIN) UNIT(VIO) TRACKS SPACE(10,10) USING(FBATTR)"
/* Set up BTEQ script */
QUEUE ".RUN FILE=LOGON"
QUEUE "SELECT COLUMN1 FROM TABLE1;"
/* Run BTEQ script */
"EXECIO * DISKW SYSIN (FINIS"
"CALL 'SYS3.TDP.APPLOAD(BTQMAIN)'"; bteq_rc = rc
"FREE FI(SYSPRINT SYSIN)"
/* Read and parse BTEQ output */
"EXECIO * DISKR SYSPRINT (STEM BTEQOUT. FINIS"
DO I = 1 to BTEQOUT.0
...
END
Code to read from DB2:
ADDRESS TSO "SUBCOM DSNREXX"
IF RC THEN rcDB2 = RXSUBCOM('ADD','DSNREXX','DSNREXX')
ADDRESS DSNREXX "CONNECT " subsys
sqlQuery = "SELECT COLUMN2 FROM TABLE2;"
ADDRESS DSNREXX "EXECSQL DECLARE C001 CURSOR FOR S001"
IF SQLCODE <> 0 THEN DO
SAY 'DECLARE C001 SQLCODE = ' SQLCODE
EXIT 12
END
ADDRESS DSNREXX "EXECSQL PREPARE S001 FROM :sqlQuery"
IF SQLCODE <> 0 THEN DO
SAY 'PREPARE S001 SQLCODE = ' SQLCODE SQLERROR
EXIT 12
END
ADDRESS DSNREXX "EXECSQL OPEN C001"
IF SQLCODE <> 0 THEN DO
SAY 'OPEN C001 SQLCODE = ' SQLCODE
EXIT 12
END
ADDRESS DSNREXX "EXECSQL FETCH C001 INTO :col2"
IF SQLCODE <> 0 THEN DO
SAY 'FETCH C001 SQLCODE = ' SQLCODE
EXIT 12
END
I suspect that this has something to do with my use of SYSPRINT and SYSIN. Anyone know how I can get this to work?
Thanks.
Edit
The question as stated was actually wrong. Apologies for not correcting this earlier.
What I had really done was this:
ADDRESS TSO "SUBCOM DSNREXX"
IF RC THEN rcDB2 = RXSUBCOM('ADD','DSNREXX','DSNREXX')
ADDRESS DSNREXX "CONNECT " subsys
...followed by a small read from DB2, then followed by the code to read from Teradata, followed by more code to read from DB2. When this was changed to reading from Teradata first before having anything to do with DB2 at all, it worked.
I don't think this has anything to do with SYSPRINT or SYSIN.
I think you are getting TSO RC = 1, not SQLCODE = 1 (there is no SQLCODE of 1).
An RC of 1 is a warning; -1 is an error. You can look this up in the DB2 Application Programming
and SQL Guide.
Turn on TRACE R and run it.
There are variables (shown below) that display info about the error/warning.
22 *-* ADDRESS DSNREXX "EXECSQL OPEN C1"
>>> "EXECSQL OPEN C1"
+++ RC(1) +++
23 *-* IF SQLCODE <> 0
28 *-* SAY 'SQLSTATE='sqlstate', SQLERRMC='sqlerrmc', SQLERRP='sqlerrp
SQLSTATE=00000, SQLERRMC=, SQLERRP=DSN
29 - SAY 'SQLERRD='sqlerrd.1', 'sqlerrd.2', 'sqlerrd.3', 'sqlerrd.4',',
sqlerrd.5', 'sqlerrd.6
SQLERRD=0, 0, 0, -1, 0, 0
32 - SAY 'SQLWARN='sqlwarn.0', 'sqlwarn.1', 'sqlwarn.2', 'sqlwarn.3',',
sqlwarn.4', 'sqlwarn.5', 'sqlwarn.6', 'sqlwarn.7',',
sqlwarn.8', 'sqlwarn.9', 'sqlwarn.10
SQLWARN= , N, , , , 2, , , , ,
For example, it could be that when both are used together, there is not enough memory.