I am using Liferay IDE bundled with Tomcat for Liferay Portal 6.1. I have a method (over which I have no control) that builds an INSERT statement from some inputs and runs it against the DB. Internally it uses JDBC with the Oracle driver, since we are dealing with an Oracle DB.
This method gives me the following error:
ORA-24816: Expanded non LONG bind data supplied after actual LONG or LOB column
After some investigation I learned that it's a known Oracle issue that appears when a CLOB column comes before a VARCHAR column in the statement, and thus the statement needs to be reordered.
The weird thing is that the same code works fine on the integration server (JBoss)!
I need to know the reason as well as a way to solve it.
Actually, I suspect the reason lies in the Oracle driver.
ORA-24816: Expanded non LONG bind data supplied after actual LONG or LOB column
Cause: A Bind value of length potentially > 4000 bytes follows binding for LOB or LONG.
Action: Re-order the binds so that the LONG bind or LOB binds are all at the end of the bind list.
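The Action text translates directly into reordering the column list so that the LOB binds come last. A minimal sketch, assuming a hypothetical table with a CLOB column and a VARCHAR2 column:

-- Can fail with ORA-24816 when the VARCHAR2 bind is potentially longer
-- than 4000 bytes, because the CLOB bind comes first:
INSERT INTO my_table (body_clob, title) VALUES (:1, :2);

-- Reordered so the CLOB bind is last in the bind list:
INSERT INTO my_table (title, body_clob) VALUES (:1, :2);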
Here is a link to a thread on the Oracle forums discussing this issue:
http://forums.oracle.com/forums/thre...5560&tstart=15
I replaced the ojdbc6.jar with the one from the integration server and it worked fine.
What is the difference between pre-compile and bind for a COBOL DB2 program?
How does the syntax check differ between the two processes?
If we give a wrong column name in our code, in which process will it fail?
It seems you need to do some study in the Db2 Knowledge Center.
A pre-compile action creates a bindfile containing the static SQL present in the source code (i.e. the sections of code with EXEC SQL statements in your COBOL), in addition to a compilable form of the source code that contains the non-SQL logic and data (your PROCEDURE DIVISION and DATA DIVISION etc.).
A bind action uses both the bindfile and the database to create a package inside the database, which is the executable form of the bindfile contents. The package contains sections corresponding to your EXEC SQL blocks for static SQL.
Later, when the built (i.e. compiled and linked) application executes and wants to use the database, sections of the package are loaded from the database catalog (or read from cache) and executed by the database manager to deliver the required actions.
As each command (precompile vs. bind) serves a different purpose, the syntax varies, and can also vary with the Db2-server platform (z/OS, IBM i, Linux/Unix/Windows) and version. Broadly, the precompiler performs the SQL syntax check, while bind validates the statements against the database catalog, so a misspelled column name will typically surface as a bind-time error.
Refer to the free Db2 Knowledge Center for your version of Db2 and your Db2-server platform (separate Knowledge Center sites exist for Db2 for z/OS, Db2 for i, and Db2 for Linux/Unix/Windows).
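As a concrete illustration on Db2 for Linux/Unix/Windows (a sketch only; catalog names differ on the other platforms, and MYPROG is a hypothetical package name), a successful bind leaves a package you can see in the catalog:

-- Shows the package created by the bind step and whether it is still valid:
SELECT pkgschema, pkgname, valid
FROM syscat.packages
WHERE pkgname = 'MYPROG';

If a later change invalidates the package, the VALID column reflects it and a rebind is needed.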
We are trying to load data from one Postgres table to another Postgres table in the same database using Informatica, and we are having the following issue.
The error message is as follows:
Message Code: WRT_8229
Message: Database errors occurred:
FnName: Execute -- [Informatica][ODBC PostgreSQL Wire Protocol driver][PostgreSQL]ERROR: VERROR; syntax error at or near "VALUESNSERT"(Position 135; File scan.l; Line 1134; Routine scanner_yyerror; ) Error in parameter 6.
FnName: Execute -- [Informatica][ODBC PostgreSQL Wire Protocol driver][PostgreSQL]Failed transaction. The current transaction rolled back. Error in parameter 6.
FnName: Execute -- [DataDirect][ODBC lib] Function sequence error
It works fine if we do not load one of the string columns, which is 3000 bytes long. Can anyone please shed some light on this issue?
Note: there are no reserved keywords in our table structure.
If you have already identified the error-causing column, you can follow the steps below to find the root cause:
1. Check the data type and length of the column in Informatica and confirm they match the target column in the database (see the query sketch after this list).
2. Make sure you import the target definition from the database. Creating the target through another process, or adding a column to an existing target by hand, can lead to such errors.
3. Run the session in verbose mode, or use the debugger, to see exactly where the issue occurs. Check whether the data is being read, transformed, and loaded properly.
4. Replace the Postgres target with a flat file; if that works, the issue is in the database table. Check for indexes, constraints, etc. that can lead to this issue.
5. Check the ODBC driver version as well; it may have limitations in areas like data types and length handling. ODBC is also not good at surfacing errors, so you may have to do some guesswork to find the cause.
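For step 1, a minimal sketch of checking the target column's declared type and length on the Postgres side (my_target is a hypothetical table name):

-- Compare this against the column definition in the Informatica target:
SELECT column_name, data_type, character_maximum_length
FROM information_schema.columns
WHERE table_name = 'my_target';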
Thanks everyone. My issue got resolved after implementing Informatica PDO (pushdown optimization).
I see how to debug queries stored as functions in the database. But my problem is with an external QGIS plugin that connects to my Postgres 10.4 over the network, runs a complex query and calculations, and stores the results back into PostGIS tables:
FOR r IN c LOOP
SELECT
(1 - ST_LineLocatePoint(path.geom, ST_Intersection(r.geom, path.geom))) * ST_Length(path.geom)
INTO
station
(continues ...)
When it errors, it just returns that line number as the failing location, but no clue where it was in the loop through hundreds of features. (And any features it has processed are not stored to the output tables when it fails.) I totally don't know enough about the plugin and about SQL to hack the external query, and I suspect that if it were a reasonable task the plugin author would have included more revealing debug messages.
So is there some way I could use pgAdmin4 (or anything) from the server side to watch the query process? Even being able to see if it fails the first time through the loop or later would help immensely. Knowing the loop count at failure would point me to the exact problem feature. Being able to see "station" or "r.geom" would make it even easier.
It's perfectly fine if the process is miserably slow or interferes with other queries; I'm the only user on this server.
This is not actually a way to watch the RiverGIS query in action, but it is the best I have found. It extracts the failing ST_Intersects() call from the RiverGIS code and runs it under your control, where you can display any clues you want.
When you're totally mystified where the RiverGIS problem might be, run this SQL query:
SELECT
xs."XsecID" AS "XsecID",
xs."ReachID" AS "ReachID",
xs."Station" AS "Station",
xs."RiverCode" AS "RiverCode",
xs."ReachCode" AS "ReachCode",
ST_Intersection(xs.geom, riv.geom) AS "Fraction"
FROM
"<your project name>"."StreamCenterlines" AS riv,
"<your project name>"."XSCutLines" AS xs
WHERE
ST_Intersects(xs.geom, riv.geom)
ORDER BY xs."ReachID" ASC, xs."Station" DESC
Obviously replace <your project name> with the QGIS project name.
Also works for the BankLines step if you replace "StreamCenterlines" with "BankLines". Probably could be adapted to other situations where ST_Intersects() fails without a clue.
You'll get a listing with shorter geometry strings for good cross sections and double-length strings for bad ones. Probably need to widen your display column a lot to see this.
Works for me in pgAdmin4, or in QGIS3 -> Database -> DB Manager -> (click the wrench icon). You could select only the bad lines, but I find the background info helpful.
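Separately, if you just want to confirm from the server side that the plugin's query is still running (a generic technique, not specific to RiverGIS), you can poll the built-in pg_stat_activity view from pgAdmin4 while the plugin runs:

-- One row per backend; shows the SQL currently executing and for how long:
SELECT pid, state, now() - query_start AS runtime, query
FROM pg_stat_activity
WHERE datname = current_database();

It won't show loop counters or local variables, but it does show the statement text and elapsed time, which at least tells you whether the query died immediately or partway through.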
We are using IBM's data provider from C# .NET 4.5 to query an iSeries DB2 database. Normally this works very well, but for some queries DB2 reports the error "SQL0666 - SQL query exceeds specified time limit or storage limit".
I have tried setting the command timeout to 0, but to no effect. I have also tried to execute, in the manner explained here, the CHGQRYA command to set the QRYTIMLMT value to *NOMAX (or some other large value), but seemingly to no effect. However, if I use the same command to set the QRYSTGLMT (storage limit), it takes effect. Thus, I know that I am using the command correctly, and that it gets interpreted and executed by the database.
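For reference, one way to issue CHGQRYA through the SQL connection itself (a sketch assuming the QSYS2.QCMDEXC procedure available on recent IBM i releases; it may not match the approach linked above):

CALL QSYS2.QCMDEXC('CHGQRYA QRYTIMLMT(*NOMAX)');

Note that CHGQRYA changes query attributes for the job that runs it, so it has to be executed over the same connection that will run the query.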
So, what can cause my inability to set the QRYTIMLMT value?
Also, our "DBA" has set the limit to *NOMAX on his end, and for queries not running through the .NET provider, everything works fine.
We're using IBM's Client Tools version 6r1 with service pack SI42423.
OK, so after lots of testing, I found the problem.
We're using the DeriveParameters() method to set the parameter types correctly, and if this method is called before setting CommandTimeout, the latter has no effect(!). The solution was to reverse the order of the two statements: set CommandTimeout first, then call DeriveParameters().
I have a PostgreSQL DB on my PC and I'm trying to connect different database applications to PostgreSQL. But before that (a research issue), for each application I need to see all the input parameters, and all the queries corresponding to those input parameters, that the application can issue.
How?
Look in the code of every application and see what calls are being made. In addition, figure out all the parameter values that can be sent, based on an almost infinite combination of characters and numbers the user can select from.
Or, to remain sane, turn on PostgreSQL statement logging, let the users do their thing, and analyse what calls are being made.
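A minimal sketch of turning that on (run as a superuser; log_statement = 'all' logs every statement, which gets verbose fast):

ALTER SYSTEM SET log_statement = 'all';
SELECT pg_reload_conf();

-- Statements now appear in the server log. When done:
ALTER SYSTEM RESET log_statement;
SELECT pg_reload_conf();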