In DB2, how do we write a query to fetch a value from a CLOB datatype column?
Not sure if this will help, but we ran into a similar situation at my company. We were able to read the values out as a normal string using C#/.NET and the IBM iSeries data provider. The data we wanted to fetch from the CLOB was just simple text, which allowed this approach to work.
For SQL PL, you can select CLOB data from the database the same as any other type, but if you use JDBC you should use byte[] for the CLOB data.
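For simple cases a plain SELECT is enough, and you can also substring or cast the value if you only need part of it. A minimal sketch (table and column names below are illustrative):
-- Fetch the CLOB column directly
SELECT DOC_TEXT FROM MYSCHEMA.DOCS
-- Fetch only the first 1000 characters as VARCHAR
SELECT CAST(SUBSTR(DOC_TEXT, 1, 1000) AS VARCHAR(1000)) FROM MYSCHEMA.DOCS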
Related
I use Spring Boot with spring-jpa and a PostgreSQL database.
I'd like to call a stored procedure via EntityManager.createStoredProcedureQuery.
One of the stored procedure's parameters is PostgreSQL xml. If I pass a String as that parameter I get an error. Could you explain, please, how I can convert a String to SQLXML in this situation?
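If changing the Java side is not an option, one possible workaround (a sketch only; my_proc and its parameter are hypothetical) is to invoke the procedure through a native query and let PostgreSQL cast the text argument to xml server-side:
-- Cast the text parameter to xml inside the call itself
CALL my_proc(CAST(? AS xml))
-- or, equivalently, using the SQL/XML constructor
CALL my_proc(XMLPARSE(DOCUMENT ?))
That way the JPA layer only ever binds a plain String.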
I am unable to load the entire CLOB data using db2 load. Only the data up to a certain length ends up in the CLOB column of the table.
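One thing to check, as a hedged guess: if the input is a delimited (DEL) file, inline column values are capped (roughly 32 KB), so longer CLOBs get truncated. The usual approach is to keep each CLOB in its own file and load it with the lobsinfile modifier (paths and names below are illustrative):
db2 "LOAD FROM data.del OF DEL LOBS FROM /path/to/lobfiles MODIFIED BY lobsinfile INSERT INTO MYSCHEMA.MYTABLE"
with the CLOB column in data.del containing the LOB file's name rather than the data itself.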
I have a BLOB field in a table that I am selecting from. The field contains only JSON data.
If I do the following:
Select CAST(JSONBLOB as VARCHAR(2000)) from MyTable
--> this returns the value in VARCHAR FOR BIT DATA format.
I just want it as a standard string or VARCHAR - not in bit format.
That is because I need to use the JSON2BSON function to convert the JSON to BSON. JSON2BSON accepts a string, but it will not accept a VARCHAR FOR BIT DATA type...
This conversion should be easy.
I am able to do the select as VARCHAR FOR BIT DATA, manually copy the value using the UI, paste it into a select as a literal, and convert that to BSON. But I need to migrate a lot of data in this BLOB from JSON to BSON, and doing it manually won't be fast. I only mention this to show how simple a use case this should be.
What is the select command to essentially get this to work:
Select JSON2BSON(CAST(JSONBLOB as VARCHAR(2000))) from MyTable
--> Currently this fails because the CAST converts the value (even though it's only text characters) to VARCHAR FOR BIT DATA and not standard VARCHAR.
What is the suggestion to fix this?
DB2 11 on Windows.
If the data is JSON, then the table column should be CLOB in the first place...
Having the table column a BLOB might make sense if the data is actually already BSON.
You could change the BLOB into a CLOB using the CONVERTTOCLOB procedure, and then you should be OK.
https://www.ibm.com/support/knowledgecenter/SSEPGG_11.5.0/com.ibm.db2.luw.apdv.sqlpl.doc/doc/r0055119.html
You can use this function to remove the "FOR BIT DATA" flag on a column:
-- Accepts FOR BIT DATA input and returns it unchanged as plain
-- VARCHAR, which effectively drops the FOR BIT DATA flag
CREATE OR REPLACE FUNCTION DB_BINARY_TO_CHARACTER(A VARCHAR(32672 OCTETS) FOR BIT DATA)
RETURNS VARCHAR(32672 OCTETS)
NO EXTERNAL ACTION
DETERMINISTIC
BEGIN ATOMIC
RETURN A;
END
Or, if you are on Db2 11.5, the function SYSIBMADM.UTL_RAW.CAST_TO_VARCHAR2 will also work.
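Putting that together with the query from the question, the migration select would then look along these lines:
SELECT JSON2BSON(DB_BINARY_TO_CHARACTER(CAST(JSONBLOB AS VARCHAR(2000)))) FROM MyTable
The CAST still produces VARCHAR FOR BIT DATA, but DB_BINARY_TO_CHARACTER strips the flag, so JSON2BSON receives a plain VARCHAR.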
Saving a DataFrame to a table with VARBINARY columns throws this error:
com.microsoft.sqlserver.jdbc.SQLServerException: Column, parameter, or
variable #7: Cannot find data type BLOB
If I try to use VARBINARY in the createTableColumnTypes option, I get "VARBINARY not supported".
The workaround is:
Change the TARGET schema to use VARCHAR.
Add .option("createTableColumnTypes", "Col1 varchar(500), Col2 varchar(500)")
While this workaround lets us go ahead with saving the rest of the data, the actual binary data from the source table (from which the data is read) is not saved correctly for these two columns - we see NULL data.
We are using the MS SQL Server 2017 JDBC driver and Spark 2.3.2.
Any help or workaround to address this issue correctly, so that we don't lose data, is appreciated.
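One avenue that may be worth trying, assuming you can manage the target schema yourself: pre-create the table in SQL Server with VARBINARY(MAX) columns and write with SaveMode.Append, so Spark inserts into the existing table instead of generating DDL with an unsupported BLOB type (table and column names are illustrative):
-- Pre-created SQL Server target table; Spark then appends to it
-- rather than creating it, so it never has to emit a binary type
CREATE TABLE dbo.TargetTable (
    Col1 VARBINARY(MAX),
    Col2 VARBINARY(MAX)
);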
DB2 exception - DB2 SQL Error: SQLCODE=-433, SQLSTATE=22001 is thrown when inserting Base64 CLOB data with a length of 38K characters into a DB2 table CLOB field defined with a length of 10MB. The database insert is done via a stored procedure called by a MuleSoft flow. We've been unable to find the root cause or a solution to this. Has anyone seen this behaviour?
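For what it's worth, SQLCODE -433 with SQLSTATE 22001 indicates a value was too long for whatever was receiving it, so the truncation is probably happening at a declaration inside the stored procedure rather than at the 10MB column itself - for example a parameter or variable declared as VARCHAR(32672). A minimal sketch of a declaration sized to the column (procedure, table, and column names are hypothetical):
-- Size the parameter (and any intermediate variables) to the target
-- column, not to the VARCHAR limit; names here are hypothetical
CREATE OR REPLACE PROCEDURE INSERT_DOC (IN P_DOC CLOB(10M))
BEGIN
  INSERT INTO MYSCHEMA.DOCS (DOC_COL) VALUES (P_DOC);
END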