Updating a Firebird blob with a hex binary string through the isql utility

I am trying to update a Firebird 2.5 database through isql. One of the fields in the target database is BLOB SUB_TYPE -13 (a custom blob sub-type). I store gzipped textual data in that field. In my SQL script I use hexadecimal notation for the binary data, for example
x'789CBD5A6D6FE2B816FE7EA5FB1FBCB3D2EC8CD45B4842209416297D9919545AAA02339A5D8D90212E587901256176995FBFC776126C9240E8CBFDD012C7F6F1B17D9EE71C1FE7FCB76F0FA3E5D28B269F96A18F63F4F5C2D4AC5AF7BFFF39FFEC2DA7D87BC0739254FDBD9A45D1C53B0F074E34C32B7256EFACE027FCFB4CD3DAF5A6282CCEB4A665581D1F87F3F0ACA5D7F993973DC5D9D3347B9A2C0876C8B6F5E469B98C93B2439E623C8DE2E52AABA6814FC37019C2F8EF6ADDF3610C0AE1D0F9B40C321DAF163864E5B33FEC9062EF8F4EFA62487F1150B7CEBBC22483B53F2561146F3C1275CF6B4A11AAC5D38290B87BBEE20514609F5CBCBB674BE2BD434178A1C1A05075F1D75FB647E7814F60D87AA71738F0D007EDB3C2239D2F5869B8C233623FB119EAA0489F06E40BE1759A661668FAE307575519FF9A3CE1B517A3071CE27988570BC4FA707DF44C9FA4634D9EC52C5275CE56EAFDEF46BBF395840E0E307F5635B14011F662C864DDE1C83D4B74E52F06F76742CF73D82C50721649435C2EFFF9469D182CA3DE308CCEE53284CDFEE4E179C43BA53AE9399DB28185E49089D525B1C5728CA3E7068B5C71720E53C128514194D2A976B6D6A077AE06FD6FBDEBD1976409B6CA36AA2ACBFE220226D2D40F6E4C91EA333E5AADFBFE775D6B74C280B8483BAFC50E379038643F4223B3D23698F2EEDED12031E046D32CD8DF672E9B56BE6CCD375F364D5AB666AD3BB26F4168BDD1794023BB7F73290A850BD8AAB480ADE2056C35ACA2151492AD67A156B25CABDA16142C7B066BAB08D6AD66B368DBC3FD634AD36E374C699EED2AF394A6D53E725AA65EB7243B930D4BABFF5F2D4BABD7BADAFFD2B12BB3F2DEB125E95AAD8B52E1797A7DA170BDD6B53D3CC3AEC7C0607682E407FD24E82788C0C8C5C13AA028267E44BD198D6840452D71A9C70BD8119D4E50B4DC6C0B3F097879E452DFA32E029F0C4E2E4AEB46A757A72555585607846C703254364E80111060A30961056C3D28B0C12B51C3BA4295D5EA24DDA7387071D671E361B420115EA129F5E6D423216533D990D023D0C2D43AD467250A9EDE65E38524A2BB1A6D30089D250518CCD43B9E4B1C8296F13A5C6F966CCD46E27DE8D20DE17B02D573A1730411039911A1253D30764A4DC9EE57768AF2EE4B10D32AB2F7963AAC04D1A9219EA51A3D13B3464BA608ADB2E32C335FF085E75302C1241AE1F945436F221E2641680701969065F7ED2BFBB6DF9BD8D793E1E03BFCFF7A73DBE3E5C79B614F8CF12E5364B4819858EBDC2D1D029C928A695410D3E8745820E747671044EE6FFDE3074C02CD16173A8BED0E343E2781934D0FFA18221E549C966069693DF7D0348BC0A53DC8870A07232D5D75489AF98AD45D3928781E554314A067549D77F2479A5F0BA81926096036B48EB706A23CC09B8554793C07C560674F091982A68CCE40BCE09E72DA7D1976552E68E5B8E0782BC8C82095A840B99587F2E5E0F1AA3F7E21900F099161BCB7AD0AE2BD4D2508B70E40B85505C2A66E4A31F9ABE2EF79E16965FC59B5AE91E1AF5288B80F7F6DC0DF08BB7485985775C98600EC4201A3042A588033890F7CF12B5A00587CCAD121B5E0DD19AE4ECAC14A5DB0A6788E1759EBD784565B825672A8AE1CCF660BD556295A92F02C70B677C069E5C179D77BB48760FC8FE3BBFE78C890702C32F74A906159DE50C564793B0990D60140966FA604484D37942CC8F1A9199EBE90774CDB19B7028C4B9332958F0DCF02B50E8BDD4841AD1F9FB95141ADC35E207B37920FC4498042F81C908079551AE038ADCBE26DB4DABA3B16EDC7EB38852803EE13A6BFD8414238530C0AF15760D63E75B72DE16480A8E0080F8E18C02A539C14531A504E1973317CD049240B5D131DD50303D30F6641444B75860B1C80E70635E82266BEDF5D87C901E397880760DED8232B441CEA39343D4224F3A2E70EFD5966337252B3C04AD2755730DECE63FC93DDFB539BD857BDDBBE7D6757C575AE978C65B552C5AF5A2761B69D6116C92A1BF56295F567A9ACEF5359DFA3B25EA032A856ACB29657F97AF0B5F7E7044E05F77A557DD52EB2B2528DAAA95421A9A9496C08F624B1E22BB8366662AF1B35EE1AADA1E7977364DFF61E26A3F1C87EEC4D3E0D1EEFC6FDAACB5ADC555EDE8216EA32173490965BDFEF7CF4F203BBEC7C0CABFE46E1A0FEBC137A65CF01E77733F31CC79F44773C87C9C34139FC4BD259210AC922A0890F592D815E81E239417332F513B6474B8F9FA6C294C819DF4E19F57201AC294F3805045A3A6BC1C06BA92EE1EB34A2E45E61267E9118022F844A11B89FDC390ECE6E92325C5B3F4D31D10D896915F910D6CE89689189E21E478A6095449DA23E6B4E36AADF6101701032B1AC0D34A7CC59B96E2A8F9F77A76BFE0EA3082F2020E72BE926E3EF4B1356CB80BD06FB9839F6796158CC242AEC63E4D9A7F73018DDDC4E44287A7
F3319F4AFC79FC75509A8B4B7CC41C58D541A2A6E23319151E29F1AA5531ADA5F7A97BDC9F86A7C7F35E6A5E191D32A965030B5828685D32B68274DB171806CCB7953225BCB7CABA3B7FEB6A92FBD59EB3633AE7D69EA4B6FA547EF130897812DF466C7C709C1A665B41034E2262426803D25DE9CC5D4C94B16D1D39833D4B62397231A32D60E81767C0CA7EE3454DED247511FC6363C8CA6276A068C519F48C3FF02327D4D6EC9E7C3F4F24CCE016E2934CE4A79A146FDCD8CF36DF342BA55EBB632E37C695E48677921DB613ED04ADCE82235010F4E642E0DF136D1831106FB8B451A89FBFBD7348C76DE30764FF6473A9D9D5C8C61E619FA76307AB44755D9386B2D33AF78A9B2AC782731AA59E2340AAE6106D737F77737D7BDCFBD0984C4D7BDC7EB9BAAFA157795952D68A16A5ED0409AC6816B15BD520AC8B4DE0A7BC6F1E9BF63B067D46B5D2BC59EF1BC8F8A2469ECBA7A447E3A14253165440212A30F0F4B8F8AA814E249E287E000D6202624A1687182925AF7234AF99BF1F5CEF5F096BE4F783F08CD1DBCC1E9CD8893DEAC7AE04FA6CB70B6660A4C5F17D446FE26F485E75826514150C9EDC7F87E32BCB9BCB93CEAB643EEB47BBB91D5E56F33B22A0929076E2FAA21C5A82B5F141D1D83EF41CADB263A0D589B76869497263A8D2CD199FBF84238A3CDD2DB7A29763CE40942769108262DC012A75940D51A8D635D8C6ADEF934CDD10277CC7B374D53707F201225DF07FDCA8723A5473E29C32A8A7231ECBD64D2A5F9FF1AFF5C6A7B0B63E47321EAAEB2FC2B5BABB25B781EDBA94935053D7B3B1EF5F9DD761B5A665B0D1DBEAF5D96F78638FC43E3F423C4CB410C14BAC1BF0A12DBBE74CADF86DEA9B1FA292927669B25A7E108EF9FEED8CF917A379A603EF24784E5E14F419A3AC9B9D98FBD2F958972A74F418E8F571526F7788D6452DBF474414EBEC21793A5936D1424B87BFDDE905F35F31BE7CA27F0DD5ECAA95BA9DC39692B75DB4937EAA599E31C9EF65C54C86BA4D5F586BA161FDEF32F3667A7483C3C9DA2DB53649D98D6C774D07F01CA919662'
Unfortunately, isql fails with the message
Statement failed, SQLSTATE = HY000
filter not found to convert type 1 to type -13
I have tried casting the value as
cast(x'789CBD5A6D6FE2B816FE7EA5FB1FBCB3D...' as blob sub_type -13)
but I still get the same error.

The workaround for this is to use:
cast(x'...' as blob sub_type binary)
As I mentioned in the comments, you could also do
cast(cast(x'...' as blob sub_type binary) as blob sub_type -13)
But as the second cast is not necessary, I would recommend using the shorter form.
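For example, a complete UPDATE using this workaround could look like the following (documents, payload and id are hypothetical names, and the hex payload is truncated here for readability):
-- documents/payload are made-up names; payload is the blob sub_type -13 column
update documents
set payload = cast(x'789CBD5A6D6FE2B816' as blob sub_type binary)
where id = 1;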
The problem is that for some reason the x'...' literal is coerced to blob sub_type text instead of the (in my opinion more logical) blob sub_type binary. There is an implicit conversion from sub_type binary to other user-defined sub-types, but not from blob sub_type text, so without defining a FILTER to do the conversion for you, you need to convert to blob sub_type binary first.
Sub-types binary and text are system-defined aliases for sub-types 0 and 1, respectively. You can also add aliases for user-defined blob sub-types in the table RDB$TYPES and use those in your statements instead of the numeric sub-types.
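A rough sketch of registering such an alias on Firebird 2.5, which still permits direct inserts into system tables (the mnemonic GZIPPED_TEXT is made up for illustration; later Firebird versions reject direct system-table writes):
-- Register an illustrative mnemonic for custom blob sub_type -13
insert into RDB$TYPES (RDB$FIELD_NAME, RDB$TYPE, RDB$TYPE_NAME)
values ('RDB$FIELD_SUB_TYPE', -13, 'GZIPPED_TEXT');
commit;
-- The mnemonic should then be usable in place of the number, e.g. in DDL:
-- create table documents (payload blob sub_type GZIPPED_TEXT);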
I filed a ticket in the Firebird tracker: CORE-6389 (fixed in Firebird 4).

Related

How to write array[] bytea using libpq PQexecParams?

I need help with writing a bytea[] array using libpq's PQexecParams.
I've done a simple version where I write a single binary value into a single bytea argument using PQexecParams, following this solution: Insert Binary Large Object (BLOB) in PostgreSQL using libpq from remote machine
But I have problems writing a bytea[] array, like this:
select * from func(array[3, 4], array[bytea, bytea])
Unless you want to read the PostgreSQL source to figure out the binary format for arrays, you will have to use the text format, e.g.
{\\xDEADBEEF,\\x00010203}
The binary format for arrays is defined in array_send in src/backend/utils/adt/arrayfuncs.c. Have a look at the comments and definitions in src/include/utils/array.h as well. Consider also that all integers are sent in “network byte order”.
One way to examine the binary output format, and save yourself a lot of trouble experimenting, is to use a binary COPY, e.g.
COPY (SELECT ARRAY['\xDEADBEEF'::bytea,'\x00010203'::bytea])
TO '/tmp/file' (FORMAT 'binary');
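If you go the text-format route, you can sanity-check the literal on the server before wiring it into PQexecParams (a minimal sketch; the surrounding function call is whatever your func expects):
-- If this cast parses, the same string can be passed as a text-format parameter
SELECT '{"\\xDEADBEEF","\\x00010203"}'::bytea[];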

PostgreSQL - Converting Binary data to Varchar

We are working on migrating databases from MSSQL to PostgreSQL. During this process we came across a situation where a table contains a password field of NVARCHAR type, and this field's value was converted from VARBINARY and stored as NVARCHAR.
For example: if I execute
SELECT HASHBYTES('SHA1','Password')
then it returns 0x8BE3C943B1609FFFBFC51AAD666D0A04ADF83C9D, and if this value is in turn converted to NVARCHAR, it returns text in the format "䏉悱゚얿괚浦Њ鴼"
Since PostgreSQL doesn't support VARBINARY, we have used BYTEA instead, and it returns binary data. But when we try to convert this binary data to VARCHAR, it returns the hex format.
For example, if the equivalent statement is executed in PostgreSQL
SELECT ENCODE(DIGEST('Password','SHA1'),'hex')
then it returns
8be3c943b1609fffbfc51aad666d0a04adf83c9d.
When we try to convert this encoded text to VARCHAR, it returns the same result: 8be3c943b1609fffbfc51aad666d0a04adf83c9d.
Is it possible to get the same result we retrieved from the MSSQL server? As these are password fields, we do not intend to change the values. Please suggest what needs to be done.
It sounds like you're taking a byte array containing a cryptographic hash and you want to convert it to a string to do a string comparison. This is a strange way to do hash comparisons but it might be possible depending on which encoding you were using on the MSSQL side.
If you have a byte array that can be converted to string in the encoding you're using (e.g. doesn't contain any invalid code points or sequences for that encoding) you can convert the byte array to string as follows:
SELECT CONVERT_FROM(DIGEST('Password','SHA1'), 'latin1') AS hash_string;
hash_string
-----------------------------
\u008BãÉC±`\u009Fÿ¿Å\x1A­fm+
\x04­ø<\u009D
If you're using Unicode, this approach won't work at all, since arbitrary binary data can't be converted to Unicode: certain byte sequences are always invalid. You'll get an error like the following:
# SELECT CONVERT_FROM(DIGEST('Password','SHA1'), 'utf-8');
ERROR: invalid byte sequence for encoding "UTF8": 0x8b
Here's a list of valid string encodings in PostgreSQL. Find out which encoding you're using on the MSSQL side and try to match it in PostgreSQL. If you can, I'd recommend changing your business logic to compare byte arrays directly, since this will be less error-prone and should be significantly faster.
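For example, a direct byte-array comparison needs no string conversion at all (a minimal sketch assuming a hypothetical users table with a bytea column password_hash and the pgcrypto extension installed):
-- Requires: CREATE EXTENSION pgcrypto;  users/id/password_hash are made-up names
SELECT id
FROM users
WHERE password_hash = digest('Password', 'sha1');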
then it returns 0x8BE3C943B1609FFFBFC51AAD666D0A04ADF83C9D, and if this value is in turn converted to NVARCHAR, it returns text in the format "䏉悱゚얿괚浦Њ鴼"
Based on that, MSSQL interprets these bytes as text encoded in UTF-16LE.
With PostgreSQL and using only built-in functions, you cannot obtain that result because PostgreSQL doesn't use or support UTF-16 at all, for anything.
It also doesn't support nul bytes in strings, and there are nul bytes in UTF-16.
This Q/A: UTF16 hex to text suggests several solutions.
Changing your business logic not to depend on UTF-16 would be your best long-term option, though. The hexadecimal representation, for instance, is simpler and much more portable.
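If you do move to the hexadecimal representation, the comparison is simple on both sides (a sketch assuming pgcrypto is installed; the uppercase string is the value MSSQL produced):
-- encode() emits lowercase hex, so normalise the other side with lower()
SELECT encode(digest('Password', 'sha1'), 'hex')
       = lower('8BE3C943B1609FFFBFC51AAD666D0A04ADF83C9D') AS matches;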

BLOB to String conversion - DB2

A table in DB2 contains BLOB data. I need to convert it to a String so that it can be viewed in a readable format. I have tried options like:
getting blob object and converting to byte array
string buffer reader
sqoop import using --map-column-java and --map-column-hive options.
Even after these conversions I am not able to view the data in a readable format. It is still in an unreadable form like 1f8b0000...
Please suggest how to handle this scenario.
I think you need to look at the CAST function.
SELECT CAST(BLOB_VAR as VARCHAR(SIZE) CCSID UNICODE) as CHAR_FLD
Also, be advised that the maximum value of SIZE is 32K.
Let me know if you tried this.
1f8b0000 indicates that the data is in gzip form (1F 8B is the gzip magic number), so you have to decompress it before it becomes readable.

BizTalk and DB2 CLOB

Has anyone had experience dealing with DB2 stored procedures that take a CLOB input parameter, and calling such a stored procedure from BizTalk?
I've tried changing the schema type to string, base64binary, hexbinary, byte, but no matter what I get this error:
Error details: The parameter value for parameter 1 could not be converted to a native data type. Parameter Name: P_EML_BODY, Data Type: Long strings of input text
More long strings of input text
More long strings of input text, Value : CharForBit
It might be the way the parameters are being created and in what order. Take a look here. Are any of them null or empty?

Write query for inserting varbinary value in PostgreSQL

What is the syntax in PostgreSQL for inserting varbinary values?
I tried SQL Server's syntax, using a constant like 0xFFFF, but it didn't work.
Given that there's no "varbinary" data type in Postgres, I believe you mean "bytea". Take a look at the docs on how to specify "bytea" literals.
Depending on the language and the bindings you use, there could be more sophisticated ways of transferring binary data; you can find a .NET/C#/Npgsql example here (under "Working with binary data and bytea datatype").
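For completeness, a minimal sketch of inserting a binary constant into a hypothetical table t with a bytea column b, using the hex bytea literal format and decode():
-- hex-format bytea literal, the rough equivalent of SQL Server's 0xFFFF
INSERT INTO t (b) VALUES ('\xFFFF'::bytea);
-- or build the value from a hex string with decode()
INSERT INTO t (b) VALUES (decode('FFFF', 'hex'));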