Write query for inserting varbinary value in PostgreSQL - postgresql

What is the syntax in PostgreSQL for inserting varbinary values?
I tried SQL Server's syntax of using a constant like 0xFFFF, but it didn't work.

Given that there's no "varbinary" data type in Postgres, I believe you mean "bytea". Take a look at the docs on how to specify "bytea" literals.
Depending on the language and the bindings you use, there may be more sophisticated ways of transferring binary data - you can find a .Net/C#/Npgsql example here (under "Working with binary data and bytea datatype").
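For example, a minimal sketch using a throwaway table with a bytea column:
create table t (b bytea);
insert into t (b) values ('\xFFFF');               -- hex input format (PostgreSQL 9.0 and later)
insert into t (b) values (decode('FFFF', 'hex'));  -- decode() works on older versions as well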

Related

How to write array[] bytea using libpq PQexecParams?

I need help writing a bytea[] array using libpq's PQexecParams.
I've done a simple version that writes a single piece of binary data into a single bytea argument using PQexecParams, following this solution: Insert Binary Large Object (BLOB) in PostgreSQL using libpq from remote machine
But I have problems writing a bytea[] array, like this:
select * from func(array[3, 4], array[bytea, bytea])
Unless you want to read the PostgreSQL source to figure out the binary format for arrays, you will have to use the text format, e.g.
{\\xDEADBEEF,\\x00010203}
The binary format for arrays is defined in array_send in src/backend/utils/adt/arrayfuncs.c. Have a look at the comments and definitions in src/include/utils/array.h as well. Consider also that all integers are sent in network byte order.
One way to examine the binary output format and save yourself a lot of trouble experimenting is to use binary COPY, e.g.
COPY (SELECT ARRAY['\xDEADBEEF'::bytea,'\x00010203'::bytea])
TO '/tmp/file' (FORMAT 'binary');
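For reference, assuming the asker's func(int[], bytea[]) signature (the function name and argument types are taken from the question), the equivalent call in plain SQL - which is what the text-format parameter has to spell out - would be:
select * from func(array[3, 4],
                   array['\xDEADBEEF'::bytea, '\x00010203'::bytea]);
With PQexecParams, the second parameter would then be passed in text mode as the array literal {"\\xDEADBEEF","\\x00010203"}.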

Dealing with parsing oids in Postgres

I'm currently improving a client library for PostgreSQL; the library already has a working communication protocol, including DataRow and RowDescription.
The problem I'm facing right now is how to deal with the values.
Returning a plain string for, say, an array of integers is kind of pointless.
From my research I found that other libraries (for Python, for example) either return the value as an unmodified string or convert primitive types, including arrays.
By conversion I mean turning the raw Postgres DataRow data into native values: a Postgres integer is parsed as a Python number, a Postgres boolean as a Python boolean, etc.
Should I make a second query to get the column type information and use converters based on it, or should I leave the values plain?
You could opt to get the array values in the internal format by setting the corresponding "result-column format code" in the Bind message to 1, but that is typically a bad choice, since the internal format varies from type to type and may even depend on the server's architecture.
So your best option is probably to parse the string representation of the array on the client side, including all the escape characters.
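For example, the text representation the client has to parse quotes elements that contain special characters and backslash-escapes embedded quotes:
SELECT ARRAY['a,b', 'c"d', NULL]::text[];
        array
---------------------
 {"a,b","c\"d",NULL}
(1 row)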
When it comes to finding the base type for an array type, there is no other option than querying pg_type like
SELECT typelem::regtype FROM pg_type WHERE oid = 1007;
typelem
---------
integer
(1 row)
You could cache these values on the client side so that you don't have to query more than once per type and database session.
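For example, a client could prefetch the element type for every array type in a single query and cache the oid-to-element-type mapping for the session (just one possible approach):
SELECT oid, typelem::regtype AS element_type
FROM pg_type
WHERE typcategory = 'A';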

Db2 for i: CPYF *NOCHK emulation

On the IBM i system there's a way to copy from a structured file to one without structure using CPYF with FMTOPT(*NOCHK).
How can it be done with SQL?
The answer may be "you can't", at least not if you are using DDL-defined tables. The problem is that *NOCHK just dumps data into the file as if it were a flat file. Files defined with CRTPF, whether they have source or are program defined, don't care about bad data until read time, so they can contain bad data. In fact, you can even read bad data out of a file if you use a program definition for that file.
But, an SQL Table (one defined using DDL) cannot contain bad data. No matter how you write it, the database validates the data at write time. Even the *NOCHK option of the CPYF command cannot coerce bad data into an SQL table.
There really isn't an easy way.
The closest would be to just build a big character string using CONCAT...
insert into flatfile
select mycharfld1
concat cast(myvchar as char(20))
concat digits(zonedFld3)
from mytable
That works for fixed-length character, varchar (if cast to char), and zoned decimal...
Packed decimal would be problematic...
I've seen user-defined functions that can return the binary character string that makes up a packed decimal... but it's very ugly.
I question why you think you need to do this.
You can use the QSYS2.QCMDEXC stored procedure to execute OS commands.
Example:
call qsys2.qcmdexc ( 'CPYF FROMFILE(QTEMP/FILE1) TOFILE(QTEMP/FILE2) MBROPT(*replace) FMTOPT(*NOCHK)' )

PostgreSQL - Converting Binary data to Varchar

We are working on migrating databases from MSSQL to a PostgreSQL database. During this process we came across a situation where a table contains a password field of NVARCHAR type whose value was originally VARBINARY and was converted and stored as NVARCHAR.
For example, if I execute
SELECT HASHBYTES('SHA1','Password')
then it returns 0x8BE3C943B1609FFFBFC51AAD666D0A04ADF83C9D, and if this value is in turn converted to NVARCHAR, it returns text in the format "䏉悱゚얿괚浦Њ鴼".
As we know, PostgreSQL doesn't support VARBINARY, so we have used BYTEA instead, and it returns binary data. But when we try to convert this binary data to VARCHAR, it returns the hex format.
For example: if the same statement is executed in PostgreSQL
SELECT ENCODE(DIGEST('Password','SHA1'),'hex')
then it returns
8be3c943b1609fffbfc51aad666d0a04adf83c9d.
When we try to convert this encoded text to VARCHAR, it returns the same result, 8be3c943b1609fffbfc51aad666d0a04adf83c9d.
Is it possible to get the same result that we retrieved from the MSSQL server? As these are password fields, we do not intend to change the values. Please suggest what needs to be done.
It sounds like you're taking a byte array containing a cryptographic hash and you want to convert it to a string to do a string comparison. This is a strange way to do hash comparisons but it might be possible depending on which encoding you were using on the MSSQL side.
If you have a byte array that can be converted to string in the encoding you're using (e.g. doesn't contain any invalid code points or sequences for that encoding) you can convert the byte array to string as follows:
SELECT CONVERT_FROM(DIGEST('Password','SHA1'), 'latin1') AS hash_string;
hash_string
-----------------------------
\u008BãÉC±`\u009Fÿ¿Å\x1A­fm+
\x04­ø<\u009D
If you're using Unicode this approach won't work at all, since arbitrary byte arrays can't be converted to Unicode: certain byte sequences are always invalid. You'll get an error like the following:
# SELECT CONVERT_FROM(DIGEST('Password','SHA1'), 'utf-8');
ERROR: invalid byte sequence for encoding "UTF8": 0x8b
Here's a list of valid string encodings in PostgreSQL. Find out which encoding you're using on the MSSQL side and try to match it to PostgreSQL. If you can I'd recommend changing your business logic to compare byte arrays directly since this will be less error prone and should be significantly faster.
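For example, if the hash is stored in a bytea column, the check can stay entirely in bytea (the table and column names here are hypothetical, and digest() comes from the pgcrypto extension):
SELECT *
FROM users
WHERE password_hash = digest('Password', 'sha1');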
then it returns 0x8BE3C943B1609FFFBFC51AAD666D0A04ADF83C9D and in turn
if this value is converted into NVARCHAR then it is returning a text
in the format "䏉悱゚얿괚浦Њ鴼"
Based on that, MSSQL interprets these bytes as text encoded in UTF-16LE.
With PostgreSQL and using only built-in functions, you cannot obtain that result because PostgreSQL doesn't use or support UTF-16 at all, for anything.
It also doesn't support nul bytes in strings, and there are nul bytes in UTF-16.
This Q/A: UTF16 hex to text suggests several solutions.
Changing your business logic not to depend on UTF-16 would be your best long-term option, though. The hexadecimal representation, for instance, is simpler and much more portable.
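For example, with the hash stored as its hex representation in a plain text column (again, the table and column names are hypothetical), the comparison becomes:
SELECT *
FROM users
WHERE password_hash_hex = encode(digest('Password', 'sha1'), 'hex');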

import csv files on postgres numeric types

I need to import a file into my Postgres database, and I get this error:
invalid input syntax for integer in fabrica, "1";
SQL state: 22P02
my command is:
copy trazabilidade(fabrica,      -- integer
                   idChapa,      -- integer
                   descricao,    -- varchar
                   espessura,    -- double precision
                   comprimento,  -- double precision
                   largura,      -- double precision
                   peso)
from 'C:/temp_nexo/traz.csv' delimiter ';';
How can I import data from a CSV file into columns with numeric types?
The COPY page on the PostgreSQL wiki (http://wiki.postgresql.org/wiki/COPY) lists this as a known limitation:
Can not extend Pg coercions
The data-loading mechanism relies on the data being a formal representation of a Pg data type, or coercible (e.g. castable) by Pg. However, there isn't currently a way to add custom coercions for the Pg types. You cannot, for instance, make '31,337'::int work by overriding the coercion to an int.
Among the alternatives it suggests is pgloader.
pgloader is much better at loading error-prone data in a more flexible format than the built-in COPY is. The downsides are additional install complexity (Python+psycopg+configuration) and a sometimes significant speed loss compared with the built-in COPY.
As per Denis's reply about the COPY command, you can't add custom coercions to Postgres COPY. If pgloader is overkill, you can load your data into a temp table and from there examine it, then cast/trim/manipulate any data you think should be valid.
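A minimal sketch of that staging-table approach, reusing the column names from the COPY command above (the quote trimming and the decimal-comma note are assumptions based on the error message; adjust them to match the actual file):
create temp table trazabilidade_raw (
    fabrica      text,
    idChapa      text,
    descricao    text,
    espessura    text,
    comprimento  text,
    largura      text,
    peso         text
);

copy trazabilidade_raw from 'C:/temp_nexo/traz.csv' delimiter ';';

insert into trazabilidade (fabrica, idChapa, descricao, espessura, comprimento, largura, peso)
select trim(both '"' from fabrica)::integer,   -- strip the quotes the error message complains about
       trim(both '"' from idChapa)::integer,
       descricao,
       espessura::double precision,            -- apply replace(espessura, ',', '.') first if the file uses decimal commas
       comprimento::double precision,
       largura::double precision,
       peso::double precision
from trazabilidade_raw;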