Grails byte array and PostgreSQL

I'm trying to implement the Simple Avatar Uploader plugin on my User domain class, but I seem to have run into a conflict between Grails' handling of byte[] and PostgreSQL. I have implemented it exactly as the plugin page suggests.
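Roughly, the relevant part of the domain class looks like this (a minimal sketch; the property name and the maxSize value are the ones from the plugin example, the rest is reconstructed):

class User {
    byte[] avatar

    static constraints = {
        avatar maxSize: 16384
    }
}

On compilation, however, I get the error: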
Error: Error executing SQL ALTER TABLE user ADD avatar bytea(16384): ERROR: type modifier is not allowed for type "bytea"
I have found some advice suggesting that PostgreSQL does not accept a size modifier on bytea, but removing the maxSize: 16384 constraint only leads to the exact same error with a different size:
Error: Error executing SQL ALTER TABLE user ADD avatar bytea(255): ERROR: type modifier is not allowed for type "bytea"
So it seems that Grails will automatically set the size to 255 if no maxSize is provided. Is there a way to override this? Or perhaps a more suitable data type for byte arrays?
Thanks!

Not sure whether it was directly responsible or not, but we are using the Grails Database Migration plugin, and we solved the issue by editing the latest migration script, changing the line
column(name: "avatar", type: "bytea(255)")
to
column(name: "avatar", type: "bytea")
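For context, the edited changeSet ended up looking roughly like this (a sketch only; the author/id values and the addColumn wrapper are placeholders, the column line is the part that matters):

databaseChangeLog = {
    changeSet(author: "...", id: "...") {
        addColumn(tableName: "user") {
            // no length modifier here; PostgreSQL rejects bytea(n)
            column(name: "avatar", type: "bytea")
        }
    }
}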

Related

Postgresql push uuid field to Clickhouse data type error

I am trying to push data from PG-12 to ClickHouse via an FDW extension ("clickhouse_fdw") and it gives me this error:
SQL Error [HV004]: ERROR: cannot convert constant value to clickhouse value
Hint: Constant value data type: 2950
Some sleuthing pointed me to the uuid column; if that field is excluded, the data transfer goes through. If I cast it to text, it gives an error that the target field is uuid, not text. I am confused. Which data type cast is appropriate here? Please advise, DB gurus :) Thanks in advance.

Problems with setting the dataType in DBeaver

Whenever I try to choose a data type for any column I get this message. I don't type it myself but choose from the drop-down menu instead.
The details message is as follows:
Error setting property 'dataType' value
Bad data type name specified: serial
It can be any type instead of serial; the resulting message is the same.
The database is PostgreSQL.
Same problem :(
DBeaver 5.2.5, Fedora 29, PostgreSQL 9.6

Issue with UUID datatype when loading a Postgres table using Apache NiFi

Database
Postgres 9.6
Contains several tables that have a UUID column (containing the ID of each record)
NiFi
Latest release (1.7.1)
Uses Avro 1.8.1 (as far as I know)
Problem description
When scheduling the tables using the ExecuteSQL processor, the following error message occurs:
ExecuteSQL[id=09033e32-e840-1aed-3062-6e8cbc5551ba] ExecuteSQL[id=09033e32-e840-1aed-3062-6e8cbc5551ba] failed to process session due to createSchema: Unknown SQL type 1111 / uuid (table: country, column: id) cannot be converted to Avro type; Processor Administratively Yielded for 1 sec: java.lang.IllegalArgumentException: createSchema: Unknown SQL type 1111 / uuid (table: country, column: id) cannot be converted to Avro type
Note that the flowfiles aren't removed from the incoming queue, nor sent to the 'failure' relationship, resulting in an endless loop of failing attempts.
Attempts to fix issue
I tried enabling the Use Avro Logical Types property of the ExecuteSQL processor, but the same error occurred.
Possible but not preferred solutions
I currently perform a SELECT * from each table. A possible solution (I think) would be to specify each column, and have the query cast the uuid to a string. Though this could work, I'd strongly prefer not having to list every column separately.
A last note
I did find this Jira ticket: https://issues.apache.org/jira/browse/AVRO-1962
However, I'm not sure how to interpret this. Is it implemented or not? Should it work or not?
I believe UUID is not a standard JDBC type and is specific to Postgres.
The JDBC types class shows that SQL type 1111 is "OTHER":
/**
* The constant in the Java programming language that indicates
* that the SQL type is database-specific and
* gets mapped to a Java object that can be accessed via
* the methods <code>getObject</code> and <code>setObject</code>.
*/
public final static int OTHER = 1111;
So I'm not sure how NiFi could know what to do here because it could be anything depending on the type of DB.
Have you tried creating a view where you define the column as ::text?
SELECT
"v"."UUID_COLUMN"::text AS UUID_COLUMN
FROM
...
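Concretely, using the table and column from your error message (the view name and the extra column are just placeholders), the view could look something like this; ExecuteSQL would then SELECT from the view instead of the table:

CREATE OR REPLACE VIEW country_for_nifi AS
SELECT
    id::text AS id,  -- uuid cast to text, so the JDBC driver reports varchar instead of OTHER (1111)
    name             -- non-uuid columns pass through unchanged (placeholder name)
FROM country;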

Is it possible to read data from a bytea field using the getBlob() method in PostgreSQL 9.2

I am using PostgreSQL 9.2, and when I read data from a bytea field (used to store binary large object data in PostgreSQL) using the getBlob() method, I keep getting the same error:
Error: Bad value for type long
I know that I'm getting this error because only the getBinaryStream() and getBytes() methods work with bytea, while getBlob() works with OID fields. But I have a situation where I need getBlob(), so I would like to know: is there any way to read data from a bytea column using getBlob() in PostgreSQL 9.2? Thank you in advance.
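For reference, reading the column with the accessors that do work with bytea looks roughly like this (connection details, table and column names are placeholders, not my real schema):

import java.io.InputStream;
import java.sql.*;

Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost/mydb", "user", "pass");
PreparedStatement ps = conn.prepareStatement("SELECT payload FROM documents WHERE id = ?");
ps.setLong(1, 1L);
ResultSet rs = ps.executeQuery();
if (rs.next()) {
    byte[] data = rs.getBytes("payload");            // supported for bytea
    InputStream in = rs.getBinaryStream("payload");  // also supported for bytea
    // rs.getBlob("payload") only works when the column is an oid (large object), which is what I'm asking about
}
rs.close();
ps.close();
conn.close();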

'xx' property on 'yyy' could not be set to a 'String' value. You must set this property to a non-null value of type 'Int32'

I am facing this problem for an unknown reason, and I have tried every forum and blog to solve it but could not find any satisfactory answer.
Let me describe the scenario.
I have a view in the database which consists of columns from two tables. Neither of the tables has any column with data type "int", hence the resultant view (let's name it "MyRecord") also does not have any column with an "int" data type. All the columns in the view have varchar as their datatype.
Now, in my .edmx I add this view and the model ("MyRecord") is created fine, with all the properties created with datatype "String". I am using Silverlight with RIA Services, so after building the application the related proxies are also created fine without any conflict.
The problem starts when I try to query "MyRecord" using my domain context; I get the following error.
Load operation failed for query 'GetMyRecords'. The 'CenterCode' property on 'MyRecord' could not be set to a 'String' value. You must set this property to a non-null value of type 'Int32'.
As seen in the error, it is clearly forcing me to convert the data type of the "string" column "CenterCode" to "Int32", which is totally useless and unnecessary for me. The "String" or "varchar" columns are there because they have some business importance, and changing them to "Int32" or "int" might break the application in future. It's true that the "CenterCode" column currently contains numeric data only, but there may be character data in future; that's why it was created with the 'varchar' datatype.
I cannot change the type of my data just because EF does not support it.
I used SQL Server Profiler: the query is being executed correctly, and I can run the same query in SSMS without any error. The error comes in the application only when EF is building objects from the data returned by the query.
I fail to understand why Entity Framework is throwing this error; it is simply not converting "varchar" to "String" and is unnecessarily bringing "Int32" into the picture, making life difficult. I have been struggling with this issue for the last 4 hours and have tried every possible way to resolve it, but everything is in vain.
Please provide some information or a solution if anyone has one.
EF team, you must have some answer to this question or a workaround for this problem.
I had the same problem with a double datatype.
Solution:
Change your view/procedure and cast the column to the SQL type that matches what the EF model expects, e.g. CAST(columnname AS int).
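Applied to your case, that would look something like this (a sketch only; the base table and the extra column are placeholders, the CAST on CenterCode is the point):

ALTER VIEW dbo.MyRecord AS
SELECT
    CAST(CenterCode AS int) AS CenterCode,  -- matches the Int32 the EF model expects
    OtherColumn                             -- remaining columns pass through unchanged (placeholder)
FROM dbo.SomeBaseTable;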
Not sure if you solved this problem or not, but I just ran into something like this while working on multiple result sets with EF. In my case, I had reader.NextResult() that was causing a problem for me because I hadn't read all the records from the previous result and I think EF was failing due to trying to map data from the second result set into the first object.
CAST(columnName AS Type) solved my problem in a stored procedure.