PostgreSQL - querying jsonb throws a syntax error

I have a table that has a column data of jsonb type.
create table event
(
    id bigserial primary key,
    created_at timestamp with time zone default now() not null,
    type text not null,
    created_by text,
    data jsonb,
    event_time timestamp with time zone default now() not null
);
In that field I am saving a json object that looks like this:
{
    "comment": "Changed by recipient",
    "source": "Recipient page"
}
I would like to query rows in that table by the value of the comment property of the data JSON object. Something like this, based on the examples [here][1]:
select * from event
where type = 'pickup-data-changed'
and data -> 'comment' = 'Changed by recipient'
If I query like that I get an invalid token error:
[22P02] ERROR: invalid input syntax for type json Detail: Token "Changed" is invalid. Position: 104
What am I doing wrong here?
If I do it as a double arrow like suggested in the comments:
select * from event
where type = 'pickup-data-changed'
and data ->-> 'comment' = 'Changed by recipient'
I get an error:
[42883] ERROR: operator does not exist: jsonb ->-> unknown Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
How can I make this query work?
[1]: https://kb.objectrocket.com/postgresql/how-to-query-a-postgres-jsonb-column-1433

I get an invalid token error. What am I doing wrong here?
data -> 'comment' returns a value of type jsonb, so the right hand side of the comparison 'Changed by recipient' is parsed as JSON as well - and it's invalid JSON. To create a JSON string value to compare against, you'd need to write
… data -> 'comment' = '"Changed by recipient"'
If I do it as a double arrow like suggested in the comments, data ->-> 'comment'
The comments suggested
… data ->> 'comment' = 'Changed by recipient'
not ->->.
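Written out against the same table and filter, that would be:
select * from event
where type = 'pickup-data-changed'
and data ->> 'comment' = 'Changed by recipient';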

Alternatives:
select * from event
where type = 'pickup-data-changed'
and data -> 'comment' = '"Changed by recipient"'::jsonb;
or, using jsonb subscripting (available in PostgreSQL 14 and later):
select * from event
where type = 'pickup-data-changed'
and data['comment'] = '"Changed by recipient"'::jsonb;
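A containment check with @> is one more way to express the same condition on the data column:
select * from event
where type = 'pickup-data-changed'
and data @> '{"comment": "Changed by recipient"}'::jsonb;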

Related

How to append to array field with bulk_update

I have an ArrayField in a Peewee model backed by a PostgreSQL DB. How can I append to that field on conflict while upserting in bulk?
Model:
class User(Model):
    id = BigAutoField(primary_key=True, unique=True)
    tags = ArrayField(CharField, null=True)
SQL query I want to execute:
update "user" as u set tags = array_append(u.tags, val.tag)
from (values (2, 'test'::varchar)) as val(id, tag)
where u.id = val.id;
I've been trying to figure this out even for a single insert, but I'm facing issues with typecasting:
User.insert(user).on_conflict(
    conflict_target=[User.id],
    update={User.tags: fn.array_append(user['tag'])}
).execute()
Error I'm getting:
function array_append(unknown) does not exist
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
How do I type cast the text to varchar in peewee?
You can try the following:
User.insert(user).on_conflict(
    conflict_target=[User.id],
    # array_append() takes two arguments: the existing array column and the
    # new element; Value (from peewee) wraps the plain string so it can be cast
    update={User.tags: fn.array_append(User.tags, Value(user['tag']).cast('varchar'))}
).execute()
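For a single row this should boil down to an upsert roughly like the following (a sketch of the intended SQL, assuming the table is named "user" and the incoming row is id=2 with tag 'test', as in the VALUES clause above):
-- sketch: append 'test' to tags for id=2, inserting the row if it does not exist
insert into "user" (id, tags)
values (2, array['test']::varchar[])
on conflict (id)
do update set tags = array_append("user".tags, 'test'::varchar);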

How to avoid that 0 (zero) int turns into Postgres "null" value and violates "not null" constraint?

In Go, I am unmarshalling/decoding JSON into a struct with an ID field of type int. Then I try to insert this struct into a PostgreSQL database using go-pg with the ID column as the primary key (which has a not-null constraint). The first entry has a 0 as its ID. In the Postgres documentation, it states that 0 is ok as a value of a primary key. However, I keep getting an error message:
"ERROR #23502 null value in column "number" violates not-null constraint".
It looks like the 0 becomes the Go "zero value" when it is unmarshalled into the int field, and is then inserted as a null value into Postgres. Any tips on how I might be able to avoid this would be greatly appreciated.
type Account struct {
    Number int    `sql:"type:smallint, pk"`
    Name   string
}
[...]
account := Account{}
err := json.NewDecoder(r.Body).Decode(&account)
[...]
insertErr := pgLayer.db.Insert(&account)
if insertErr != nil {
    log.Printf("Error while inserting new item")
    return "n/a", insertErr
}
While it's not immediately obvious, with go-pg you can use the struct tag sql:",notnull" to indicate that Go empty values ("", 0, [], etc.) are allowed and should not be treated as SQL NULL.
You can see it in the Features list.
In your case I would change this to:
type Account struct {
    Number int    `sql:"type:smallint,pk,notnull"`
    Name   string
}
I think the easiest solution to your problem is to make your ID column of type SERIAL and let Postgres set and auto-increment the value for you. If you need the value within your application directly after inserting it, you can always use a RETURNING clause, like so:
INSERT INTO shows(
    user_id, name, description, created, modified
) VALUES(
    :user_id, :name, :description, :created, :modified
) RETURNING id;
And capture the response within your code.
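Applied to the Account struct above, the DDL would be along these lines (a sketch; adjust the table and column names to whatever your real schema uses):
-- hypothetical DDL sketch: Postgres assigns the id from a sequence
CREATE TABLE account (
    number serial PRIMARY KEY,
    name   text
);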

How to insert empty array into jsonb column (pgsql) by Yii2?

I created a migration with a new field of jsonb type, not null, with default value = [] (example of stored data: ["235", "214"]), and added a rule to the model: [['unique_users'], 'safe'].
public function up()
{
    $connection = Yii::$app->getDb();
    $sql = 'ALTER TABLE offer ADD unique_users jsonb not null default \'[]\'';
    $command = $connection->createCommand($sql, []);
    $command->queryAll();
}
Result: Added a unique_users field with a default value [] to each row. jsonb_typeof(unique_users) returns an array type.
I created the following query to test it:
select jsonb_array_length(unique_users) from test where unique_users #> '"19"'::jsonb
The query returned the expected result in pgAdmin, so it seemed that everything was ready. But after saving a new record with Yii2, I received a query error:
ERROR: you can not get the length of a scalar
And I saw that a different value had been recorded in the field - ""
I tried adding the validation rule ['unique_users', 'default', 'value' => '[]'] to the model.
Result:
...the same query problem - the value is not an array; jsonb_typeof(unique_users) returns string.
How to insert empty array into jsonb column?
I think you're accidentally sending an empty string as the value for your unique_users field. If the value were completely empty, it would take the column's default DB value. Please make sure the unique_users field is completely empty (null) when saving.
You can however also do this with a default value rule. This should do the trick:
['unique_users', 'default', 'value' => json_encode([])],
['unique_users', 'default', 'value' => []],
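Once a record saves correctly, you can double-check the stored type with the same jsonb_typeof() call used above; it should report array rather than string:
select jsonb_typeof(unique_users) from offer;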

Updating an array of objects fields in crate

I created a table with the following syntax:
create table poll (
    poll_id string primary key,
    poll_type_id integer,
    poll_rating array(object as (
        rating_id integer,
        fk_user_id string,
        israted_image1 integer,
        israted_image2 integer,
        updatedDate timestamp,
        createdDate timestamp
    )),
    poll_question string,
    poll_image1 string,
    poll_image2 string
)
And I inserted a record without the "poll_rating" field, which is actually an array of objects.
Now when I try to update a poll_rating with the following commands:
update poll set poll_rating = [{"rating_id":1,"fk_user_id":-1,"israted_image1":1,"israted_image2":0,"createddate":1400067339.0496}] where poll_id = "f748771d7c2e4616b1865f37b7913707";
I'm getting an error message like this:
"SQLParseException[line 1:31: no viable alternative at input '[']; nested: ParsingException[line 1:31: no viable alternative at input '[']; nested: NoViableAltException;"
Can anyone tell me why I get this error when I try to update the array of objects field?
Defining arrays and objects directly in an SQL statement is currently not supported by our SQL parser; please use parameter substitution with placeholders instead, as described here:
https://crate.io/docs/current/sql/rest.html
An example using curl:
curl -sSXPOST '127.0.0.1:4200/_sql?pretty' -d@- <<- EOF
{"stmt": "update poll set poll_rating = ? where poll_id = ?",
"args": [ [{"rating_id":1,"fk_user_id":-1,"israted_image1":1,"israted_image2":0,"createddate":1400067339.0496}], "f748771d7c2e4616b1865f37b7913707" ]
}
EOF

How can I use an hstore column type with Npgsql?

I have a table with the following schema:
CREATE TABLE account
(
    id serial primary key,
    login varchar(40) not null,
    password varchar(40) not null,
    data hstore
);
I'd like to use an NpgsqlCommand object with parameters to retrieve and store the account data from my application. Which DbType do I have to use for the NpgsqlParameter? The enum NpgsqlDbType does not have a value for hstore. Can I use a Dictionary or HashTable as value of the NpgsqlParameter object?
When I use a JSON column I can create a parameter of type NpgsqlDbType.Text, use a library like JSON.Net to serialize an object to a JSON string and send an SQL statement like that:
INSERT INTO account (login, password, data) VALUES (:login, :password, :data::json)
Unfortunately this does not work with an hstore column. I get a syntax error when I try to do this:
INSERT INTO account (login, password, data) VALUES (:login, :password, :data::hstore)
The string I pass to the data parameter looks like this:
'key1 => "value1", key2 => "value2"'
Thank you, Francisco! I saw in the log that the single quotes (') at the beginning and the end of the string are escaped when they are passed to PostgreSQL. When I pass
key1 => "value1", key2 => "value2"
instead, I can insert the data into the hstore column.
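For reference, a sketch of what the two cases effectively amount to on the server side (assuming a login 'john' and password 'secret'; the driver wraps the parameter value in single quotes itself):
-- with the quotes included in the parameter, the extra quotes become part of
-- the hstore text and triggered the syntax error:
INSERT INTO account (login, password, data)
VALUES ('john', 'secret', '''key1 => "value1", key2 => "value2"'''::hstore);
-- without them, the driver's own quoting yields a valid hstore literal:
INSERT INTO account (login, password, data)
VALUES ('john', 'secret', 'key1 => "value1", key2 => "value2"'::hstore);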