How can I concatenate a string inside of a concatenated jsonb object in PostgreSQL? In other words, I am using the jsonb concatenation operator as well as the text concatenation operator in the same query and running into trouble.
Or... if there is a totally different query I should be executing, I'd appreciate hearing suggestions. The goal is to update a row containing a jsonb column. We don't want to overwrite existing key value pairs in the jsonb column that are not provided in the query and we also want to update multiple rows at once.
My query:
update contacts as c set data = data || '{"geomatch": "MATCH","latitude":'||v.latitude||'}'
from (values (16247746,40.814140),
(16247747,20.900840),
(16247748,20.890570)) as v(contact_id,latitude) where c.contact_id = v.contact_id
The Error:
ERROR: invalid input syntax for type json
LINE 85: update contacts as c set data = data || '{"geomatch": "MATCH...
^
DETAIL: The input string ended unexpectedly.
CONTEXT: JSON data, line 1: {"geomatch": "MATCH","latitude":
SQL state: 22P02
Character: 4573
You might be looking for
SET data = data || ('{"geomatch": "MATCH","latitude":'||v.latitude||'}')::jsonb
-- ^^ jsonb ^^ text ^^ text
but that's not how one should build JSON objects - that v.latitude might not be a valid JSON literal, or even contain some injection like "", "otherKey": "oops". (Admittedly, in your example you control the values, and they're numbers so it might be fine, but it's still a bad practice). Instead, use jsonb_build_object:
SET data = data || jsonb_build_object('geomatch', 'MATCH', 'latitude', v.latitude)
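Applied to the statement in the question (same VALUES list), that would look something like:
update contacts as c set data = data || jsonb_build_object('geomatch', 'MATCH', 'latitude', v.latitude)
from (values (16247746,40.814140),
(16247747,20.900840),
(16247748,20.890570)) as v(contact_id,latitude) where c.contact_id = v.contact_id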
There are two problems. The first is operator precedence: without parentheses, the jsonb column is concatenated with the first, unfinished text fragment, which Postgres then tries to read as jsonb. The second is that the concatenated text pieces require an explicit cast to jsonb.
This should work:
update contacts as c
set data = data || ('{"geomatch": "MATCH","latitude":'||v.latitude||'}')::jsonb
from (values (16247746,40.814140),
(16247747,20.900840),
(16247748,20.890570)) as v(contact_id,latitude)
where c.contact_id = v.contact_id
;
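To see the precedence problem in isolation: || is evaluated left to right, so the jsonb column meets the unfinished text fragment first, and that fragment is what fails to parse as json. A minimal sketch to run in psql:
select '{}'::jsonb || '{"geomatch": "MATCH","latitude":';                                 -- fails: invalid input syntax for type json
select '{}'::jsonb || ('{"geomatch": "MATCH","latitude":' || 40.814140 || '}')::jsonb;    -- works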
In my SQLAlchemy (sqlalchemy = "^1.4.36") query I have a clause:
.filter( some_model.some_field[2].in_(['item1', 'item2']) )
where some_field is jsonb and the value in the db is formatted like this:
["something","something","123"]
or
["something","something","0123"]
note: some_field[2] is always a digits-only, double-quoted string, sometimes with leading zeroes and sometimes without them.
The query works fine for cases like this:
.filter( some_model.some_field[2].in_(['123', '345']) )
and fails when the values in the in_ clause have leading zeroes:
e.g. .filter( some_model.some_field[2].in_(['0123', '0345']) ) fails.
The error it gives:
cursor.execute(statement, parameters)
psycopg2.errors.InvalidTextRepresentation: invalid input syntax for type json
LINE 3: ...d_on) = 2 AND (app_cache.value_metadata -> 2) IN ('0123'
                                                             ^
DETAIL: Token "0123" is invalid.
Again, in the case of '123' (or any string of digits without leading zero) instead of '0123' the error is not thrown.
What is wrong with having leading zeroes in the strings in the in_ list? Thanks.
UPDATE: basically, SQLAlchemy's in_ assumes int input and fails accordingly. There must be some reasoning behind this behavior, but I can't tell what it is. I removed that filter from the query and did the filtering of the output in Python code afterwards.
The problem here is that the values in the IN clause are being interpreted by PostgreSQL as JSON representations of integers, and an integer with a leading zero is not valid JSON.
The IN clause has a value of type jsonb on the left hand side. The values on the right hand side are not explicitly typed, so Postgres tries to find the best match that will allow them to be compared with a jsonb value. This type is jsonb, so Postgres attempts to cast the values to jsonb. This works for values without a leading zero, because digits in single quotes without leading zeroes are valid representations of integers in JSON:
test# select '123'::jsonb;
jsonb
═══════
123
(1 row)
but it doesn't work for values with leading zeroes, because they are not valid JSON:
test# select '0123'::jsonb;
ERROR: invalid input syntax for type json
LINE 1: select '0123'::jsonb;
^
DETAIL: Token "0123" is invalid.
CONTEXT: JSON data, line 1: 0123
Assuming that you expect some_field[2].in_(['123', '345']) and some_field[2].in_(['0123', '345']) to match ["something","something","123"] and ["something","something","0123"] respectively, you can either serialise the values to JSON yourself:
some_field[2].in_([json.dumps(x) for x in ['0123', '345']])
or use the contained_by operator (<@ in PostgreSQL) to test whether some_field[2] is present in the list of values:
some_field[2].contained_by(['0123', '345'])
or cast some_field[2] to text (that is, use the ->> operator) so that the values are compared as text, not JSON.
some_field[2].astext.in_(['0123', '345'])
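On the Postgres side, the contained_by and astext variants correspond to something like this (a sketch reusing the app_cache.value_metadata names visible in the error message):
-- contained_by: test the jsonb value against a jsonb array of candidates
select * from app_cache where (value_metadata -> 2) <@ '["0123", "345"]'::jsonb;
-- astext: compare as text, so '0123' is never parsed as JSON
select * from app_cache where (value_metadata ->> 2) in ('0123', '345');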
I am trying to update a JSONB field in one table with data from another table. For example,
update ms
set data = data || '{"COMMERCIAL": 3.4, "PCT" : medi_percent}'
from mix
where mix.id = ms.data_id
and data_id = 6000
and set_id = 20
This is giving me the following error -
Invalid input syntax for type json
DETAIL: Token "medi_percent" is invalid.
When I change medi_percent to a number, I don't get this error.
{"COMMERCIAL": 3.4, "PCT" : medi_percent} is not a valid JSON text. Notice there is no string interpolation happening here. You might be looking for
jsonb_build_object('COMMERCIAL', 3.4, 'PCT', medi_percent)
instead, where medi_percent is now an expression (that will presumably refer to your mix column). Use the jsonb variant here, since data is jsonb and || needs jsonb on both sides.
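Put back into the statement from the question (assuming medi_percent is a column of mix, as presumed above), a sketch might read:
update ms
set data = data || jsonb_build_object('COMMERCIAL', 3.4, 'PCT', mix.medi_percent)
from mix
where mix.id = ms.data_id
and data_id = 6000
and set_id = 20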
In Postgres, I have a jsonb column foo which stores an array of strings
["a","b","c"]
I need a query which appends another string to whatever is currently there, at a specified index
e.g. Append "!" at index 1
run query: ["a","b","c"] -> ["a","b!","c"]
run again: ["a","b!","c"] -> ["a","b!!","c"]
run again: ["a","b!!","c"] -> ["a","b!!!","c"]
I've implemented this in Postgres v11.2 as follows
UPDATE my_table
SET foo = jsonb_set(foo, '{1}', CONCAT('"', foo->>1, '!', '"')::jsonb)
WHERE id = '12345';
Note the index 1 and the string '!' are just hardcoded here for simplicity - but they'd be variables.
It works, but I find it quite inelegant. As you can see, I'm selecting out the string at the given index as text using the ->> operator, using that as input to CONCAT to append the '!', and also to build that back into the correct syntax to convert back to a jsonb string. There is just a lot more work going on here than seems necessary, simply to append to a string at a given path.
Is there a simpler way to do this? A built-in function or operator perhaps, or a simpler way of appending than using CONCAT? (I tried using the || operator in various ways but couldn't seem to make anything work with the syntax & types)
I don't think there is a better way than jsonb_set().
The concat can be replaced by || as follows:
jsonb_set(foo, '{1}', ('"' || (foo->>1) || '!"')::jsonb)
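If you'd rather not splice the quote characters in by hand, to_jsonb() (a different helper from the expression above, shown here with the same table and column names) builds a correctly quoted and escaped JSON string from text:
UPDATE my_table
SET foo = jsonb_set(foo, '{1}', to_jsonb((foo->>1) || '!'))
WHERE id = '12345';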
I get this error when querying with a json column:
(psycopg2.ProgrammingError) operator does not exist: json = text
The column is defined as JSON with SQLAlchemy:
json_data = db.Column(db.JSON, nullable=False)
How do you compare a json column in Postgres?
There is no equality (or inequality) operator for the data type json. If you need to test the value as a whole, you might cast to jsonb:
... WHERE json_data::jsonb = jsonb '{}';
Or cast to text for simple cases:
... WHERE json_data::text = '{}';
But there are many valid text representations for the same json value - which is the reason why Postgres does not implement equality / inequality operators for the type.
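For example, two different spellings of the same JSON object compare unequal as text but equal once normalized to jsonb:
select '{"a": 1}'::text  = '{ "a" : 1 }'::text  as text_equal,   -- false
       '{"a": 1}'::jsonb = '{ "a" : 1 }'::jsonb as jsonb_equal;  -- true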
See:
How to query a json column for empty objects?
Currently I'm trying, in a rails 3 application, to select all objects where two arrays have some overlap.
I've tried the following:
Contact.where("possible_unique_keys && ?" c.possible_unique_keys)
Which gives:
array value must start with "{" or dimension information
So I tried to convert the latter record into a postgresql array, like so:
Contact.where("possible_unique_keys && string_to_array(?)", c.possible_unique_keys)
Which gives:
operator does not exist: character varying[] && text[]
How do I get both arrays into a format that will correctly evaluate &&?
I think I've solved it for now, and come up with a way to build a query from a Ruby array to a PostgreSQL array (thanks in part to this bit of code written for Rails/PostgreSQL interoperability).
Given a list of strings ["bill", "joe", "stu", "katie", "laura"], we have to do a bit of acrobatics to get PostgreSQL to recognize them. This is one way to construct the request:
request = "SELECT * from Contacts where possible_unique_keys && \'{"
list.each do |key|
key.gsub("'", "\\'")
if key == c.possible_unique_keys.last
request << "\"#{key}\"}\'"
else
request << "\"#{key}\", "
end
end
dupes = Contacts.find_by_sql(request)
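For reference, the "operator does not exist: character varying[] && text[]" error above is purely a type mismatch; an explicit cast on the right-hand array would also satisfy &&, e.g. (a sketch using the example names):
select * from contacts
where possible_unique_keys && array['bill', 'joe', 'stu']::varchar[];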