psycopg2 - TypeError: not all arguments converted during string formatting - postgresql

I'm using Python 3.8 and psycopg2.
I'm trying to insert a record into the database.
I have a function that builds a query and returns a list with 2 values: the first is the query string and the second is the values.
As a test I hard-coded the exact value of the resulting query[1] and it worked without error, but when I use query[1] as the values instead of the literal value I get this error:
TypeError: not all arguments converted during string formatting
In my log I have these values for the query list returned by my query-construction function:
['INSERT INTO country (code, name, flag, update_time) VALUES(%s,%s,%s,%s)', "('US', 'USA', 'https://example.com/flags/us.svg', 1596551810)"]
query[0]
INSERT INTO country (code, name, flag, update_time) VALUES(%s,%s,%s,%s)
query[1]
('US', 'USA', 'https://example.com/flags/us.svg', 1596551810)
This is the code snippet:
cursor = connection.cursor()
query_insert = query[0]
query_values = tuple(query[1])
cursor.execute(query_insert,(query_values))
I tried converting it to a tuple and using parentheses, but the error persists.
If I put the value of query[1] directly in my code as the values, it works fine, so I suppose the error is in the values part of the cursor.execute parameters.
Any help is welcome!
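A minimal sketch of what the log output suggests (assuming query[1] really is the string shown above; ast.literal_eval is just one possible way to turn it back into a real tuple):
import ast

query_insert = 'INSERT INTO country (code, name, flag, update_time) VALUES(%s,%s,%s,%s)'
query_values = "('US', 'USA', 'https://example.com/flags/us.svg', 1596551810)"  # a str, not a tuple

# tuple() on a string splits it into single characters, so psycopg2 would receive
# one parameter per character for only four %s placeholders -> the TypeError above.
bad_params = tuple(query_values)

# Recovering a real tuple keeps the parameter count in sync with the placeholders:
good_params = ast.literal_eval(query_values)  # ('US', 'USA', 'https://example.com/flags/us.svg', 1596551810)
# cursor.execute(query_insert, good_params)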

Related

SQLAlchemy IN_ - trouble with leading zeroes

In my sqlalchemy ( sqlalchemy = "^1.4.36" ) query I have a clause:
.filter( some_model.some_field[2].in_(['item1', 'item2']) )
where some_field is jsonb and the value of some_field in the db is formatted like this:
["something","something","123"]
or
["something","something","0123"]
Note: some_field[2] is always a digits-only double-quoted string, sometimes with leading zeroes and sometimes without.
The query works fine for cases like this:
.filter( some_model.some_field[2].in_(['123', '345']) )
and fails when the values in the in_ clause have leading zeroes:
e.g. .filter( some_model.some_field[2].in_(['0123', '0345']) ) fails.
The error it gives:
cursor.execute(statement, parameters)
psycopg2.errors.InvalidTextRepresentation: invalid input syntax for type json
LINE 3: ...d_on) = 2 AND (app_cache.value_metadata -> 2) IN ('0123'
                                                              ^
DETAIL: Token "0123" is invalid.
Again, in the case of '123' (or any string of digits without a leading zero) instead of '0123', the error is not thrown.
What is wrong with having leading zeroes in the strings in the in_ list? Thanks.
UPDATE: basically, SQLAlchemy's in_ assumes int input here and fails accordingly. There must be some reasoning behind this behavior; I can't tell what it is. I removed that filter from the query and did the filtering of the output in Python code afterwards.
The problem here is that the values in the IN clause are being interpreted by PostgreSQL as JSON representations of integers, and an integer with a leading zero is not valid JSON.
The IN clause has a value of type jsonb on the left-hand side. The values on the right-hand side are not explicitly typed, so Postgres tries to find the best type that will allow them to be compared with a jsonb value. That type is jsonb, so Postgres attempts to cast the values to jsonb. This works for values without a leading zero, because quoted digits without leading zeroes are valid JSON representations of integers:
test# select '123'::jsonb;
jsonb
═══════
123
(1 row)
but it doesn't work for values with leading zeroes, because they are not valid JSON:
test# select '0123'::jsonb;
ERROR: invalid input syntax for type json
LINE 1: select '0123'::jsonb;
^
DETAIL: Token "0123" is invalid.
CONTEXT: JSON data, line 1: 0123
Assuming that you expect some_field[2].in_(['123', '345']) and some_field[2].in_(['0123', '345']) to match ["something","something","123"] and ["something","something","0123"] respectively, you can either serialise the values to JSON yourself:
some_field[2].in_([json.dumps(x) for x in ['0123', '345']])
or use the contained_by operator (<@ in PostgreSQL) to test whether some_field[2] is present in the list of values:
some_field[2].contained_by(['0123', '345'])
or cast some_field[2] to text (that is, use the ->> operator) so that the values are compared as text, not JSON.
some_field[2].astext.in_(['0123', '345'])
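For instance, a short sketch of the first and last options together (a sketch only; some_model is from the question and session is an assumed SQLAlchemy session):
import json

# Option 1: serialise the right-hand values to JSON yourself, so '0123'
# becomes the valid JSON string '"0123"' before Postgres sees it.
rows_as_json = session.query(some_model).filter(
    some_model.some_field[2].in_([json.dumps(x) for x in ['0123', '345']])
).all()

# Option 3: compare as text (the ->> operator) via .astext, so the
# right-hand values are never parsed as JSON at all.
rows_as_text = session.query(some_model).filter(
    some_model.some_field[2].astext.in_(['0123', '345'])
).all()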

Postgres: Python: TypeError: SQL.__init__() takes 2 positional arguments but 3 were given

Hello, I am getting an error from my code, can someone help me please?
def query_builder(self, field_name, table_name, pkey, id):
    queryx=sql.SQL("select {field} from {table} where {pkey} = %s",(id)).format(
        field=sql.Identifier(field_name),
        table=sql.Identifier(table_name),
        pkey=sql.Identifier(pkey))
    self.cur.execute(queryx.as_string(self.conn))
I'm going to assume you are using psycopg2.
If so the issues are, first:
"select {field} from {table} where {pkey} = %s",(id) ..."
Do not include the argument (id) in the sql.SQL(...) call. Also, this is not the proper form for a single-value tuple; Python requires (id,), note the comma.
Second:
self.cur.execute(queryx.as_string(self.conn))
Should be:
self.cur.execute(queryx, (id,))
The execute is where you supply the argument. Also, the composable sql.SQL(...) can be passed directly to execute without being run through as_string. See the psycopg2 sql module documentation for more examples.
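Putting both fixes together, the method from the question might look like this (a sketch; it assumes self.cur is a psycopg2 cursor, as in the original code):
from psycopg2 import sql

def query_builder(self, field_name, table_name, pkey, id):
    # Identifiers are composed safely with sql.SQL/sql.Identifier;
    # the value stays as a %s placeholder.
    queryx = sql.SQL("select {field} from {table} where {pkey} = %s").format(
        field=sql.Identifier(field_name),
        table=sql.Identifier(table_name),
        pkey=sql.Identifier(pkey))
    # The parameter is supplied to execute(), as a one-element tuple.
    self.cur.execute(queryx, (id,))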
UPDATE
To use "*" there are two ways:
cur.execute(sql.SQL("select * from {table} where {pkey} = %s).format(table.sql.Identifier(table_name), pkey=sql.Identifier(pkey))
--OR
cur.execute(sql.SQL("select {field} from {table} where {pkey} = %s).format(field=sql.SQL("*"), table=sql.Identifier(table_name), pkey=sql.Identifier(pkey))
Warning, the second does allow for SQL injection as sql.SQL() does not escape values.
As to multiple fields the sql section of the docs has multiple examples. For instance:
If part of your query is a variable sequence of arguments, such as a comma-separated list of field names, you can use the SQL.join() method to pass them to the query:
query = sql.SQL("select {fields} from {table}").format(
fields=sql.SQL(',').join([
sql.Identifier('field1'),
sql.Identifier('field2'),
sql.Identifier('field3'),
]),
table=sql.Identifier('some_table'))
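As with the earlier snippets, the composed query can be handed straight to execute (cur is assumed to be a psycopg2 cursor, as above):
cur.execute(query)
rows = cur.fetchall()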

How to use replace query in yii2 updateAll()?

I am using PostgreSQL. My Yii2 code for the update is:
ModelName::updateAll(['my_column' => "REPLACE(my_column1,'removed_','')"]);
The actual query should be:
update my_table set my_column = REPLACE(my_column1,'removed_','');
When I run my Yii2 code it shows the error:
SQLSTATE[22001]: String data, right truncated: 7 ERROR: value too long for type character varying(50)
The SQL being executed was: UPDATE "my_table" SET "my_column1"='REPLACE(my_column1,''removed_'','''')'
If you use the ['column' => 'value'] syntax for attributes, the framework expects the array values to be simple values and treats them accordingly. That's why your expression gets converted to a string value instead of being used as an expression.
If you want to avoid that, you need to wrap your value in yii\db\Expression like this:
ModelName::updateAll([
    'my_column' => new \yii\db\Expression("REPLACE(my_column1,'removed_','')")
]);

Why is my UPDATE query with jsonb popping an error?

I'm trying to update a row in my PostgreSQL database and it says it can't find a column. The thing is, the "column" Postgres is trying to find is actually the parameter for the new value in the jsonb_set function, so I'm at my wits' end.
It's hard to explain, so I included the query and the error it throws.
I tried adding quotes, double quotes, and brackets, inside and out... it didn't work.
UPDATE public.sometable
SET somecolumn = jsonb_set(somecolumn, '{firstKey, secondKey}', someInputString), update_date=NOW(), update_username="someone@somewhere.com"
WHERE id=1
RETURNING *
I'm expecting the value of the row I'm updating to be returned, instead I get:
ERROR: column "someInputString" does not exist
LINE 1: ...n = jsonb_set(somecolumn , '{firstKey, secondKey}', someInputString)...
You have to deliver a valid JSON value as the third argument of the function:
UPDATE public.sometable
SET
    somecolumn = jsonb_set(somecolumn, '{firstKey, secondKey}', '"someInputString"'),
    update_date = now(),
    update_username = 'someone@somewhere.com'
WHERE id = 1
RETURNING *
Note: I guess update_username is text, so you should use single quotes for a plain text literal (in PostgreSQL, double quotes denote identifiers, not strings).
Db<>fiddle.
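If you are running this statement from Python with psycopg2 (as in the other questions on this page), the same rule applies; here is a small sketch under that assumption, with cursor being an assumed psycopg2 cursor and json.dumps producing the JSON text that is cast to jsonb:
import json

new_value = json.dumps("someInputString")  # -> '"someInputString"', valid JSON text

cursor.execute(
    """
    UPDATE public.sometable
    SET somecolumn = jsonb_set(somecolumn, '{firstKey, secondKey}', %s::jsonb),
        update_date = now(),
        update_username = %s
    WHERE id = %s
    RETURNING *
    """,
    (new_value, 'someone@somewhere.com', 1),
)
row = cursor.fetchone()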

How to embed a function returning a string in a Q query?

I'm using .Q.f to format a column from integer to float with 4-digit precision:
fmt_price:{[val] .Q.f[4;](val*0.0001)}
select fmt_price[price] from mytable
The fmt_price works well at the q prompt, but if I embed the function in a query I get this error:
An error occurred during execution of the query. The server sent the response: `type
The fmt_price call works if I return a float or integer variable, rather than the result of .Q.f.
You need to do an each over the list. Currently you are passing a list of values to .Q.f, when it expects an atom. Something like the following is what you need:
fmt_price:{[val] .Q.f[4;] each (val*0.0001)}