How to pass an array as a parameter for rowMode="array" in pg-promise

I would like to get the result of a query using rowMode="array" (as this is a potentially very large table and I don't want the rows formatted as objects), but I couldn't figure out how to pass in an array/list parameter for use in an IN operator.
const events = await t.manyOrNone({text: `select * from svc.events where user_id in ($1:list);`, rowMode: "array"}, [[1,2]]);
However, the above gives an error: syntax error at or near ":"
Removing the :list did not work either:
const events = await t.manyOrNone({text: `select * from svc.events where user_id in ($1);`, rowMode: "array"}, [[1,2]]);
Error: invalid input syntax for integer: "{"1","2"}"
I understand that this might be because I'm forced to use the ParameterizedQuery format for rowMode="array", which does not allow those snazzy modifiers like :list. But this then leads to the question: if I were to use the ParameterizedQuery format, how do I natively pass in a JavaScript array so that the driver accepts it?
I guess an alternative formulation to this question is: how do I use arrays as parameters for ParameterizedQuery or PreparedStatements...

Answering my own question as I eventually found an answer to this issue: how to pass in arrays as params for use in the IN operator when using rowMode="array" | ParameterizedQuery | PreparedStatements.
Because this query is parameterized on the server, we cannot use the IN operator: IN expects each item to be parameterized individually, as IN ($1, $2, $3, ...). Instead we need to use the ANY operator, ANY($1), where $1 is expected to be an array.
So the query that will work is:
const events = await t.manyOrNone({text: `select * from svc.events where user_id=ANY($1);`, rowMode: "array"}, [[1,2]]);

Related

Postgres: Python: TypeError: SQL.__init__() takes 2 positional arguments but 3 were given

Hello, I am getting an error from my code, can someone help me please?
def query_builder(self, field_name, table_name, pkey, id):
    queryx = sql.SQL("select {field} from {table} where {pkey} = %s", (id)).format(
        field=sql.Identifier(field_name),
        table=sql.Identifier(table_name),
        pkey=sql.Identifier(pkey))
    self.cur.execute(queryx.as_string(self.conn))
I'm going to assume you are using psycopg2.
If so the issues are, first:
"select {field} from {table} where {pkey} = %s",(id) ..."
Do not include the argument (id) in the string. Also this is not proper form for a single value in a tuple. Python requires it be (id,), note the comma.
Second:
self.cur.execute(queryx.as_string(self.conn))
Should be:
self.cur.execute(queryx, (id,))
The execute() is where you supply the argument. Also, the composable sql.SQL(...) can be passed directly to execute() without being run through as_string(). See the sql module docs for more examples.
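Putting both fixes together, a corrected version of the helper might look like this (a sketch; self.cur is assumed to be an open cursor, as in the question):
from psycopg2 import sql

def query_builder(self, field_name, table_name, pkey, id):
    # Compose only the identifiers into the SQL text; the value is not part of the string.
    queryx = sql.SQL("select {field} from {table} where {pkey} = %s").format(
        field=sql.Identifier(field_name),
        table=sql.Identifier(table_name),
        pkey=sql.Identifier(pkey))
    # Supply the value at execute() time, as a one-element tuple.
    self.cur.execute(queryx, (id,))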
UPDATE
To use "*" there are two ways:
cur.execute(sql.SQL("select * from {table} where {pkey} = %s).format(table.sql.Identifier(table_name), pkey=sql.Identifier(pkey))
--OR
cur.execute(sql.SQL("select {field} from {table} where {pkey} = %s).format(field=sql.SQL("*"), table=sql.Identifier(table_name), pkey=sql.Identifier(pkey))
Warning, the second does allow for SQL injection as sql.SQL() does not escape values.
As to multiple fields the sql section of the docs has multiple examples. For instance:
If part of your query is a variable sequence of arguments, such as a comma-separated list of field names, you can use the SQL.join() method to pass them to the query:
query = sql.SQL("select {fields} from {table}").format(
    fields=sql.SQL(',').join([
        sql.Identifier('field1'),
        sql.Identifier('field2'),
        sql.Identifier('field3'),
    ]),
    table=sql.Identifier('some_table'))
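The composed query can then be passed straight to execute(); for example (a minimal sketch, assuming an open cursor cur and that some_table with those columns exists):
cur.execute(query)       # nothing to interpolate here, so no parameter tuple is needed
rows = cur.fetchall()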

How to use '?' in @Query? How to use Postgres jsonb in Spring Boot at all?

How do I use the ?| operator in a Postgres query in a Spring repository? I need to use it in the WHERE clause of my query, on a text column whose content is JSON.
@Query(value =
"SELECT * \n" +
"FROM tbl t \n" +
"WHERE t.some_ids::::jsonb ?| array['152960','188775']", nativeQuery = true
)
List<Model> getModelsByIds();
But that doesn't work and I get the following exception:
org.springframework.dao.InvalidDataAccessApiUsageException: At least 1 parameter(s) provided but only 0 parameter(s) present in query.
You can use the associated function of that operator instead. Most of the time the obfuscation layers also choke on the :: cast operator, so you might want to use cast() instead:
WHERE pg_catalog.jsonb_exists_any(cast(t.some_ids as jsonb), array['152960','188775'])
However, I think this wouldn't be able to make use of an index defined on some_ids::jsonb.
You didn't mention what exactly the content of some_ids looks like.
If that is a JSON array (e.g. '["123", "456"]'::jsonb), then you can also use the contains operator @>:
WHERE cast(t.some_ids as jsonb) @> '["152960","188775"]'
If your JSON array contains numbers rather than strings ('[123,456]'), you need to pass numbers in the argument as well:
WHERE cast(t.some_ids as jsonb) @> '[152960,188775]'
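As a way to sanity-check those predicates outside of Spring, the same conditions can be run directly against Postgres, for instance via psycopg2 (a sketch only; the connection string is hypothetical, and tbl/some_ids are taken from the question):
import psycopg2

conn = psycopg2.connect("dbname=test")  # hypothetical connection string
with conn.cursor() as cur:
    # ?| via its underlying function, with the keys bound as a parameter
    cur.execute(
        """
        SELECT *
        FROM tbl t
        WHERE pg_catalog.jsonb_exists_any(cast(t.some_ids AS jsonb), %s::text[])
        """,
        (["152960", "188775"],))
    print(cur.fetchall())

    # the @> variant, if some_ids holds a JSON array of strings
    cur.execute(
        """
        SELECT *
        FROM tbl t
        WHERE cast(t.some_ids AS jsonb) @> %s::jsonb
        """,
        ('["152960","188775"]',))
    print(cur.fetchall())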

Postgres - append to jsonb string

In Postgres, I have a jsonb column foo which stores an array of strings
["a","b","c"]
I need a query which appends another string to whatever is currently there, at a specified index
e.g. Append "!" at index 1
run query: ["a","b","c"] -> ["a","b!","c"]
run again: ["a","b","c"] -> ["a","b!!","c"]
run again: ["a","b","c"] -> ["a","b!!!","c"]
I've implemented this in Postgres v11.2 as follows
UPDATE my_table
SET foo = jsonb_set(foo, '{1}', CONCAT('"', foo->>1, '!', '"')::jsonb)
WHERE id = '12345';
Note the index 1 and the string '!' are just hardcoded here for simplicity - but they'd be variables.
It works, but I find it quite inelegant. As you can see, I'm selecting out the string at the given index as text using the ->> operator, using that as input to CONCAT to append the '!', and also to build that back into the correct syntax to convert back to a jsonb string. There is just a lot more work going on here than seems necessary, simply to append to a string at a given path.
Is there a simpler way to do this? A built-in function or operator perhaps, or a simpler way of appending than using CONCAT? (I tried using the || operator in various ways but couldn't seem to make anything work with the syntax & types)
I don't think there is a better way than jsonb_set().
The concat can be replaced by || as follows:
jsonb_set(foo, '{1}', ('"' || (foo->>1) || '!"')::jsonb)
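Since the index and the '!' would be variables in practice, here is one way the same statement could be parameterized from application code, e.g. with psycopg2 (a sketch only; the append_to_jsonb_element helper is hypothetical, the my_table/foo/id names come from the question, and to_jsonb() is used instead of hand-building the quotes):
import psycopg2

def append_to_jsonb_element(conn, row_id, index, suffix):
    # Append `suffix` to the jsonb string stored at `index` of my_table.foo.
    with conn.cursor() as cur:
        cur.execute(
            """
            UPDATE my_table
            SET foo = jsonb_set(
                          foo,
                          ARRAY[%s]::text[],             -- path to the element, e.g. {1}
                          to_jsonb((foo ->> %s) || %s))  -- re-wrap the concatenated text as a jsonb string
            WHERE id = %s
            """,
            (str(index), index, suffix, row_id))
    conn.commit()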

ANY operator has significant performance problem when using an array as a parameter

I started using the ANY() construct in the query instead of IN due to a parameter binding error. Currently it's something like this:
Select *
FROM geo_closure_leaf
WHERE geoId = ANY(:geoIds)
But it has a huge impact on performance: the query with IN is much faster than with ANY.
Any suggestion on how an array of string parameters can be bound so it can be passed in an IN expression?
I have tried a temporary fix using:
Select *
FROM geo_closure_leaf
WHERE geoId IN (''('' || array_to_string(:geoIds::text[] ,''),('') || '')'')
Select *
FROM geo_closure_leaf
WHERE geoId IN (select unnest(:geoIds::text[]))
geoIds = array of strings
It's working this way.
public override T Query<T>(string query, IDictionary<string, object> parameters, Func<IDataReader, T> mapper)
{
    T Do(NpgsqlCommand command)
    {
        IDataReader reader = null;
        try
        {
            command.CommandText = query;
            reader = command.AddParameters(parameters).ExecuteReader();
            return mapper(reader);
        }
        finally
        {
            CloseDataReader(reader);
        }
    }
    return Execute(Do);
}
The object is an array of strings.
Expected: I should be able to do this without having to put extra logic in the SQL.
Select *
FROM geo_closure_leaf
WHERE geoId IN (:geoIds)
The performance difference cannot be IN versus = ANY, because PostgreSQL will translate IN into = ANY during query optimization.
The difference must be the subselect. If you are using unnest, PostgreSQL will always estimate that the subquery returns 100 rows, because that is how unnest is defined.
It must be that the estimate of 100 somehow produces a different execution plan that happens to work better.
We'd need the complete execution plans to say anything less uncertain.
https://dba.stackexchange.com/questions/125413/index-not-used-with-any-but-used-with-in
Found this post explaining how indexes get used with the different constructs of ANY and IN.
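To follow up on the execution-plan point above, the two variants can be compared side by side, for example with psycopg2 (a sketch; the table name comes from the question, the connection string is hypothetical, and %s placeholders stand in for the :geoIds style used earlier):
import psycopg2

conn = psycopg2.connect("dbname=test")   # hypothetical connection
geo_ids = ["id1", "id2", "id3"]          # sample values

with conn.cursor() as cur:
    for label, q in [
        ("geoId = ANY(array)", "EXPLAIN ANALYZE SELECT * FROM geo_closure_leaf WHERE geoId = ANY(%s::text[])"),
        ("geoId IN (unnest)", "EXPLAIN ANALYZE SELECT * FROM geo_closure_leaf WHERE geoId IN (SELECT unnest(%s::text[]))"),
    ]:
        cur.execute(q, (geo_ids,))
        print(label)
        for (line,) in cur.fetchall():   # EXPLAIN output is one text column per row
            print("   ", line)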

How to get only specific rows on DB, when date range fits SQL condition on a 'tsrange' datatype? [duplicate]

I have this query:
some_id = 1
cursor.execute('''
    SELECT "Indicator"."indicator"
    FROM "Indicator"
    WHERE "Indicator"."some_id" = %s;''', some_id)
I get the following error:
TypeError: 'int' object does not support indexing
some_id is an int but I'd like to select indicators that have some_id = 1 (or whatever # I decide to put in the variable).
cursor.execute('''
    SELECT "Indicator"."indicator"
    FROM "Indicator"
    WHERE "Indicator"."some_id" = %s;''', [some_id])
This turns the some_id parameter into a list, which is indexable. Assuming your method works like I think it does, this should work.
The error is happening because somewhere in that method, it is probably trying to iterate over that input, or index directly into it. Possibly like this: some_id[0]
By making it a list (or iterable), you allow it to index into the first element like that.
You could also make it into a tuple by doing this: (some_id,) which has the advantage of being immutable.
You should pass query parameters to execute() as a tuple (an iterable, strictly speaking), (some_id,) instead of some_id:
cursor.execute('''
    SELECT "Indicator"."indicator"
    FROM "Indicator"
    WHERE "Indicator"."some_id" = %s;''', (some_id,))
Your id needs to be some sort of iterable for mogrify to understand the input; here's the relevant quote from the frequently asked questions documentation:
>>> cur.execute("INSERT INTO foo VALUES (%s)", "bar") # WRONG
>>> cur.execute("INSERT INTO foo VALUES (%s)", ("bar")) # WRONG
>>> cur.execute("INSERT INTO foo VALUES (%s)", ("bar",)) # correct
>>> cur.execute("INSERT INTO foo VALUES (%s)", ["bar"]) # correct
This should work:
some_id = 1
cursor.execute('''
    SELECT "Indicator"."indicator"
    FROM "Indicator"
    WHERE "Indicator"."some_id" = %s;''', (some_id,))
Slightly similar error when using Django:
TypeError: 'RelatedManager' object does not support indexing
This doesn't work
mystery_obj[0].id
This works:
mystery_obj.all()[0].id
Basically, the error means that some type xyz doesn't implement __iter__, __next__/next, or indexing, so you can't call next() on it, index into it, or iterate over it. In this case the arguments to cursor.execute need to be iterable, most commonly a list or tuple, or less commonly an array or some custom iterator implementation.
In this specific case the error happens when the classic string interpolation goes to fill the %s, %d, %b string formatters.
Related:
How to implement __iter__(self) for a container object (Python)
Pass the parameter in a list, which is indexable.
cur.execute("select * from tableA where id = %s", [parameter])
I had the same problem and it worked when I used normal formatting.
cursor.execute(f'''
    SELECT "Indicator"."indicator"
    FROM "Indicator"
    WHERE "Indicator"."some_id" = {some_id};''')
Typecasting some_id to string also works.
cursor.execute(""" SELECT * FROM posts WHERE id = %s """, (str(id), ))