Variable number of arguments to sql.Query - postgresql

I'm trying to pass a variable number of arguments to db.Query() in Go. I'm doing something like:
var values []interface{}
query := []string{"SELECT * FROM table"}
sep := "WHERE"
if ... {
    values = append(values, something)
    query = append(query, fmt.Sprintf(` %s field_a=$%d`, sep, len(values)))
    sep = "AND"
}
if ... {
    values = append(values, something)
    query = append(query, fmt.Sprintf(` %s field_b=$%d`, sep, len(values)))
    sep = "AND"
}
// Add an arbitrary number of conditional arguments...
rows, err := db.Query(strings.Join(query, " "), values...)
The query looks fine, and the values are all there, but I'm not getting anything back from the query. Some of the values are integers, some strings. When I manually try the exact same query (copy/paste) in psql, substituting the actual values for $1, $2, etc., I get correct results. What am I missing?
Edit
Here's an example of what the final query (the result of strings.Join()) should look like:
SELECT
table1.field1, table1.field2, table1.field3,
table2.field4, table2.field6, table2.field6
FROM table1, table2
WHERE
table1.field2=$1 AND
table2.field3 IN ($2, $3)
When I try:
SELECT
table1.field1, table1.field2, table1.field3,
table2.field4, table2.field6, table2.field6
FROM table1, table2
WHERE
table1.field2='something' AND
table2.field3 IN (40, 50)
from psql, it works fine. When I call:
var values []interface{}
values = append(values, "something")
values = append(values, 40)
values = append(values, 50)
db.Query(`SELECT
table1.field1, table1.field2, table1.field3,
table2.field4, table2.field6, table2.field6
FROM table1, table2
WHERE
table1.field2=$1 AND
table2.field3 IN ($2, $3)`, values...)
from Go, I get a Rows object that returns false the first time rows.Next() is called, and a nil error.
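In case the nil error above is the one returned by Query rather than rows.Err(), it is worth checking rows.Err() after the Next() loop as well, since errors from execution often only surface there. A minimal sketch of that check (rows is the result of the db.Query call above; log is assumed imported):

if err != nil {
    log.Fatal(err)
}
defer rows.Close()
for rows.Next() {
    // scan each row here
}
if err := rows.Err(); err != nil {
    // errors that occurred during iteration surface here
    log.Fatal(err)
}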

Related

postgres - how to do batch update with array in where clause

So, I was hoping to do this:
let statement = "update players set walk_count = unnest($1), x = unnest($2), y = unnest($3) where player_id = unnest($4)";
But the error I get is: "set-returning functions are not allowed in WHERE".
The only other way I can solve this is by doing individual updates, but that loop takes a lot of time.
Assuming that each parameter ($1, $2, ...) is an array containing one item for each row you want to update, you should use a single unnest() call for all 4 arrays:
update players
set walk_count = v.wc,
    x = v.x,
    y = v.y
from (
    select *
    from unnest($1, $2, $3, $4)
) as v (wc, x, y, id)
where v.id = players.player_id
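Driving this from Go, the four parameters can be passed as arrays with pq.Array. A minimal sketch, assuming the lib/pq driver is imported and the players columns are bigint (adjust the casts to the real column types):

walkCounts := []int64{3, 7}
xs := []int64{10, 20}
ys := []int64{15, 25}
ids := []int64{1, 2}

_, err := db.Exec(`
    update players
    set walk_count = v.wc, x = v.x, y = v.y
    from (select * from unnest($1::bigint[], $2::bigint[], $3::bigint[], $4::bigint[])) as v (wc, x, y, id)
    where v.id = players.player_id`,
    pq.Array(walkCounts), pq.Array(xs), pq.Array(ys), pq.Array(ids))
if err != nil {
    log.Fatal(err)
}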

Update jsonb set new value select from the same table

I have a table foo:
id | items
---+--------------------------------
1 |{"item_1": {"status": "status_1"}}
2 |{"item_2": {"status": "status_2"}}
...
I need to update all rows in the items column (which is jsonb) and add a new pair ("new_value": "new_value") after the existing {"status": ...} entry; after the update the result must look like this:
id | items
---+------------------------------------------------------------
1 |{"item_1": {"status": "status_1", "new_value": "new_value"}}
2 |{"item_2": {"status": "status_2", "new_value": "new_value"}}
...
I've tried to do this:
WITH result AS (
    INSERT INTO foo (id, items)
    SELECT id, options || newvalue as res
    FROM foo AS bar,
        jsonb_each(bar.items::jsonb) AS item,
        to_jsonb(item.value) AS options,
        jsonb_build_object('new_value', 'new_value') as newvalue
    WHERE id IN ('1', '2'...)
    ON CONFLICT (id)
    DO UPDATE
    SET items = foo.items || Excluded.items::jsonb
    RETURNING *
)
SELECT item.key AS itemkey
FROM result AS res,
    jsonb_each(res.items) AS item,
    to_jsonb(item.value) AS options;
but when I run this script, Postgres shows this error message:
ON CONFLICT DO UPDATE command cannot affect row a second time
I don't understand what I am doing wrong.
UPDATE#1
Postgres version 9.6
In table foo, id is TEXT UNIQUE NOT NULL.
As for why INSERT and not just UPDATE: that was my first mistake.
After some reading about Postgres functions, I finally figured it out:
UPDATE foo as t
SET items = result.new_value
FROM (
    SELECT st.id, new_value
    FROM foo AS st
    CROSS JOIN LATERAL (
        SELECT jsonb_object_agg(the_key, the_value || '{"new_value": "some_new_value"}'::jsonb) AS new_value
        FROM jsonb_each(st.items) AS x(the_key, the_value)
        LIMIT 1
    ) AS n
) AS result
WHERE result.id = t.id;
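In short: jsonb_each(st.items) in the LATERAL subquery splits each row's items object into key/value pairs, the || operator merges {"new_value": "some_new_value"} into every value, and jsonb_object_agg puts the pairs back together into a single object, which the outer UPDATE then writes back to items for the matching id.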

Go postgres `SELECT * IN` using array

I have a simple select statement:
Select * FROM X where X.name in ('bob', 'joe') and X.phone='123'
That works fine in postgres,
In my Go code I have the following code:
var phone string = "123"
var names []string = []string{"bob", "joe"}
sqlStatement := `Select * FROM X where X.name in ($1) and X.phone=$2`
rows, sqlerr := db.Query(sqlStatement, names, phone)
but for some reason I error out from that SQL:
unsupported Scan, storing driver.Value type into type *string
How can I use my names array inside the SQL statement?
Note: if I do a fmt.Printf and paste the SQL statement into Postgres, I do get data back; it also works if I take out the $1 and manually input the strings.
Copying and pasting fragments from some working Go PostgreSQL code:
import (
    "database/sql"
    "github.com/lib/pq"
)
query := `
. . .
WHERE code = ANY($1)
. . .
`
codes := []string{"JFK", "LGA", "EWR"}
rows, err := db.Query(query, pq.Array(codes))
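Applied to the question's statement, a minimal sketch (assuming the lib/pq driver and the table X from the question):

names := []string{"bob", "joe"}
phone := "123"

// = ANY($1) with a Postgres array plays the same role as IN (...)
rows, err := db.Query(
    `SELECT * FROM X WHERE X.name = ANY($1) AND X.phone = $2`,
    pq.Array(names), phone)
if err != nil {
    log.Fatal(err)
}
defer rows.Close()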
I solved this using http://jmoiron.github.io/sqlx/#inQueries
var phone string = "123"
var names []string = []string{"bob", "joe"}
sqlStatement := `Select * FROM X where X.name in (?) and X.phone=?`
sqlStatement, args, err := sqlx.In(sqlStatement, names, phone)
sqlStatement = db.Rebind(sqlStatement)
rows, sqlerr := db.Queryx(sqlStatement, args...)
this now returns correctly.
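For context: sqlx.In expands the slice argument into one ? placeholder per element, and db.Rebind rewrites those ? placeholders into the $1, $2, ... form Postgres expects; Queryx is called on a *sqlx.DB rather than a plain *sql.DB.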
Another way to solve this was to use fmt.Sprintf() and convert the ? placeholders to %s.

How to insert multiple rows into Postgres SQL in one go

Is it possible to insert multiple rows into a Postgres database at once? Could someone please suggest whether there is a way to insert a slice of slices into the database. I have created a slice for each row and another slice (multiple rows) by appending all the row slices to it. How do I insert the slice (multiple rows) into the db?
When I create a row slice, I'm using row := []interface{}{}, because each row has fields that are strings and ints. It looks like I get an error when I'm inserting the data: unsupported type []interface {}, a slice of interface
Implementation:
rowdata := []interface{}{}
row := []interface{}{data.ScenarioUUID, data.Puid, data.Description, data.Status, data.CreatedBy, data.CreatedAt, data.UpdatedBy, data.UpdatedAt, data.ScopeStartsAt, data.ScopeEndsAt, Metric, MetricName, Channel, date, timeRangeValue}
rowdata = append(rowdata, row)
qry2 := `INSERT INTO sample (scenarioUuid,
    puId,
    description,
    status,
    createdBy,
    createdAt,
    updatedBy,
    updatedAt,
    scopeStartsAt,
    scopeEndsAt,
    metric,
    metric_name,
    channel,
    time,
    value) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, $15)`
if _, err := db.Exec(qry2, rowdata); err != nil {
    panic(err)
}
You could do something like this:
samples := // the slice of samples you want to insert
query := `insert into samples (<the list of columns>) values `
values := []interface{}{}
for i, s := range samples {
    values = append(values, s.<field1>, s.<field2>, < ... >)
    numFields := 15 // the number of fields you are inserting
    n := i * numFields
    query += `(`
    for j := 0; j < numFields; j++ {
        query += `$` + strconv.Itoa(n+j+1) + `,`
    }
    query = query[:len(query)-1] + `),`
}
query = query[:len(query)-1] // remove the trailing comma
db.Exec(query, values...)
https://play.golang.org/p/YqNJKybpwWB
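As a concrete (hypothetical) instance of that template, with a made-up Sample struct holding three fields and strconv imported:

type Sample struct {
    ScenarioUUID string
    Metric       string
    Value        int
}

func insertSamples(db *sql.DB, samples []Sample) error {
    query := `insert into samples (scenarioUuid, metric, value) values `
    values := []interface{}{}
    numFields := 3
    for i, s := range samples {
        values = append(values, s.ScenarioUUID, s.Metric, s.Value)
        n := i * numFields
        query += `(`
        for j := 0; j < numFields; j++ {
            query += `$` + strconv.Itoa(n+j+1) + `,`
        }
        query = query[:len(query)-1] + `),`
    }
    query = query[:len(query)-1] // drop the trailing comma
    _, err := db.Exec(query, values...)
    return err
}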

Psycopg2 insert python dictionary in postgres database

In Python 3+, I want to insert values from a dictionary (or pandas dataframe) into a database. I have opted for psycopg2 with a Postgres database.
The problem is that I cannot figure out the proper way to do this. I can easily concatenate a SQL string to execute, but the psycopg2 documentation explicitly warns against this. Ideally I wanted to do something like this:
cur.execute("INSERT INTO table VALUES (%s);", dict_data)
and hoped that the execute could figure out that the keys of the dict matches the columns in the table. This did not work. From the examples of the psycopg2 documentation I got to this approach
cur.execute("INSERT INTO table (" + ", ".join(dict_data.keys()) + ") VALUES (" + ", ".join(["%s" for pair in dict_data]) + ");", dict_data)
from which I get a
TypeError: 'dict' object does not support indexing
What is the most pythonic way of inserting a dictionary into a table with matching column names?
Two solutions:
from psycopg2.extensions import AsIs

d = {'k1': 'v1', 'k2': 'v2'}

insert = 'insert into table (%s) values %s'
l = [(c, v) for c, v in d.items()]
columns = ','.join([t[0] for t in l])
values = tuple([t[1] for t in l])
cursor = conn.cursor()
print(cursor.mogrify(insert, ([AsIs(columns)] + [values])))

keys = d.keys()
columns = ','.join(keys)
values = ','.join(['%({})s'.format(k) for k in keys])
insert = 'insert into table ({0}) values ({1})'.format(columns, values)
print(cursor.mogrify(insert, d))
Output:
insert into table (k2,k1) values ('v2', 'v1')
insert into table (k2,k1) values ('v2','v1')
I sometimes run into this issue, especially with respect to JSON data, which I naturally want to deal with as a dict. Very similar. . .But maybe a little more readable?
def do_insert(rec: dict):
    cols = rec.keys()
    cols_str = ','.join(cols)
    vals = [rec[k] for k in cols]
    vals_str = ','.join(['%s' for i in range(len(vals))])
    sql_str = """INSERT INTO some_table ({}) VALUES ({})""".format(cols_str, vals_str)
    cur.execute(sql_str, vals)
I typically call this type of thing from inside an iterator, and usually wrapped in a try/except. Either the cursor (cur) is already defined in an outer scope or one can amend the function signature and pass a cursor instance in. I rarely insert just a single row. . .And like the other solutions, this allows for missing cols/values provided the underlying schema allows for it too. As long as the dict underlying the keys view is not modified as the insert is taking place, there's no need to specify keys by name as the values will be ordered as they are in the keys view.
[Suggested answer/workaround - better answers are appreciated!]
After some trial/error I got the following to work:
sql = "INSERT INTO table (" + ", ".join(dict_data.keys()) + ") VALUES (" + ", ".join(["%("+k+")s" for k in dict_data]) + ");"
This gives the sql string
"INSERT INTO table (k1, k2, ... , kn) VALUES (%(k1)s, %(k2)s, ... , %(kn)s);"
which may be executed by
with psycopg2.connect(database='deepenergy') as con:
    with con.cursor() as cur:
        cur.execute(sql, dict_data)
Pros/cons?
using %(name)s placeholders may solve the problem:
dict_data = {'key1':val1, 'key2':val2}
cur.execute("""INSERT INTO table (field1, field2)
VALUES (%(key1)s, %(key2)s);""",
dict_data)
You can find the usage in the psycopg2 docs under "Passing parameters to SQL queries".
Here is another solution inserting a dictionary directly
Product Model (has the following database columns)
name
description
price
image
digital - (defaults to False)
quantity
created_at - (defaults to current date)
Solution:
data = {
    "name": "product_name",
    "description": "product_description",
    "price": 1,
    "image": "https",
    "quantity": 2,
}
cur = conn.cursor()
cur.execute(
    "INSERT INTO products (name,description,price,image,quantity) "
    "VALUES(%(name)s, %(description)s, %(price)s, %(image)s, %(quantity)s)", data
)
conn.commit()
conn.close()
Note: The columns to be inserted are specified in the execute statement: .. INTO products (column names to be filled) VALUES ..., data <- the dictionary. Because the VALUES use named %(key)s placeholders, the dictionary just needs a key for every placeholder; the order of the keys does not matter.