Bulk INSERT in Postgres in Go using pgx

I am trying to bulk insert keys into the DB in Go. Here is the code.
Key struct:
type tempKey struct {
    keyVal  string
    lastKey int
}
Test keys:
data := []tempKey{
    {keyVal: "abc", lastKey: 10},
    {keyVal: "dns", lastKey: 11},
    {keyVal: "qwe", lastKey: 12},
    {keyVal: "dss", lastKey: 13},
    {keyVal: "xcmk", lastKey: 14},
}
Insertion part:
dbUrl := "db url...."
conn, err := pgx.Connect(context.Background(), dbUrl)
if err != nil {
println("Errrorr...")
}
defer conn.Close(context.Background())
sqlStr := "INSERT INTO keys (keyval,lastval) VALUES "
dollars := ""
vals := []interface{}{}
count := 1
for _, row := range data {
dollars = fmt.Sprintf("%s($%d, $%d),", dollars, count, count+1)
vals = append(vals, row.keyVal, row.lastKey)
count += 2
}
sqlStr += dollars
sqlStr = sqlStr[0 : len(sqlStr)-1]
fmt.Printf("%s \n", sqlStr)
_, erro := conn.Exec(context.Background(), sqlStr, vals)
if erro != nil {
fmt.Fprint(os.Stderr, "Error : \n", erro)
}
On running, it throws the error: expected 10 arguments, got 1.
What is the correct way of bulk inserting?

You are crafting the SQL statement by hand, which is fine, but you are not leveraging pgx, which can help with this (see below).
The immediate cause of the error is the last argument to Exec: vals is passed as a single []interface{} value, so pgx sees one argument for ten placeholders. Spread it with conn.Exec(context.Background(), sqlStr, vals...).
Also, appending to the SQL string like this can be inefficient for large inputs:
dollars = fmt.Sprintf("%s($%d, $%d),", dollars, count, count+1)
Note that the truncation line
sqlStr = sqlStr[0 : len(sqlStr)-1] // strips the trailing comma
is what removes the trailing , left by the loop; a terminating ; is not required for a single statement.
Better to use something more performant like strings.Builder when crafting long strings.
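For illustration, a minimal sketch of the hand-built statement using strings.Builder, reusing the data slice and conn from the question (the keys(keyval, lastval) table is as in your SQL); note the vals... spread in Exec:
var b strings.Builder
b.WriteString("INSERT INTO keys (keyval, lastval) VALUES ")
vals := make([]interface{}, 0, len(data)*2)
for i, row := range data {
    if i > 0 {
        b.WriteString(",")
    }
    // two placeholders per row: ($1, $2), ($3, $4), ...
    fmt.Fprintf(&b, "($%d, $%d)", i*2+1, i*2+2)
    vals = append(vals, row.keyVal, row.lastKey)
}
// spread vals so each element arrives as its own argument
if _, err := conn.Exec(context.Background(), b.String(), vals...); err != nil {
    fmt.Fprintln(os.Stderr, "Error:", err)
}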
From the pgx docs, use pgx.Conn.CopyFrom (shown here with the v4 signature, matching the context-aware API your code already uses):
func (c *Conn) CopyFrom(ctx context.Context, tableName Identifier, columnNames []string, rowSrc CopyFromSource) (int64, error)
CopyFrom uses the PostgreSQL copy protocol to perform bulk data
insertion. It returns the number of rows copied and an error.
Example usage of CopyFrom:
rows := [][]interface{}{
    {"John", "Smith", int32(36)},
    {"Jane", "Doe", int32(29)},
}
copyCount, err := conn.CopyFrom(
    context.Background(),
    pgx.Identifier{"people"},
    []string{"first_name", "last_name", "age"},
    pgx.CopyFromRows(rows),
)
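Applied to the data slice from the question, a sketch (assuming the keys(keyval, lastval) table from your hand-built INSERT):
// convert the tempKey structs into the row format CopyFrom expects
rows := make([][]interface{}, 0, len(data))
for _, k := range data {
    rows = append(rows, []interface{}{k.keyVal, k.lastKey})
}
copyCount, err := conn.CopyFrom(
    context.Background(),
    pgx.Identifier{"keys"},
    []string{"keyval", "lastval"},
    pgx.CopyFromRows(rows),
)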

Alternatively, use a batch (https://github.com/jackc/pgx/blob/master/batch_test.go):
batch := &pgx.Batch{}
batch.Queue("insert into ledger(description, amount) values($1, $2)", "q1", 1)
batch.Queue("insert into ledger(description, amount) values($1, $2)", "q2", 2)
br := conn.SendBatch(context.Background(), batch)
// Close flushes the queued statements and must be called before the
// connection is used again; it returns the first error, if any.
err = br.Close()

Related

How to get row value(s) back after db insert?

I am using Golang to insert data into a DB. Basically my query looks like the one below:
var cols = "(col1, col2, col3)"
var values = "($1, $2, $3)"
var query = fmt.Sprintf("INSERT INTO %s %s VALUES %s", myTable, cols, values)
res, err := db.Exec(query, thing.val1, thing.val2, thing.val3)
The only things available from res are LastInsertId and the number of rows affected. But what I need is the generated ID: I insert data into a psql database which has an AUTOINCREMENT id column, so I want the data back with that.
For example - with Java hibernate I can do what this answer explains. I don't have to re-query the DB for the ID.
EDIT: I tried to use the LastInsertId method and got this error:
LastInsertId is not supported by this driver
Assuming you just want the auto-incremented value(s) in a column called id, and this is an insert with the pq driver:
var cols = "(col1, col2, col3)"
var values = "($1, $2, $3)"
var query = fmt.Sprintf(
"INSERT INTO %s %s VALUES %s RETURNING id",
myTable, cols, values,
)
var id int
if err := db.QueryRow(
query,
thing.val1, thing.val2, thing.val3,
).Scan(&id); err != nil {
panic(err)
}
fmt.Println("ID: ", id)
For multiple inserts:
var cols = "(col1, col2, col3)"
var values = "($1, $2, $3),($4, $5, $6)"
var query = fmt.Sprintf(
    "INSERT INTO %s %s VALUES %s RETURNING id",
    myTable, cols, values,
)

var ids []int
rows, err := db.Query(
    query,
    thing.val1, thing.val2, thing.val3,
    thing.val4, thing.val5, thing.val6,
)
if err != nil {
    panic(err)
}
defer rows.Close()
for rows.Next() {
    var id int
    if err := rows.Scan(&id); err != nil {
        panic(err)
    }
    ids = append(ids, id)
}
fmt.Println("IDs: ", ids)
res.LastInsertId() is not supported by the Postgres driver; it is, however, supported by the MySQL driver.
db.Exec() doesn't return the last inserted id, but db.QueryRow() with a RETURNING clause does.
Here is an example which might help you:
var id int
err := db.QueryRow("INSERT INTO user (name) VALUES ('John') RETURNING id").Scan(&id)
if err != nil {
    ...
}

Converting MongoDB $max result to golang data

I am trying to get max values from a MongoDB collection from my Go code.
What type should I use to decode the result?
When I use bson.D{} as val2's type, the result looks like [{_id <nil>} {max 66} {cnt 14}].
Here's the code:
filter := []bson.M{{
    "$group": bson.M{
        "_id": nil,
        "max": bson.M{"$max": "$hellid"},
    }},
}
cursor, err := collection.Aggregate(ctx, filter)
for cursor.Next(ctx) {
    val2 := ???
    err := cursor.Decode(&val2)
    fmt.Printf("cursor: %v, value: %v\n", cursor.Current, val2)
}
Using bson.D already works, as you presented. The problem may be that you can't "easily" get the max and cnt values out.
Model your result document with a struct like this:
type result struct {
    Max   int `bson:"max"`
    Count int `bson:"cnt"`
}
(Although cnt is not produced by the example code you provided.)
And then:
var res result
err := cursor.Decode(&res)
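Put together, a minimal sketch with the official mongo-go-driver (the cnt stage using $sum is an assumption, added to match the [{_id <nil>} {max 66} {cnt 14}] output above):
pipeline := []bson.M{{
    "$group": bson.M{
        "_id": nil,
        "max": bson.M{"$max": "$hellid"},
        "cnt": bson.M{"$sum": 1},
    },
}}
cursor, err := collection.Aggregate(ctx, pipeline)
if err != nil {
    log.Fatal(err)
}
defer cursor.Close(ctx)
for cursor.Next(ctx) {
    var res result
    if err := cursor.Decode(&res); err != nil {
        log.Fatal(err)
    }
    fmt.Printf("max: %d, count: %d\n", res.Max, res.Count)
}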

Inserting array of custom types into postgres

I'm trying to insert a row with a column that is an array of a custom type (ingredient). My tables are:
CREATE TYPE ingredient AS (
    name text,
    quantity text,
    unit text
);
CREATE TABLE IF NOT EXISTS recipes (
    recipe_id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
    name text,
    ingredients ingredient[],
    -- ...
);
Using raw SQL, I can insert a row by:
INSERT INTO recipes (name, ingredients) VALUES ('some_name', ARRAY[ROW('aa', 'bb', 'cc'), ROW('xx', 'yy', 'zz')]::ingredient[]);
But I'm struggling to do this in Go with the pq lib. I've implemented the driver.Valuer interface:
type Ingredient struct {
    Name     string
    Quantity string
    Unit     string
}
type Ingredients []*Ingredient

func (ings *Ingredients) ConvertValue(v interface{}) (driver.Value, error) {
    return "something", nil
}
func (ings *Ingredients) Value() (driver.Value, error) {
    val := `ARRAY[`
    for i, ing := range *ings {
        if i != 0 {
            val += ","
        }
        val += fmt.Sprintf(`ROW('%v','%v','%v')`, ing.Name, ing.Quantity, ing.Unit)
    }
    val += `]::ingredient[]`
    return val, nil
}
// and then trying to insert via:
stmt := `INSERT INTO recipes (
    name,
    ingredients
)
VALUES ($1, $2)
`
_, err := db.Exec(stmt,
    "some_name",
    &Ingredients{
        &Ingredient{"flour", "3", "cups"},
    },
)
But pq keeps throwing the error:
Error inserting: pq: malformed array literal: "ARRAY[ROW('flour','3','cups')]::ingredient[]"
Am I returning an incorrect driver.Value?
You can either use the approach outlined here: https://github.com/lib/pq/issues/544
type Ingredient struct {
    Name     string
    Quantity string
    Unit     string
}

func (i *Ingredient) Value() (driver.Value, error) {
    return fmt.Sprintf("('%s','%s','%s')", i.Name, i.Quantity, i.Unit), nil
}

stmt := `INSERT INTO recipes (name, ingredients) VALUES ($1, $2::ingredient[])`
db.Exec(stmt, "some_name", pq.Array([]*Ingredient{{"flour", "3", "cups"}}))
Or, if you have records in the table and you query it, you will probably see the ingredient array in its literal form, which you can then mimic during insert:
func (ings *Ingredients) Value() (driver.Value, error) {
    val := `{`
    for i, ing := range *ings {
        if i != 0 {
            val += ","
        }
        val += fmt.Sprintf(`"('%s','%s','%s')"`, ing.Name, ing.Quantity, ing.Unit)
    }
    val += `}`
    return val, nil
}

// e.g. `{"('flour','3','cups')"}`
stmt := `INSERT INTO recipes (name, ingredients) VALUES ($1, $2::ingredient[])`
// ...
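Hypothetical usage of this version: since Ingredients itself implements driver.Valuer, pass a pointer to it directly instead of wrapping it with pq.Array:
// &Ingredients{...} implements driver.Valuer, so no pq.Array wrapper is needed
_, err := db.Exec(stmt, "some_name", &Ingredients{
    {"flour", "3", "cups"},
})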
It seems your database design is quite complicated and does not play to the strengths (and weaknesses) of SQL.
May I suggest you split the ingredients into their own table with a reference to the recipe? Then retrieving a full recipe is a JOIN operation.
Creating the DB:
CREATE TABLE ingredients (
    recipe_id uuid,
    name text,
    quantity int,
    unit text
);
CREATE TABLE recipes (
    recipe_id uuid PRIMARY KEY,
    name text
);
Inserting a recipe and querying to read it out:
INSERT INTO recipes VALUES (
    '5d1cb631-37bd-46cc-a278-4c8558ed8964', 'cake1'
);
INSERT INTO ingredients (recipe_id, name, quantity, unit) VALUES
    ('5d1cb631-37bd-46cc-a278-4c8558ed8964', 'flour', 3, 'cups'),
    ('5d1cb631-37bd-46cc-a278-4c8558ed8964', 'water', 1, 'cups');
SELECT r.name, i.name, i.quantity, i.unit
FROM ingredients AS i
INNER JOIN recipes AS r ON r.recipe_id = i.recipe_id;
SQLFiddle link: http://sqlfiddle.com/#!17/262ad/14
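And on the Go side, a sketch of reading that JOIN back with database/sql (variable names are illustrative):
rows, err := db.Query(`
    SELECT r.name, i.name, i.quantity, i.unit
    FROM ingredients AS i
    INNER JOIN recipes AS r ON r.recipe_id = i.recipe_id`)
if err != nil {
    log.Fatal(err)
}
defer rows.Close()
for rows.Next() {
    var recipe, ingredient, unit string
    var quantity int
    // columns arrive in SELECT order: recipe name, ingredient name, quantity, unit
    if err := rows.Scan(&recipe, &ingredient, &quantity, &unit); err != nil {
        log.Fatal(err)
    }
    fmt.Println(recipe, ingredient, quantity, unit)
}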

Golang Postgresql Array

If I have a table that returns something like:
id: 1
names: {Jim, Bob, Sam}
names is a varchar array.
How do I scan that back into a []string in Go?
I'm using lib/pq.
Right now I have something like:
rows, err := models.Db.Query("SELECT pKey, names FROM foo")
for rows.Next() {
    var pKey int
    var names []string
    err = rows.Scan(&pKey, &names)
}
I keep getting:
panic: sql: Scan error on column index 1: unsupported Scan, storing driver.Value type []uint8 into type *[]string
It looks like I need to use StringArray
https://godoc.org/github.com/lib/pq#StringArray
But, I think I'm too new to Go to understand exactly how to use:
func (a *StringArray) Scan(src interface{})
You are right, you can use StringArray, but you don't need to call the
func (a *StringArray) Scan(src interface{})
method yourself; it will be called automatically by rows.Scan when you pass it anything that implements the Scanner interface.
So what you need to do is convert your *[]string to a *StringArray and pass that to rows.Scan, like so:
rows, err := models.Db.Query("SELECT pKey, names FROM foo")
for rows.Next() {
    var pKey int
    var names []string
    err = rows.Scan(&pKey, (*pq.StringArray)(&names))
}
Long story short, use pq.Array like this to convert a PostgreSQL array into a Go slice; here the 5th column comes back as an array:
var _temp3 []string
for rows.Next() {
    // wrap the array column in pq.Array when scanning the row
    err := rows.Scan(&_temp, &_temp0, &_temp1, &_temp2, pq.Array(&_temp3))
In detail:
To insert a row that contains an array value, use the pq.Array function like this:
// "ins" is the SQL insert statement
ins := "INSERT INTO posts (title, tags) VALUES ($1, $2)"
// "tags" is the list of tags, as a string slice
tags := []string{"go", "goroutines", "queues"}
// the pq.Array function is the secret sauce
_, err = db.Exec(ins, "Job Queues in Go", pq.Array(tags))
To read a Postgres array value into a Go slice, use:
func getTags(db *sql.DB, title string) (tags []string) {
    // the select query, returning 1 column of array type
    sel := "SELECT tags FROM posts WHERE title=$1"
    // wrap the output parameter in pq.Array for receiving into it
    if err := db.QueryRow(sel, title).Scan(pq.Array(&tags)); err != nil {
        log.Fatal(err)
    }
    return
}
Note that in lib/pq, only slices of certain Go types may be passed to pq.Array().
Another example, in which a varchar array is generated at runtime in the 5th column, like:
--> predefined_allow false admin iam.create {secrets,configMap}
I scanned it like this:
Q := "SELECT ar.policy_name, ar.allow, ar.role_name, pro.operation_name, ARRAY_AGG(pro.resource_id) as resources FROM iam.authorization_rules ar LEFT JOIN iam.policy_rules_by_operation pro ON pro.id = ar.operation_id GROUP BY ar.policy_name, ar.allow, ar.role_name, pro.operation_name;"
tx := g.db.Raw(Q)
rows, _ := tx.Rows()
defer rows.Close()
var _temp string
var _temp0 bool
var _temp1 string
var _temp2 string
var _temp3 []string
for rows.Next() {
// ScanRows scan a row into temp_tbl
err := rows.Scan(&_temp, &_temp0, &_temp1, &_temp2, pq.Array(&_temp3))
if err != nil {
return nil, err
}
fmt.Println("Query Executed...........\n", _temp, _temp0, _temp1, _temp2, _temp3)
}
Output :
Query Executed...........
predefined_allow false admin iam.create [secrets configMap]

Using postgres IN clause in golang

I have been trying to use the postgres IN clause in golang, but keep getting errors. This is the query I want to execute.
SELECT id1 FROM my_table WHERE type = (an int) AND id2 = (an int) AND id1 IN (list of UUIDs)
I used this code to construct the query but got the following error:
var params []interface{}
inCondition := ""
params = append(params, type)
params = append(params, id2)
for _, id := range id1 {
    params = append(params, id)
    if inCondition != "" {
        inCondition += ", "
    }
    inCondition += "?"
}
query := fmt.Sprintf(`SELECT id1 FROM my_table WHERE type = ? AND id2 = ? AND id1 IN (%s)`, inCondition)
rows, err := db.Query(query, params...)
Query I got:
SELECT id1 FROM my_table WHERE type = ? AND id2 = ? AND id1 IN (?, ?, ?)
Params output:
[]interface {}=[0 7545449 d323f8d5-ab97-46a3-a34e-95ceac2f3a6a d323f8d5-ab97-46a3-a34e-95ceac2f3a6b d323f8d5-ab97-46a3-a34e-95ceac2f3a6d]
Error:
pq: syntax error at or near \"AND\""
What am I missing? Or how can I get this to work? id1 is a slice of UUIDs whose length is variable.
Ran into a similar issue. I can't remember exactly where I picked this up, but I remember still running into issues when dealing with arrays of type integer vs. string. What I had to do was create a local custom type and return a driver-compatible value for it. See the sample below.
// Int64Array is a type implementing the sql/driver Valuer interface.
// This is due to the native driver not supporting arrays...
type Int64Array []int64

// Value returns the driver-compatible value
func (a Int64Array) Value() (driver.Value, error) {
    var strs []string
    for _, i := range a {
        strs = append(strs, strconv.FormatInt(i, 10))
    }
    return "{" + strings.Join(strs, ",") + "}", nil
}
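Hypothetical usage of this type: because Value() produces an array literal like {1,2,3}, compare against it with = ANY and an explicit cast rather than IN (the column names mirror the question):
ids := Int64Array{1, 2, 3}
// the driver sends "{1,2,3}"; the cast tells Postgres the element type
rows, err := db.Query(
    "SELECT id1 FROM my_table WHERE id2 = ANY($1::bigint[])",
    ids,
)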
Highly recommend checking out sqlx. I wrote a simple ORM wrapper called papergres to make my Go + Postgres life easier :) Give it a try.
Instead of ?, using $1, $2, etc. as placeholders worked.
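A sketch of the asker's loop rewritten with $N placeholders (typeVal stands in for the reserved word type used in the question):
params := []interface{}{typeVal, id2}
placeholders := make([]string, 0, len(id1))
for i := range id1 {
    params = append(params, id1[i])
    // $1 and $2 are taken by type and id2, so the IN list starts at $3
    placeholders = append(placeholders, fmt.Sprintf("$%d", i+3))
}
query := fmt.Sprintf(
    `SELECT id1 FROM my_table WHERE type = $1 AND id2 = $2 AND id1 IN (%s)`,
    strings.Join(placeholders, ", "),
)
rows, err := db.Query(query, params...)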