Using postgres IN clause in golang - postgresql

I have been trying to use the postgres IN clause in golang, but keep getting errors. This is the query I want to execute.
SELECT id1 FROM my_table WHERE type = (an int) AND id2 = (an int) AND id1 IN (list of UUIDs)
I used this code to construct this query but got the following error.
var params []interface{}
inCondition := ""
params = append(params, typ) // "typ", since "type" is a reserved word in Go
params = append(params, id2)
for _, id := range id1 {
    params = append(params, id)
    if inCondition != "" {
        inCondition += ", "
    }
    inCondition += "?"
}
query := fmt.Sprintf(`SELECT id1 FROM my_table WHERE type = ? AND id2 = ? AND id1 IN (%s)`, inCondition)
rows, err := db.Query(query, params...)
Query I got:
SELECT id1 FROM my_table WHERE type = ? AND id2 = ? AND id1 IN (?, ?, ?)
Params output:
[]interface {}=[0 7545449 d323f8d5-ab97-46a3-a34e-95ceac2f3a6a d323f8d5-ab97-46a3-a34e-95ceac2f3a6b d323f8d5-ab97-46a3-a34e-95ceac2f3a6d]
Error:
pq: syntax error at or near "AND"
What am I missing? Or, how can I get this to work? id1 is a slice of UUIDs whose length is variable.

I ran into a similar issue. I can't remember exactly where I picked this up, but I remember still running into issues when dealing with arrays of type integer vs. string. What I had to do was define a local custom type and return a driver-compatible value for it. See the sample below.
import (
    "database/sql/driver"
    "strconv"
    "strings"
)

// Int64Array is a type implementing the sql/driver Valuer interface.
// This is needed because the native driver does not support arrays.
type Int64Array []int64

// Value returns the driver-compatible value: a Postgres array
// literal such as "{1,2,3}".
func (a Int64Array) Value() (driver.Value, error) {
    var strs []string
    for _, i := range a {
        strs = append(strs, strconv.FormatInt(i, 10))
    }
    return "{" + strings.Join(strs, ",") + "}", nil
}
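For illustration, here is a minimal usage sketch (the table and column names are hypothetical); since Value() produces a Postgres array literal like {1,2,3}, the value can be passed straight to a parameterized query:
ids := Int64Array{1, 2, 3}

// Insert into a bigint[] column; database/sql calls Value() automatically.
_, err := db.Exec(`INSERT INTO jobs (worker_ids) VALUES ($1)`, ids)

// Or filter with ANY, casting the array literal on the server side.
rows, err := db.Query(`SELECT id FROM jobs WHERE id = ANY($1::bigint[])`, ids)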
I highly recommend checking out sqlx. I also wrote a simple ORM wrapper called papergres to make my Go + Postgres life easier :) Give it a try.
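As a side note, sqlx's In helper addresses the variable-length IN list from the original question directly. A minimal sketch, assuming db is an *sqlx.DB and reusing the names from the question:
query, args, err := sqlx.In(
    `SELECT id1 FROM my_table WHERE type = ? AND id2 = ? AND id1 IN (?)`,
    typ, id2, id1) // id1 is the slice of UUID strings
if err != nil {
    return err
}
// Rebind rewrites the expanded ? placeholders to $1, $2, ... for Postgres.
rows, err := db.Queryx(db.Rebind(query), args...)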

Instead of ?, using $1, $2, etc. as placeholders worked.
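To make that concrete, here is the loop from the question adapted to emit numbered placeholders (my adaptation, not the original poster's code):
var params []interface{}
inCondition := ""
params = append(params, typ, id2) // becomes $1 and $2
for _, id := range id1 {
    params = append(params, id)
    if inCondition != "" {
        inCondition += ", "
    }
    // lib/pq expects numbered placeholders: $3, $4, ...
    inCondition += fmt.Sprintf("$%d", len(params))
}
query := fmt.Sprintf(`SELECT id1 FROM my_table WHERE type = $1 AND id2 = $2 AND id1 IN (%s)`, inCondition)
rows, err := db.Query(query, params...)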

Related

How to use pgcrypto with go-pg for column encryption?

Could you please help me with my problem? I'm trying to use column encryption for Postgres, and there is one small question: how can I transform a column value (e.g. "test_value") into PGP_SYM_ENCRYPT('test_value', 'KEY') in the INSERT SQL query?
As I understand it, custom types could be the solution, but some things aren't clear... Does anyone have an example for my case?
(I see these AWS docs about using pgcrypto: https://docs.aws.amazon.com/dms/latest/sql-server-to-aurora-postgresql-migration-playbook/chap-sql-server-aurora-pg.security.columnencryption.html)
What I did:
type sstring struct {
    string
}

var _ types.ValueAppender = (*sstring)(nil)

func (tm sstring) AppendValue(b []byte, flags int) ([]byte, error) {
    if flags == 1 {
        b = append(b, '\'')
    }
    b = []byte("PGP_SYM_ENCRYPT('123456', 'AES_KEY')")
    if flags == 1 {
        b = append(b, '\'')
    }
    return b, nil
}

var _ types.ValueScanner = (*sstring)(nil)

func (tm *sstring) ScanValue(rd types.Reader, n int) error {
    if n <= 0 {
        tm.string = ""
        return nil
    }
    tmp, err := rd.ReadFullTemp()
    if err != nil {
        return err
    }
    tm.string = string(tmp)
    return nil
}

type model struct {
    ID     uint    `pg:"id"`
    Name   string  `pg:"name"`
    Crypto sstring `pg:"crypto,type:sstring"`

    tableName struct{} `pg:"models"`
}
----------
_, err := r.h.ModelContext(ctx, model).Insert()
And... the process just does nothing. It doesn't respond, doesn't fail, doesn't create a row in the SQL table... Nothing.
Anyway, my question is: how do I wrap a column in an SQL function using the go-pg ORM?
I tried to use https://github.com/go-pg/pg/blob/v10/example_custom_test.go#L13-L49 as a custom type handler example... but something went wrong. =(
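In plain database/sql (not go-pg), what I'm after would look something like this sketch (table and column names from the model above; the key handling is purely illustrative):
// The value and key travel as parameters; the server-side pgcrypto
// function wraps the value at insert time.
_, err := db.ExecContext(ctx,
    `INSERT INTO models (name, crypto) VALUES ($1, PGP_SYM_ENCRYPT($2, $3))`,
    name, secret, encryptionKey)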

Go: Unable to unmarshal json value

I have a jsonb value stored in a Postgres column which I need to convert to an object to send in the response.
I am unable to convert the JSON to a struct using the json.Unmarshal method.
type Result struct {
    Id        int
    Expertise string // because json string
    Languages string // because json string
}

// saving db query data to the result struct
e, _ := json.Marshal(data.Expertise)
l, _ := json.Marshal(data.Languages)
row := r.db.QueryRow(`
    INSERT INTO filters (user_id, expertise, languages)
    VALUES ($1, $2, $3)
    ON CONFLICT (user_id) DO
    UPDATE SET expertise=$2, languages=$3
    RETURNING id, expertise, languages;
    `, userid, e, l)
var res Result
err := row.Scan(&res.Id, &res.Expertise, &res.Languages)
Then I take the Expertise and Languages fields and unmarshal them:
// helper method
func Unmarshal(value string) interface{} {
    var obj interface{}
    json.Unmarshal([]byte(value), &obj)
    return obj
}
type CreateFilterResponse struct {
    Id        int         `json:"id"`
    Expertise interface{} `json:"expertise"`
    Languages interface{} `json:"languages"`
}

response := dto.CreateFilterResponse{
    Id:        res.Id,
    Expertise: Unmarshal(res.Expertise), // 👈 not decoding, still json
    Languages: Unmarshal(res.Languages), // 👈 not decoding, still json
}
I would really appreciate the help; I need Expertise and Languages to be {} and [] respectively.
This is what I am getting in the response:
{"id":5,"expertise":"[{\"role\":1,\"experience\":5},{\"role\":3,\"experience\":4}]","languages":"[1,2,3]"}
Note:
Expertise is jsonb column : []{role: int, experience: int}
Languages is jsonb column : []int
Thanks to Blackgreen's comment, the suggestion worked.
As he suggested: "I believe the issue is that you are scanning the jsonb bytes into string, which causes it to be interpreted literally. Then when you pass a string into json.Unmarshal it stays a string. Change the type of Expertise and Language to []byte (or json.RawMessage)"
I did the same and it works, here is the code:
// for parsing jsonb coming from the db query
type Result struct {
    Id        int
    Expertise json.RawMessage // change
    Languages json.RawMessage // change
}

// the database query itself
e, _ := json.Marshal(filters.Expertise)
l, _ := json.Marshal(filters.Languages)
row := r.db.QueryRow(`
    INSERT INTO filters (user_id, expertise, languages)
    VALUES ($1, $2, $3)
    ON CONFLICT (user_id) DO
    UPDATE SET expertise=$2, languages=$3
    RETURNING id, expertise, languages;
    `, userid, e, l)
var res Result
err := row.Scan(&res.Id, &res.Expertise, &res.Languages) // scanning to result
if err != nil {
    fmt.Println("Unable to create a new filter ==>", err)
    return nil, err
}
Then I unmarshalled the expertise and languages jsonb values using a helper method and created a response struct for the client:
// response struct type
type CreateFilterResponse struct {
    Id        int         `json:"id"`
    Expertise interface{} `json:"expertise"`
    Languages interface{} `json:"languages"`
}

// final response struct
response := dto.CreateFilterResponse{
    Id:        res.Id,
    Expertise: Unmarshal(res.Expertise),
    Languages: Unmarshal(res.Languages),
}

// helper method
func Unmarshal(value []byte) interface{} {
    var obj interface{}
    json.Unmarshal(value, &obj)
    return obj
}
This works for now, but I still need to find an easier way; this is too much boilerplate code for such a simple task. In JS/TS it can be done with a single line: JSON.parse(value).
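One way to trim that boilerplate (a sketch of my own, assuming Go 1.18+ generics) is a small wrapper type implementing sql.Scanner, so row.Scan unmarshals a jsonb column directly:
// JSONColumn unmarshals a jsonb column into V during row.Scan.
type JSONColumn[T any] struct {
    V T
}

// Scan implements sql.Scanner.
func (j *JSONColumn[T]) Scan(src any) error {
    if src == nil {
        return nil
    }
    b, ok := src.([]byte)
    if !ok {
        return fmt.Errorf("JSONColumn: expected []byte, got %T", src)
    }
    return json.Unmarshal(b, &j.V)
}
With that, the intermediate Result struct and the Unmarshal helper go away: scan into, e.g., a JSONColumn[[]int] for languages and use its V field in the response.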
The problem is in the JSON output: your JSON is being encoded twice, which makes it difficult to read and to deserialize into your structure.
You can start by organizing your structures as follows:
type Expertise struct {
    Role       int `json:"role"`
    Experience int `json:"experience"`
}

type CreateFilterResponse struct {
    Id        int         `json:"id"`
    Expert    []Expertise `json:"expertise"`
    Languages []int       `json:"languages"`
}
I made a cleaner method just to illustrate and facilitate understanding. It removes the unnecessary quotes and escapes so you can convert your string to []byte and unmarshal it into your structure.
func jsonCleaner(quoted string) []byte {
    quoted = strings.ReplaceAll(quoted, "\n", "")
    quoted = strings.ReplaceAll(quoted, "\\", "")
    quoted = strings.ReplaceAll(quoted, "]\"", "]")
    quoted = strings.ReplaceAll(quoted, "\"[", "[")
    dataBytes := []byte(quoted)
    fmt.Println(dataBytes)
    return dataBytes
}
Your json message would go from:
{"id":5,"expertise":"[{\"role\":1,\"experience\":5},{\"role\":3,\"experience\":4}]","languages":"[1,2,3]"}
To:
{"id":5,"expertise":[{"role":1,"experience":5},"role":3,"experience":4}],"languages":[1,2,3]}

How to get row value(s) back after db insert?

I am using Golang to insert data into a DB. Basically my query looks like this:
var cols = "(col1, col2, col3)"
var values = "($1, $2, $3)"
var query = fmt.Sprintf("INSERT INTO %s %s VALUES %s", myTable, cols, values)
res, err := db.Exec(query, thing.val1, thing.val2, thing.val3)
The only things available from res are LastInsertId and the number of rows affected. But what I need is the inserted row itself: I insert into a psql database which has an AUTOINCREMENT id column, so I want the data back with that generated id.
For example - with Java hibernate I can do what this answer explains. I don't have to re-query the DB for the ID.
EDIT: I tried to use the LastInsertId method and got this error:
LastInsertId is not supported by this driver
Assuming you just want the auto-incremented value(s) in a column called id, and this is an insert with the pq driver:
var cols = "(col1, col2, col3)"
var values = "($1, $2, $3)"
var query = fmt.Sprintf(
    "INSERT INTO %s %s VALUES %s RETURNING id",
    myTable, cols, values,
)

var id int
if err := db.QueryRow(
    query,
    thing.val1, thing.val2, thing.val3,
).Scan(&id); err != nil {
    panic(err)
}
fmt.Println("ID: ", id)
For multiple inserts:
var cols = "(col1, col2, col3)"
var values = "($1, $2, $3),($4, $5, $6)"
var query = fmt.Sprintf(
    "INSERT INTO %s %s VALUES %s RETURNING id",
    myTable, cols, values,
)

var ids []int
rows, err := db.Query(
    query,
    thing.val1, thing.val2, thing.val3,
    thing.val4, thing.val5, thing.val6,
)
if err != nil {
    panic(err)
}
for rows.Next() {
    var id int
    if err := rows.Scan(&id); err != nil {
        panic(err)
    }
    ids = append(ids, id)
}
fmt.Println("IDs: ", ids)
res.LastInsertId() is not supported by the Postgres driver; it is, however, supported by the MySQL driver.
db.Exec() doesn't give you the last inserted id, but db.QueryRow() with a RETURNING clause does.
For a better understanding you can refer to this link.
Here is an example which might help you:
var id int
// note: "user" must be double-quoted here, since it is a reserved word in Postgres
err := db.QueryRow(`INSERT INTO "user" (name) VALUES ('John') RETURNING id`).Scan(&id)
if err != nil {
    ...
}
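If you need more than the id back, RETURNING can list several columns. A sketch under an assumed posts(title, created_at) table, not from the original answer:
var (
    id        int
    createdAt time.Time
)
err := db.QueryRow(
    `INSERT INTO posts (title) VALUES ($1) RETURNING id, created_at`,
    "hello",
).Scan(&id, &createdAt)
if err != nil {
    panic(err)
}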

How to add an element to a json field array in Postgres

I'm trying to append data to an array that belongs to a json field in Postgres. From using pgAdmin I know the following query works:
UPDATE lesson SET data =
jsonb_set (data, '{pages, 999999}', '{"pageNum": 2, "pageType": "voc"}', True)
WHERE id = 2;
I am simply trying to get the above query to work via my REST API written in Go, but I am getting an error that reads "pq: invalid input syntax for type json".
My code is as follows:
_, err := db.Exec(`
    UPDATE lessons SET data =
        jsonb_set(data, '{pages, 999999}', '{"pageNum": $1, "pageType": $2}', True)
    WHERE id = $3`,
    pageNum, pageType, id) // variable types are int, string, int
I suspect that the Postgres driver isn't interpolating the $ parameters. It will work if I use fmt.Sprintf() for the whole query, but I am trying to avoid SQL injection attacks and would like to take advantage of the built-in security measures of the Go sql library.
For reference, my data is structured as follows:
Lessons table:
    id   int
    data jsonb
Go structs:
type Lesson struct {
    ID    int    `json:"id"`
    Name  string `json:"name"`
    Pages []Page `json:"pages"`
}

type Page struct {
    PageNum  int    `json:"pageNum"`
    PageType string `json:"pageType"`
}
You cannot use query parameters inside a string literal in Postgres. Either pass the entire string to Postgres as a single parameter:
str := fmt.Sprintf(`{"pageNum": %d, "pageType": %q}`, pageNum, pageType)
_, err := db.Exec(`
    UPDATE lessons SET data =
        jsonb_set(data, '{pages, 999999}', $1, True)
    WHERE id = $2`,
    str, id)
or use string concatenation to do it on the server side:
_, err := db.Exec(`
    UPDATE lessons SET data =
        jsonb_set(data, '{pages, 999999}', '{"pageNum": ' || $1 || ', "pageType": ' || $2 || '}', True)
    WHERE id = $3`,
    pageNum, pageType, id) // variable types are int, string, int
The best/safest is probably the first approach, with full JSON marshaling in your client rather than a simple fmt.Sprintf. I leave that as an exercise for the reader.
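For what it's worth, a minimal sketch of that marshaling approach might look like this (it reuses the Page struct from the question; the error handling is illustrative):
page := Page{PageNum: pageNum, PageType: pageType}
b, err := json.Marshal(page) // yields e.g. {"pageNum":2,"pageType":"voc"}
if err != nil {
    log.Fatal(err)
}
_, err = db.Exec(`
    UPDATE lessons SET data =
        jsonb_set(data, '{pages, 999999}', $1, True)
    WHERE id = $2`,
    b, id)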

Golang Postgresql Array

If I have a table that returns something like:
id: 1
names: {Jim, Bob, Sam}
names is a varchar array.
How do I scan that back into a []string in Go?
I'm using lib/pq.
Right now I have something like:
rows, err := models.Db.Query("SELECT pKey, names FROM foo")
for rows.Next() {
    var pKey int
    var names []string
    err = rows.Scan(&pKey, &names)
}
I keep getting:
panic: sql: Scan error on column index 1: unsupported Scan, storing driver.Value type []uint8 into type *[]string
It looks like I need to use StringArray
https://godoc.org/github.com/lib/pq#StringArray
But, I think I'm too new to Go to understand exactly how to use:
func (a *StringArray) Scan(src interface{})
You are right, you can use StringArray, but you don't need to call the
func (a *StringArray) Scan(src interface{})
method yourself; it will be called automatically by rows.Scan when you pass it anything that implements the Scanner interface.
So what you need to do is convert your *[]string to a *StringArray and pass that to rows.Scan, like so:
rows, err := models.Db.Query("SELECT pKey, names FROM foo")
for rows.Next() {
    var pKey int
    var names []string
    err = rows.Scan(&pKey, (*pq.StringArray)(&names))
}
Long story short, use pq.Array like this to convert a pgSQL array into a Go slice; here the 5th column comes back as an array:
var _temp3 []string
for rows.Next() {
    // scan a row, wrapping the array column in pq.Array
    err := rows.Scan(&_temp, &_temp0, &_temp1, &_temp2, pq.Array(&_temp3))
}
In detail: to insert a row that contains an array value, use the pq.Array function like this:
// "ins" is the SQL insert statement
ins := "INSERT INTO posts (title, tags) VALUES ($1, $2)"
// "tags" is the list of tags, as a string slice
tags := []string{"go", "goroutines", "queues"}
// the pq.Array function is the secret sauce
_, err = db.Exec(ins, "Job Queues in Go", pq.Array(tags))
To read a Postgres array value into a Go slice, use:
func getTags(db *sql.DB, title string) (tags []string) {
    // the select query, returning 1 column of array type
    sel := "SELECT tags FROM posts WHERE title=$1"

    // wrap the output parameter in pq.Array for receiving into it
    if err := db.QueryRow(sel, title).Scan(pq.Array(&tags)); err != nil {
        log.Fatal(err)
    }
    return
}
Note that in lib/pq, only slices of certain Go types may be passed to pq.Array().
Another example, in which a varchar array is generated at runtime by pgSQL as the 5th column, like:
--> predefined_allow false admin iam.create {secrets,configMap}
I converted this as:
Q := "SELECT ar.policy_name, ar.allow, ar.role_name, pro.operation_name, ARRAY_AGG(pro.resource_id) as resources FROM iam.authorization_rules ar LEFT JOIN iam.policy_rules_by_operation pro ON pro.id = ar.operation_id GROUP BY ar.policy_name, ar.allow, ar.role_name, pro.operation_name;"
tx := g.db.Raw(Q)
rows, _ := tx.Rows()
defer rows.Close()
var _temp string
var _temp0 bool
var _temp1 string
var _temp2 string
var _temp3 []string
for rows.Next() {
// ScanRows scan a row into temp_tbl
err := rows.Scan(&_temp, &_temp0, &_temp1, &_temp2, pq.Array(&_temp3))
if err != nil {
return nil, err
}
fmt.Println("Query Executed...........\n", _temp, _temp0, _temp1, _temp2, _temp3)
}
Output :
Query Executed...........
predefined_allow false admin iam.create [secrets configMap]