How to convert pgx.Rows from Query() to json array? - postgresql

I'm using github.com/jackc/pgx to work with PostgreSQL.
Now I want to convert the pgx.Rows returned by Query() to a JSON array.
I tried a function written for *sql.Rows, but it doesn't work for *pgx.Rows:
func PgSqlRowsToJson(rows *pgx.Rows) []byte {
    fieldDescriptions := rows.FieldDescriptions()
    var columns []string
    for _, col := range fieldDescriptions {
        columns = append(columns, col.Name)
    }

    count := len(columns)
    tableData := make([]map[string]interface{}, 0)
    values := make([]interface{}, count)
    valuePtrs := make([]interface{}, count)

    for rows.Next() {
        for i := 0; i < count; i++ {
            valuePtrs[i] = &values[i]
        }
        rows.Scan(valuePtrs...)
        entry := make(map[string]interface{})
        for i, col := range columns {
            var v interface{}
            val := values[i]
            b, ok := val.([]byte)
            if ok {
                v = string(b)
            } else {
                v = val
            }
            entry[col] = v
        }
        tableData = append(tableData, entry)
    }

    jsonData, _ := json.Marshal(tableData)
    return jsonData
}
The problem is that Scan() doesn't work with plain interface{} destinations; it only works with explicitly typed ones.
Can you help me fix this?

You can use the pgx.FieldDescription's Type method to retrieve a column's expected type. Passing that to reflect.New, you can allocate a pointer to a value of that type, and with these newly allocated values you can build a slice of non-nil interface{}s whose underlying values have the expected types.
For example:
func PgSqlRowsToJson(rows *pgx.Rows) []byte {
    fieldDescriptions := rows.FieldDescriptions()
    var columns []string
    for _, col := range fieldDescriptions {
        columns = append(columns, col.Name)
    }

    count := len(columns)
    tableData := make([]map[string]interface{}, 0)
    valuePtrs := make([]interface{}, count)

    for rows.Next() {
        for i := 0; i < count; i++ {
            valuePtrs[i] = reflect.New(fieldDescriptions[i].Type()).Interface() // allocate pointer to a value of the column's type
        }
        rows.Scan(valuePtrs...)
        entry := make(map[string]interface{})
        for i, col := range columns {
            var v interface{}
            val := reflect.ValueOf(valuePtrs[i]).Elem().Interface() // dereference the pointer
            b, ok := val.([]byte)
            if ok {
                v = string(b)
            } else {
                v = val
            }
            entry[col] = v
        }
        tableData = append(tableData, entry)
    }

    jsonData, _ := json.Marshal(tableData)
    return jsonData
}
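For completeness, here is a minimal usage sketch (assuming the pgx v3 API, where ConnPool.Query returns a *pgx.Rows, and a hypothetical users table):
rows, err := pool.Query("SELECT id, name FROM users") // pool is a *pgx.ConnPool
if err != nil {
    log.Fatal(err)
}
defer rows.Close()

jsonData := PgSqlRowsToJson(rows)
fmt.Println(string(jsonData)) // e.g. [{"id":1,"name":"alice"},{"id":2,"name":"bob"}]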

Related

How to find maximum value of a field in mongodb golang?

This is my code. I always get a maximum value of 999, even when there are more blocks (e.g. 3000 blocks).
This is what the document(s) look like.
func GetLatestBlockFromMongoDB() int {
    if contains(ReturnBlocksExists(), "blocks") == true {
        ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
        var blockheights []models.Block
        defer cancel()

        options := options.Find().SetProjection(bson.M{"block.header.blockNumber": 1})
        options.SetSort(bson.D{{"block.header.blockNumber", -1}})
        options.SetLimit(1)

        results, err := blocksCollections.Find(context.TODO(), bson.D{}, options)
        if err != nil {
            fmt.Println(err)
        }
        //reading from the db in an optimal way
        defer results.Close(ctx)
        for results.Next(ctx) {
            var singleBlock models.Block
            if err = results.Decode(&singleBlock); err != nil {
                fmt.Println(err)
            }
            blockheights = append(blockheights, singleBlock)
        }

        fmt.Println("%v", blockheights[0].Block.Header.BlockNumber)
        if len(strings.TrimSpace(blockheights[0].Block.Header.BlockNumber)) > 0 {
            i, err := strconv.Atoi(blockheights[0].Block.Header.BlockNumber)
            if err != nil {
                glog.Fatalf("%v", err)
            }
            return i
        } else {
            return 0
        }
    } else {
        return 0
    }
}
How can I get the maximum value of blockNumber? I think the problem might be that blockNumber is a string rather than an integer, but I'm not sure. I did set the sort and a limit of 1, so it should normally work.
Yes, you are right: from the image, I can see that the blockNumber field is a string, so you are not comparing integers, you are comparing strings, where "999" is greater than "3000":
For example:
package main

import (
    "fmt"
)

func findMax(a []string) string {
    m := a[0]
    for _, value := range a {
        if value > m {
            m = value
        }
    }
    return m
}

func main() {
    blocks := []string{"999", "2000", "3000"}
    ma := findMax(blocks)
    fmt.Println(ma)
}
$ go run .
999
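If the blockNumber values must remain strings in the documents, one workaround is to convert them to integers before comparing (storing blockNumber as an integer in the first place would also let your existing sort/limit-1 query return the right document). A minimal sketch of the client-side comparison:
package main

import (
    "fmt"
    "strconv"
)

// findMaxNumeric parses each string as an int before comparing,
// so "3000" correctly beats "999".
func findMaxNumeric(a []string) (int, error) {
    maxVal := 0
    for _, s := range a {
        n, err := strconv.Atoi(s)
        if err != nil {
            return 0, err
        }
        if n > maxVal {
            maxVal = n
        }
    }
    return maxVal, nil
}

func main() {
    blocks := []string{"999", "2000", "3000"}
    maxVal, err := findMaxNumeric(blocks)
    if err != nil {
        panic(err)
    }
    fmt.Println(maxVal)
}
$ go run .
3000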
Check out this question too.

pgx preparedStatement execution does not return results (unlike fmt.Sprintf)

For some reason, the prepared statement I built in Go using pgx is not returning any results/rows (see scenario 1). In scenario 2, if I build the same query using Sprintf with $ placeholders, the results are also not returned.
But, as shown in scenario 3, if I build the query with Sprintf and %s placeholders and execute it, it does return results/rows.
I would like to build/execute the query using the "logic" of scenario 1 (with a prepared statement instead of Sprintf). Any ideas?
I tested the query directly on the DB (with pgAdmin) and it has data/results.
Assumption:
I suspect it might have something to do with:
in the WHERE condition: the uuid data type or with the string in the IN condition
the query result that returns a uuid and a jsonb
Details:
The DB is postgres and the table t has, among others, the field types:
t.str1 and t.properties are string
t.my_uuid is uuid.UUID (https://github.com/satori/go.uuid)
t.C is jsonb
the pgx version I'm using is: github.com/jackc/pgx v3.6.2+incompatible
scenario 1
func prepareQuery(cp *pgx.ConnPool) (string, error) {
    prep := `SELECT DISTINCT
        A,B,C
    FROM table t
    WHERE t.str1 = $1
        AND t.properties IN ($2)
        AND t.my_uuid = $3`
    _, err := cp.Prepare("queryPrep", prep)
    if err != nil {
        return "some error", err
    }
    return prep, nil
}

func x() {
    //...
    q, err := prepareQuery(cPool)
    if err != nil {
        return fmt.Errorf("failed to prepare query %s: %w", q, err)
    }
    rows, err := cPool.Query("queryPrep", "DumyStr", "'prop1','prop2'", myUUID) //myUUID is a valid satori uuid
    if err != nil {
        //return error...
    }
    for rows.Next() { // zero rows
        //Scan...
    }
    //...
}
scenario 2
func makeQueryStr() string {
    return fmt.Sprintf(`SELECT DISTINCT
        A,B,C
    FROM table t
    WHERE t.str1 = $1
        AND t.properties IN ($2)
        AND t.my_uuid = $3`)
}

func x() {
    //...
    myQuery := makeQueryStr()
    rows, err := cPool.Query(myQuery, "DumyStr", "'prop1','prop2'", myUUID) //myUUID is a valid satori uuid
    if err != nil {
        //return error...
    }
    for rows.Next() { // zero rows
        //Scan...
    }
    //...
}
scenario 3
func makeQueryStr(par1 string, par2 string, par3 uuid.UUID) string {
    return fmt.Sprintf(`SELECT DISTINCT
        A,B,C
    FROM table t
    WHERE t.str1 = '%s'
        AND t.properties IN (%s)
        AND t.my_uuid = '%s'`, par1, par2, par3)
}

func x() {
    //...
    myQuery := makeQueryStr("DumyStr", "'prop1','prop2'", myUUID) //myUUID is a valid satori uuid
    rows, err := cPool.Query(myQuery)
    if err != nil {
        //return error...
    }
    for rows.Next() { // 10 rows
        //Scan...
    }
    //...
}
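No accepted answer is recorded here, but note that in scenarios 1 and 2 the $2 parameter is bound as the single literal string 'prop1','prop2', so IN ($2) only matches rows whose properties column equals that exact string. One possible rewrite (a sketch, assuming pgx v3 can encode a []string argument as a PostgreSQL text[] array) is to compare with = ANY instead:
// Sketch only: bind the property list as a real array parameter and
// compare with = ANY instead of IN.
prep := `SELECT DISTINCT
    A,B,C
FROM table t
WHERE t.str1 = $1
    AND t.properties = ANY($2)
    AND t.my_uuid = $3`

if _, err := cPool.Prepare("queryPrep", prep); err != nil {
    // handle error...
}

// Each property is now its own array element rather than part of one
// comma-joined literal string.
rows, err := cPool.Query("queryPrep", "DumyStr", []string{"prop1", "prop2"}, myUUID)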

Converting a struct to a bson document

I have a struct that looks like this:
type User struct {
    UserID  string `bson:"user_id"`
    Name    string `bson:"name"`
    Address string `bson:"address"`
}
I am using mongo's UpdateOne to only update specific fields in a document. Doing this allows me to only update the name where the user_id is 1234:
filter := bson.D{{"user_id", "1234"}}
update := bson.D{{"$set",
    bson.D{
        {"name", "john"},
    },
}}
myCollection.UpdateOne(context.Background(), filter, update)
However, I want to use a struct instead to replace whatever is in the update variable.
So I want to be able to use:
update := User{Name: "john"}
How do I convert this to a bson document like in the working example?
Most responses use bson.Marshal() followed by bson.Unmarshal(), but this approach is slower.
Imagine that you are on a server handling thousands of database accesses per second; such code can slow down server operation depending on its quality.
For this reason, I always run benchmarks before choosing which method to use.
An example is shown below:
package bsontest

import (
    "errors"
    "fmt"
    "go.mongodb.org/mongo-driver/bson"
    "reflect"
    "testing"
)

type User struct {
    UniqueID string `bson:"_id"`
    Name     string `bson:"name"`
    Password string `bson:"password"`
    Email    string `bson:"email"`
    Age      int    `bson:"age"`
    Admin    bool   `bson:"admin"`
}

func (e *User) FromBson(data bson.M) (err error) {
    var tagValue string
    element := reflect.ValueOf(e).Elem()
    for i := 0; i < element.NumField(); i += 1 {
        typeField := element.Type().Field(i)
        tag := typeField.Tag
        tagValue = tag.Get("bson")
        if tagValue == "-" {
            continue
        }
        switch element.Field(i).Kind() {
        case reflect.String:
            switch data[tagValue].(type) {
            case string:
                element.Field(i).SetString(data[tagValue].(string))
            default:
                err = errors.New(tagValue + " must be a string")
                return
            }
        case reflect.Bool:
            switch data[tagValue].(type) {
            case bool:
                element.Field(i).SetBool(data[tagValue].(bool))
            default:
                err = errors.New(tagValue + " must be a boolean")
                return
            }
        case reflect.Int:
            switch data[tagValue].(type) {
            case int:
                element.Field(i).SetInt(int64(data[tagValue].(int)))
            case int64:
                element.Field(i).SetInt(data[tagValue].(int64))
            default:
                err = errors.New(tagValue + " must be an integer")
                return
            }
        }
    }
    return
}

func (e *User) ToBson() (data bson.M) {
    var tagValue string
    data = bson.M{}
    element := reflect.ValueOf(e).Elem()
    for i := 0; i < element.NumField(); i += 1 {
        typeField := element.Type().Field(i)
        tag := typeField.Tag
        tagValue = tag.Get("bson")
        if tagValue == "-" {
            continue
        }
        switch element.Field(i).Kind() {
        case reflect.String:
            value := element.Field(i).String()
            data[tagValue] = value
        case reflect.Bool:
            value := element.Field(i).Bool()
            data[tagValue] = value
        case reflect.Int:
            value := element.Field(i).Int()
            data[tagValue] = value
        }
    }
    return
}

var user User

func init() {
    user = User{
        UniqueID: "12345-67890-ABCDE-FGHIJKL",
        Name:     "Fulano da Silva Sauro",
        Password: "pangea",
        Email:    "sauro#pangea.com",
        Admin:    true,
    }
}

func ExampleUser_ToBson() {
    var err error
    var userAsBSon = user.ToBson()
    fmt.Printf("%+v\n", userAsBSon)

    user = User{}
    fmt.Printf("%+v\n", user)

    err = user.FromBson(userAsBSon)
    if err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", user)

    // Output:
    // map[_id:12345-67890-ABCDE-FGHIJKL admin:true age:0 email:sauro#pangea.com name:Fulano da Silva Sauro password:pangea]
    // {UniqueID: Name: Password: Email: Age:0 Admin:false}
    // {UniqueID:12345-67890-ABCDE-FGHIJKL Name:Fulano da Silva Sauro Password:pangea Email:sauro#pangea.com Age:0 Admin:true}
}

func Benchmark_UsingMarshalAndUnmarshal(b *testing.B) {
    var err error
    var bsonAsByte []byte
    var bsonData bson.M
    for i := 0; i < b.N; i++ {
        bsonAsByte, err = bson.Marshal(&user)
        if err != nil {
            panic(err)
        }
        err = bson.Unmarshal(bsonAsByte, &bsonData)
        if err != nil {
            panic(err)
        }
    }
}

func Benchmark_UsingUnmarshalAndMarshal(b *testing.B) {
    var err error
    var bsonAsByte []byte
    var bsonData = bson.M{
        "_id":      "12345-67890-ABCDE-FGHIJKL",
        "admin":    true,
        "age":      0,
        "email":    "sauro#pangea.com",
        "name":     "Fulano da Silva Sauro",
        "password": "pangea",
    }
    for i := 0; i < b.N; i++ {
        bsonAsByte, err = bson.Marshal(&bsonData)
        if err != nil {
            panic(err)
        }
        err = bson.Unmarshal(bsonAsByte, &user)
        if err != nil {
            panic(err)
        }
    }
}

func Benchmark_UsingToBson(b *testing.B) {
    var err error
    var bsonData = bson.M{
        "_id":      "12345-67890-ABCDE-FGHIJKL",
        "admin":    true,
        "age":      0,
        "email":    "sauro#pangea.com",
        "name":     "Fulano da Silva Sauro",
        "password": "pangea",
    }
    for i := 0; i < b.N; i++ {
        err = user.FromBson(bsonData)
        if err != nil {
            panic(err)
        }
    }
}

func Benchmark_UsingFromBson(b *testing.B) {
    var bsonData bson.M
    for i := 0; i < b.N; i++ {
        bsonData = user.ToBson()
    }
    _ = bsonData
}
Times:
Benchmark_UsingMarshalAndUnmarshal
Benchmark_UsingMarshalAndUnmarshal-8 398800 2996 ns/op
Benchmark_UsingToBson
Benchmark_UsingToBson-8 1789176 643.0 ns/op
Benchmark_UsingFromBson
Benchmark_UsingFromBson-8 2128242 539.6 ns/op
Benchmark_UsingUnmarshalAndMarshal
Benchmark_UsingUnmarshalAndMarshal-8 474501 2524 ns/op
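For reference, the bson.Marshal followed by bson.Unmarshal round trip that these benchmarks compare against looks roughly like this (a sketch, assuming go.mongodb.org/mongo-driver/bson; note that it also includes zero-valued fields unless the struct tags use omitempty):
// toBsonM is a hypothetical helper: marshal the struct to BSON bytes,
// then unmarshal those bytes back into a bson.M document.
func toBsonM(v interface{}) (bson.M, error) {
    data, err := bson.Marshal(v)
    if err != nil {
        return nil, err
    }
    var doc bson.M
    if err := bson.Unmarshal(data, &doc); err != nil {
        return nil, err
    }
    return doc, nil
}

// usage with the original question's update:
// doc, _ := toBsonM(User{Name: "john"})
// update := bson.D{{"$set", doc}}
// myCollection.UpdateOne(context.Background(), filter, update)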

How can I ignore an embedded struct field while inserting into the database?

I have 2 database tables;
A has 3 columns and they are X, Y, Z
B has 2 columns and they are X, W
My Go structs are like this:
type Base struct {
    X int
    Y int
}

type A struct {
    Base
    Z int
}

type B struct {
    Base
    W int
}
And I initialize my structs like this:
a := A{Base: Base{X: 1, Y: 2}, Z: 3}
b := B{Base: Base{X: 1}, W: 4}
When I want to insert these into the database using the gorm.io ORM, "a" is inserted without any problem, but "b" can't be inserted because postgresql gives me an error something like
pq: column "y" of relation "B" does not exist
How can I insert "b" into the database without creating another base model that doesn't have the "Y" field?
When you embed one struct in another and create an instance of it, all struct fields are present and filled with their data type's zero value;
for example, the zero value of int is 0.
So you have 2 solutions for this question.
Create two different structs (without the Base struct), just A and B, like this (maybe you already know this solution):
type A struct {
    X int
    Y int
    Z int
}

type B struct {
    X int
    W int
}
Use struct tags to prevent the field from being inserted with gorm.
Note: I didn't test this.
type Base struct {
    X int
    Y int `json:"y,omitempty"`
}

type A struct {
    Base
    Z int
}

type B struct {
    Base
    W int
}
I think you are not declaring the models correctly. As per the documentation, 'Base' should have the embedded tag inside both A and B; then it is OK. For your better understanding I am posting your code with the modification. Please note that I have tested it on my machine and it works like a charm.
Here is the model declaration:
type Base struct {
    X int `json:"x"`
    Y int `json:"y"`
}

type A struct {
    Base `gorm:"embedded"`
    Z    int `json:"z"`
}

type B struct {
    Base `gorm:"embedded"`
    W    int `json:"w"`
}
And here is the main function for your understanding:
func main() {
    fmt.Println("vim-go")
    db, err := gorm.Open("postgres", "host=localhost port=5432 user=postgres dbname=testd password=postgres sslmode=disable")
    if err != nil {
        log.Panic("error occurred", err)
    }
    defer db.Close()

    db.AutoMigrate(&A{}, &B{})

    a := &A{Base: Base{X: 1, Y: 2}, Z: 3}
    b := &B{Base: Base{X: 1}, W: 3}

    if err := db.Create(a).Error; err != nil {
        log.Println(err)
    }
    if err := db.Create(b).Error; err != nil {
        log.Println(err)
    }
    fmt.Println("everything is ok")
}
Please be sure to check the documentation for model declaration and the tags:
gorm model declaration
Note: the Y field will still be there in the B table, but with a zero value.
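If writing a zero value for Y into table B is not acceptable, another option (a sketch, assuming the Omit method of jinzhu/gorm is available in the version being used) is to skip that column for this particular insert:
// Skip the y column when creating b; the other fields are inserted as usual.
if err := db.Omit("y").Create(b).Error; err != nil {
    log.Println(err)
}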
package main

import (
    "encoding/json"
    "fmt"
    "reflect"
)

type SearchResult struct {
    OrderNumber string      `json:"orderNumber"`
    Qunatity    interface{} `json:"qty"`
    Price       string      `json:"price"`
    OrderType   interface{} `json:"type"`
    ItemQty     string      `json:"itemQty"`
}

type Or []SearchResult

func fieldSet(fields ...string) map[string]bool {
    set := make(map[string]bool, len(fields))
    for _, s := range fields {
        set[s] = true
    }
    return set
}

// SelectFields returns a map holding only the fields whose json tags were
// requested; that map can then be marshalled on its own.
func (s *SearchResult) SelectFields(fields ...string) map[string]interface{} {
    fs := fieldSet(fields...)
    rt, rv := reflect.TypeOf(*s), reflect.ValueOf(*s)
    out := make(map[string]interface{}, rt.NumField())
    for i := 0; i < rt.NumField(); i++ {
        field := rt.Field(i)
        jsonKey := field.Tag.Get("json")
        if fs[jsonKey] {
            out[jsonKey] = rv.Field(i).Interface()
        }
    }
    return out
}

func main() {
    result := &SearchResult{
        OrderNumber: "A-1234",
        Qunatity:    2,
        Price:       "10.50",
        OrderType:   "online",
        ItemQty:     "2",
    }
    b, err := json.MarshalIndent(result.SelectFields("orderNumber", "qty"), "", "  ")
    if err != nil {
        panic(err.Error())
    }
    fmt.Print(string(b))

    var or Or
    or = append(or, *result)
    or = append(or, *result)
    for i := 0; i < len(or); i++ {
        c, err := json.MarshalIndent(or[i].SelectFields("price", "type", "itemQty"), "", "  ")
        if err != nil {
            panic(err.Error())
        }
        fmt.Print(string(c))
    }
}
One way of doing this is using omitempty fields in the struct; the other way is to traverse the struct fields with reflection, which is more expensive.
If performance doesn't matter to you, then you can take the above code snippet as a reference.

Convert type from struct table to base.FixedDataGrid in GO

I'm having trouble converting my struct table to a FixedDataGrid, because I need my data to be a FixedDataGrid so that I can use the machine learning methods from the GoLearn lib.
My struct is like this:
type dataStruct struct {
    Sepal_length string
    Sepal_width  string
    Petal_length string
    Petal_width  string
    Species      string
}
So when I get my data from my mongo db, I get it like this:
var results []dataStruct
err := col.Find(nil).All(&results)
Is there a way to convert my "results" from the []dataStruct type to base.FixedDataGrid?
CreateModel function:
func CreateModel(c echo.Context) error {
    fmt.Println("====> Entry CreateModel function")
    //var results []dataStruct
    var Success bool = false

    Db := db.MgoDb{}
    Db.Init()
    defer Db.Close()
    col := Db.C(db.TrainingDataCollection)

    var results dataStruct
    if err := col.Find(nil).All(results); err != nil {
        fmt.Println("ERROR WHILE GETTING THE TRAINING DATA")
    } else {
        //fmt.Println("Results All: ", results)
        Success = true
    }
    fmt.Println("=============", results)

    //Initialises a new KNN classifier
    cls := knn.NewKnnClassifier("euclidean", "linear", 2)

    //Do a training-test split
    trainData, testData := base.InstancesTrainTestSplit(results, 0.55)
    cls.Fit(trainData)

    //Calculates the Euclidean distance and returns the most popular label
    predictions, err := cls.Predict(testData)
    if err != nil {
        panic(err)
    }
    fmt.Println(predictions)

    // Prints precision/recall metrics
    confusionMat, err := evaluation.GetConfusionMatrix(testData, predictions)
    if err != nil {
        panic(fmt.Sprintf("Unable to get confusion matrix: %s", err.Error()))
    }
    fmt.Println(evaluation.GetSummary(confusionMat))

    return c.JSON(http.StatusOK, Success)
}
Thank you in advance for your help !
Here is how I solved the issue: there is a function InstancesFromMat64(row int, col int, matrix) that creates instances from a float64 matrix, and this is what I used:
func CreateModel(c echo.Context) error {
    fmt.Println("====> Entry CreateModel function")
    var Success bool = false

    Db := db.MgoDb{}
    Db.Init()
    defer Db.Close()
    col := Db.C(db.TrainingDataCollection)

    var results []dataStruct
    if err := col.Find(nil).All(&results); err != nil {
        fmt.Println("ERROR WHILE GETTING THE TRAINING DATA")
    } else {
        Success = true
    }

    Data := make([]float64, len(results)*nbAttrs)
    /**** Filling the Data var with my dataset data *****/

    mat := mat64.NewDense(row, nbAttrs, Data)
    inst := base.InstancesFromMat64(row, nbAttrs, mat)

    //Selecting the class attribute for our instance
    attrs := inst.AllAttributes()
    inst.AddClassAttribute(attrs[4])

    //Initialise a new KNN classifier
    cls := knn.NewKnnClassifier("manhattan", "linear", 3)

    //Training-testing split
    trainData, testData := base.InstancesTrainTestSplit(inst, 0.7)

    /******* Continue the Model creation ******/
I'll be glad if my answer helps someone.
Thanks a lot @mkopriva for your help!
base.FixedDataGrid is an interface, so what you need to do is to implement that interface, that is, implement all of its methods, on the type you want to use as FixedDataGrid.
Since you want to use []dataStruct, a slice of dataStructs, which is an unnamed type, as FixedDataGrid you will have to declare a new type to be able to add methods to it because you can add methods only to named types. For example something like this:
type dataStructList []dataStruct
Now, if you take a look at the documentation, you can see that the FixedDataGrid interface declares two methods RowString and Size but also embeds another interface, the base.DataGrid interface, which means you need to implement the methods declared by DataGrid as well. So, given your new dataStructList type, you can do something like this:
func (l dataStructList) RowString(int) string { /* ... */ }
func (l dataStructList) Size() (int, int) { /* ... */ }
func (l dataStructList) GetAttribute(base.Attribute) (base.AttributeSpec, error) { /* ... */ }
func (l dataStructList) AllAttributes() []base.Attribute { /* ... */ }
func (l dataStructList) AddClassAttribute(base.Attribute) error { /* ... */ }
func (l dataStructList) RemoveClassAttribute(base.Attribute) error { /* ... */ }
func (l dataStructList) AllClassAttributes() []base.Attribute { /* ... */ }
func (l dataStructList) Get(base.AttributeSpec, int) []byte { /* ... */ }
func (l dataStructList) MapOverRows([]base.AttributeSpec, func([][]byte, int) (bool, error)) error { /* ... */ }
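As an illustration of one of these stubs, Size might be sketched like this (an assumption about the contract, namely that it reports the number of attributes and the number of rows the grid holds):
// Size reports the attribute (column) count and the row count.
// dataStruct has 5 fields, and each slice element is one row.
func (l dataStructList) Size() (int, int) {
    return 5, len(l)
}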
After you've implemented the /* ... */ parts you can then start using dataStructList as a FixedDataGrid, so something like this:
var results []dataStruct
err := col.Find(nil).All(&results)
fdg := dataStructList(results) // you can use fdg as FixedDataGrid
Or
var results dataStructList // you can use results as FixedDataGrid
err := col.Find(nil).All(&results)
Update:
After you've implemented all of those methods on the dataStructList all you need is the type of the results variable inside your function:
func CreateModel(c echo.Context) error {
    fmt.Println("====> Entry CreateModel function")
    //var results []dataStruct
    var Success bool = false

    Db := db.MgoDb{}
    Db.Init()
    defer Db.Close()
    col := Db.C(db.TrainingDataCollection)

    var results dataStructList // <--- use the type that implements the interface
    if err := col.Find(nil).All(&results); err != nil { // <-- pass a pointer to results
        fmt.Println("ERROR WHILE GETTING THE TRAINING DATA")
    } else {
        //fmt.Println("Results All: ", results)
        Success = true
    }
    fmt.Println("=============", results)

    //Initialises a new KNN classifier
    cls := knn.NewKnnClassifier("euclidean", "linear", 2)

    //Do a training-test split
    trainData, testData := base.InstancesTrainTestSplit(results, 0.55) // <-- this works because results is of type dataStructList, which implements the base.FixedDataGrid interface
    cls.Fit(trainData)

    //Calculates the Euclidean distance and returns the most popular label
    predictions, err := cls.Predict(testData)
    if err != nil {
        panic(err)
    }
    fmt.Println(predictions)

    // Prints precision/recall metrics
    confusionMat, err := evaluation.GetConfusionMatrix(testData, predictions)
    if err != nil {
        panic(fmt.Sprintf("Unable to get confusion matrix: %s", err.Error()))
    }
    fmt.Println(evaluation.GetSummary(confusionMat))

    return c.JSON(http.StatusOK, Success)
}