How can I ignore an embedded struct field when inserting into the database? - postgresql

I have 2 database tables:
A has 3 columns: X, Y, Z
B has 2 columns: X, W
My Go structs look like this:
type Base struct {
X int
Y int
}
type A struct {
Base
Z int
}
type B struct {
Base
W int
}
And I initialize my structs like this:
a := A{Base: Base{X: 1, Y:2}, Z: 3}
b := B{Base: Base{X: 1}, W: 4}
When I want to insert these into the database using the gorm.io ORM, "a" is inserted without any problem, but "b" can't be inserted because PostgreSQL gives me an error like:
pq: column "y" of relation "B" does not exist
How can I insert "b" into the database without creating another base model that doesn't have the "Y" field?

When you embed a struct in another struct and create an instance, every field is initialized to its type's zero value; for example, the zero value of int is 0.
So you have two solutions to this question.
Create two different structs (without the Base struct), just A and B, like this (maybe you already know this solution):
type A struct {
X int
Y int
Z int
}
type B struct {
X int
W int
}
Use struct tags to prevent GORM from writing the field. Note: a `json` tag such as `json:"y,omitempty"` only affects JSON encoding and has no effect on GORM; the tag that makes GORM ignore a field is `gorm:"-"` (be aware this hides Y from every model that embeds Base):
type Base struct {
X int
Y int `gorm:"-"`
}
type A struct {
Base
Z int
}
type B struct {
Base
W int
}

I think you are not declaring the models correctly. As per the documentation, Base should carry the embedded tag inside both A and B; then it is OK. For your better understanding I am posting your code with modifications. Please note that I have tested this on my machine and it works like a charm.
Here is the model declaration:
type Base struct {
X int `json:"x"`
Y int `json:"y"`
}
type A struct {
Base `gorm:"embedded"`
Z int `json:"z"`
}
type B struct {
Base `gorm:"embedded"`
W int `json:"w"`
}
Here is the main function for your understanding:
func main() {
db, err := gorm.Open("postgres", "host=localhost port=5432 user=postgres dbname=testd password=postgres sslmode=disable")
if err != nil {
log.Panic("error occurred: ", err)
}
db.AutoMigrate(&A{}, &B{})
a := &A{Base: Base{X: 1, Y: 2}, Z: 3}
b := &B{Base: Base{X: 1}, W: 3}
if err := db.Create(a).Error; err != nil {
log.Println(err)
}
if err := db.Create(b).Error; err != nil {
log.Println(err)
}
fmt.Println("everything is ok")
defer db.Close()
}
Please be sure to check the documentation on model declaration and tags:
gorm model declaration
Note: the Y column will still exist in table B, but it will hold the zero value.

package main
import (
"encoding/json"
"fmt"
"reflect"
)
type SearchResult struct {
OrderNumber string `json:"orderNumber"`
Qunatity interface{} `json:"qty"`
Price string `json:"price"`
OrderType interface{} `json:"type"`
ItemQty string `json:"itemQty"`
}
type Or []SearchResult
func fieldSet(fields ...string) map[string]bool {
set := make(map[string]bool, len(fields))
for _, s := range fields {
set[s] = true
}
return set
}
func (s *SearchResult) SelectFields(fields ...string) map[string]interface{} {
fs := fieldSet(fields...)
rt, rv := reflect.TypeOf(*s), reflect.ValueOf(*s)
out := make(map[string]interface{}, rt.NumField())
for i := 0; i < rt.NumField(); i++ {
field := rt.Field(i)
jsonKey := field.Tag.Get("json")
if fs[jsonKey] {
out[jsonKey] = rv.Field(i).Interface()
}
}
return out
}
func main() {
// illustrative values; the original snippet initialized fields
// (Date, Industry, IdCity, City) that don't exist on SearchResult
result := &SearchResult{
OrderNumber: "ON-1001",
Qunatity: 5,
Price: "9.99",
OrderType: "online",
ItemQty: "5",
}
b, err := json.MarshalIndent(result.SelectFields("orderNumber", "qty"), "", " ")
if err != nil {
panic(err.Error())
}
var or Or
fmt.Print(string(b))
or=append(or,*result)
or=append(or,*result)
for i := 0; i < len(or); i++ {
c, err := json.MarshalIndent(or[i].SelectFields("price", "type", "itemQty"),
"", " ")
if err != nil {
panic(err.Error())
}
fmt.Print(string(c))
}
}
One way of doing this is using omitempty fields in the struct; the other is to traverse the struct fields with reflection, which is expensive.
If performance doesn't matter to you, you can take the above code snippet as a reference.

Related

How to insert a map[string] into a jsonb field

I have a struct Huygens:
type Huygens struct {
Worlds map[string]World `json:"worlds" sql:"type:JSONB"`
}
type World struct {
Diameter int
Name string
}
When I try to insert into a Postgres DB using GORM I get:
sql: converting argument $1 type: unsupported type map[string]huygens.world, a map
To insert I simply use db.Create(&World{})
Does anyone know how to insert that column as a JSONB column and avoid that error?
Thanks
You need to embed gorm.Model in Huygens and dump your World struct to a []byte type.
import (
"encoding/json"
"fmt"
"gorm.io/datatypes"
"gorm.io/gorm"
)
type Huygens struct {
gorm.Model
Worlds datatypes.JSON
}
type World struct {
Diameter int
Name string
}
world := World{Diameter: 3, Name: "Earth"}
b, err := json.Marshal(world)
if err != nil {
fmt.Println("error:", err)
}
db.Create(&Huygens{Worlds: datatypes.JSON(b)})
You can see more here.
Another way, which makes the model more usable, is to create a type that implements Scan/Value, and in those Unmarshal from/Marshal to JSON:
type Huygens struct {
Worlds WorldsMap `json:"worlds"`
}
type World struct {
Diameter int
Name string
}
type WorldsMap map[string]World
func (WorldsMap) GormDataType() string {
return "JSONB"
}
func (w *WorldsMap) Scan(value interface{}) error {
var bytes []byte
switch v := value.(type) {
case []byte:
bytes = v
case string:
bytes = []byte(v)
default:
return errors.New(fmt.Sprint("Failed to unmarshal JSONB value:", value))
}
err := json.Unmarshal(bytes, w)
return err
}
func (w WorldsMap) Value() (driver.Value, error) {
bytes, err := json.Marshal(w)
return string(bytes), err
}
Then you can use the Worlds field in your code like a map, as opposed to an awkward byte slice.

go scan Postgres array_agg

I have a one-to-many relationship in postgres (Event has many EventUser), and would like to scan and store into a struct Event.
// EventUser struct
type EventUser struct {
ID int64
CheckedIn bool
PaidAmount float32
}
// Event struct
type Event struct {
ID int64 `json:"id"`
Name string `json:"name"`
StartTime string `json:"startTime"`
EventUsers string
}
Here is the query:
SELECT events.id, events.name, events."startTime", e."eventUsers" as "eventUsers"
FROM "Events" as events
LEFT JOIN (
SELECT events.id as id, array_to_json(array_agg(eu.*)) as "eventUsers"
FROM "EventUsers" as eu
JOIN "Events" AS "events" ON events.id = eu."eventId"
WHERE eu.status = 'RESERVED'
GROUP BY events.id
) AS e USING (id)
WHERE events.status = 'COMPLETED'
The query returns this:
{
id: 2,
name: "2 Events are 48 days from now",
startTime: 1590471343345,
eventUsers: [
{
id: 2,
checkedIn: false,
paidAmount: 8
},
{
id: 3,
checkedIn: false,
paidAmount: 8,
},
],
};
This is what I am trying to do: directly scan each item under eventUsers into a struct, and store it in the Event struct.
got := []Event{}
for rows.Next() {
var r Event
err = rows.Scan(&r.ID, &r.Name, &r.StartTime, &r.EventUsers)
if err != nil {
panic(err)
}
}
I think I could achieve this by storing the array as a string and then unmarshal it.
What I want is something similar to sql.NullString.
You could define a slice type that implements the sql.Scanner interface. Note that when you pass an instance of a type that implements Scanner to a (*sql.Rows).Scan or (*sql.Row).Scan call, the implementer's Scan method will be invoked automatically.
type EventUserList []*EventUser
func (list *EventUserList) Scan(src interface{}) error {
if data, ok := src.([]byte); ok && len(data) > 0 {
if err := json.Unmarshal(data, list); err != nil {
return err
}
}
return nil
}
Then, assuming that the e."eventUsers" in the select query is a json array, you could use it like this:
// EventUser struct
type EventUser struct {
ID int64
CheckedIn bool
PaidAmount float32
}
// Event struct
type Event struct {
ID int64 `json:"id"`
Name string `json:"name"`
StartTime string `json:"startTime"`
EventUsers EventUserList `json:"eventUsers"`
}
// ...
var events []*Event
for rows.Next() {
e := new(Event)
if err := rows.Scan(&e.ID, &e.Name, &e.StartTime, &e.EventUsers); err != nil {
return err
}
events = append(events, e)
}

How to marshal/unmarshal bson array with polymorphic struct with mongo-go-driver

I'm struggling to marshal/unmarshal a BSON array of polymorphic structs with mongo-go-driver. I plan to save a struct discriminator into the marshaled data and write a custom UnmarshalBSONValue function to decode it according to the discriminator, but I don't know how to do it correctly.
package polymorphism
import (
"fmt"
"testing"
"code.byted.org/gopkg/pkg/testing/assert"
"go.mongodb.org/mongo-driver/bson"
"go.mongodb.org/mongo-driver/bson/bsontype"
)
type INode interface {
GetName() string
}
type TypedNode struct {
NodeClass string
}
type Node struct {
TypedNode `bson:"inline"`
Name string
Children INodeList
}
func (n *Node) GetName() string {
return n.Name
}
type INodeList []INode
func (l *INodeList) UnmarshalBSONValue(t bsontype.Type, data []byte) error {
fmt.Println("INodeList.UnmarshalBSONValue")
var arr []bson.Raw // 1. First, try to decode data as []bson.Raw
err := bson.Unmarshal(data, &arr) // error: cannot decode document into []bson.Raw
if err != nil {
fmt.Println(err)
return err
}
for _, item := range arr { // 2. Then, try to decode each bson.Raw as concrete Node according to `nodeclass`
class := item.Lookup("nodeclass").StringValue()
fmt.Printf("class: %v\n", class)
if class == "SubNode1" {
bin, err := bson.Marshal(item)
if err != nil {
return err
}
var sub1 SubNode1
err = bson.Unmarshal(bin, &sub1)
if err != nil {
return err
}
*l = append(*l, &sub1)
} else if class == "SubNode2" {
//...
}
}
return nil
}
type SubNode1 struct {
*Node `bson:"inline"`
FirstName string
LastName string
}
type SubNode2 struct {
*Node `bson:"inline"`
Extra string
}
With the code above, I'm trying to decode the INodeList data as []bson.Raw, then decode each bson.Raw as a concrete Node according to nodeclass. But it reports the error:
cannot decode document into []bson.Raw
at the line
err := bson.Unmarshal(data, &arr)
So, how do I do it correctly?
You need to pass a pointer to a bson.Raw to bson.Unmarshal(data, &raw) and then slice its value into an array of raw values, like:
func (l *INodeList) UnmarshalBSONValue(t bsontype.Type, data []byte) error {
fmt.Println("INodeList.UnmarshalBSONValue")
var raw bson.Raw // 1. First, try to decode data as bson.Raw
err := bson.Unmarshal(data, &raw)
if err != nil {
fmt.Println(err)
return err
}
// Slice the raw document to an array of valid raw values
rawNodes, err := raw.Values()
if err != nil {
return err
}
// 2. Then, try to decode each bson.Raw as concrete Node according to `nodeclass`
for _, rawNode := range rawNodes {
// Convert the raw node to a raw document in order to access its "nodeclass" field
d, ok := rawNode.DocumentOK()
if !ok {
return fmt.Errorf("raw node can't be converted to doc")
}
class := d.Lookup("nodeclass").StringValue()
// Decode the node's raw doc to the corresponding struct
var node INode
switch class {
case "SubNode1":
node = &SubNode1{}
case "SubNode2":
node = &SubNode2{}
//...
default:
// ...
}
if err := bson.Unmarshal(d, node); err != nil {
return err
}
*l = append(*l, node)
}
return nil
}
Note that Node.Children must be a pointer to INodeList, and the inline fields must be a struct or a map (not a pointer):
type Node struct {
TypedNode `bson:",inline"`
Name string
Children *INodeList
}
type SubNode1 struct {
Node `bson:",inline"`
FirstName string
LastName string
}
type SubNode2 struct {
Node `bson:",inline"`
Extra string
}

How to convert pgx.Rows from Query() to json array?

I'm using github.com/jackc/pgx to work with PostgreSQL.
Now I want to convert the pgx.Rows returned by Query() to a JSON array.
I tried a func written for *sql.Rows, but it doesn't work for *pgx.Rows:
func PgSqlRowsToJson(rows *pgx.Rows) []byte {
fieldDescriptions := rows.FieldDescriptions()
var columns []string
for _, col := range fieldDescriptions {
columns = append(columns, col.Name)
}
count := len(columns)
tableData := make([]map[string]interface{}, 0)
values := make([]interface{}, count)
valuePtrs := make([]interface{}, count)
for rows.Next() {
for i := 0; i < count; i++ {
valuePtrs[i] = &values[i]
}
rows.Scan(valuePtrs...)
entry := make(map[string]interface{})
for i, col := range columns {
var v interface{}
val := values[i]
b, ok := val.([]byte)
if ok {
v = string(b)
} else {
v = val
}
entry[col] = v
}
tableData = append(tableData, entry)
}
jsonData, _ := json.Marshal(tableData)
return jsonData
}
The problem is that Scan() doesn't work with interface{}; it works only with explicitly typed destinations.
Can you help me fix it?
You can use the pgx.FieldDescription's Type method to retrieve a column's expected type. Passing that to reflect.New you can then allocate a pointer to a value of that type, and with these newly allocated values you can then make a slice of non-nil interface{}s whose underlying values have the expected type.
For example:
func PgSqlRowsToJson(rows *pgx.Rows) []byte {
fieldDescriptions := rows.FieldDescriptions()
var columns []string
for _, col := range fieldDescriptions {
columns = append(columns, col.Name)
}
count := len(columns)
tableData := make([]map[string]interface{}, 0)
valuePtrs := make([]interface{}, count)
for rows.Next() {
for i := 0; i < count; i++ {
valuePtrs[i] = reflect.New(fieldDescriptions[i].Type()).Interface() // allocate pointer to type
}
rows.Scan(valuePtrs...)
entry := make(map[string]interface{})
for i, col := range columns {
var v interface{}
val := reflect.ValueOf(valuePtrs[i]).Elem().Interface() // dereference pointer
b, ok := val.([]byte)
if ok {
v = string(b)
} else {
v = val
}
entry[col] = v
}
tableData = append(tableData, entry)
}
jsonData, _ := json.Marshal(tableData)
return jsonData
}

Convert type from struct table to base.FixedDataGrid in GO

I'm having trouble converting my struct table to a base.FixedDataGrid; my data needs to be a FixedDataGrid so that I can use the machine learning methods from the GoLearn lib.
My struct is like this:
type dataStruct struct{
Sepal_length string
Sepal_width string
Petal_length string
Petal_width string
Species string
}
So when I get my data from my mongo db, I get it like this:
var results []dataStruct
err := col.Find(nil).All(&results)
Is there a way to convert my "results" from []dataStruct to base.FixedDataGrid?
CreateModel function:
func CreateModel(c echo.Context) error {
fmt.Println("====> Entry CreateModel function");
//var results []dataStruct
var Success bool = false
Db := db.MgoDb{}
Db.Init()
defer Db.Close()
col := Db.C(db.TrainingDataCollection)
var results dataStruct
if err := col.Find(nil).All(results); err != nil {
fmt.Println("ERROR WHILE GETTING THE TRAINING DATA")
} else {
//fmt.Println("Results All: ", results)
Success = true
}
fmt.Println("=============",results)
//Initialises a new KNN classifier
cls := knn.NewKnnClassifier("euclidean", "linear", 2)
//Do a training-test split
trainData, testData := base.InstancesTrainTestSplit(results, 0.55)
cls.Fit(trainData)
//Calculates the Euclidean distance and returns the most popular label
predictions, err := cls.Predict(testData)
if err != nil {
panic(err)
}
fmt.Println(predictions)
// Prints precision/recall metrics
confusionMat, err := evaluation.GetConfusionMatrix(testData, predictions)
if err != nil {
panic(fmt.Sprintf("Unable to get confusion matrix: %s", err.Error()))
}
fmt.Println(evaluation.GetSummary(confusionMat))
return c.JSON(http.StatusOK, Success)
}
Thank you in advance for your help !
Here is how I solved the issue: there is a function base.InstancesFromMat64(row, col, matrix) that creates instances from a float64 matrix, and this is what I used:
func CreateModel(c echo.Context) error {
fmt.Println("====> Entry CreateModel function");
var Success bool = false
Db := db.MgoDb{}
Db.Init()
defer Db.Close()
col := Db.C(db.TrainingDataCollection)
var results []dataStruct
if err := col.Find(nil).All(&results); err != nil {
fmt.Println("ERROR WHILE GETTING THE TRAINING DATA")
} else {
Success = true
}
nbAttrs := 5 // 4 feature columns + the class column
row := len(results)
Data := make([]float64, row*nbAttrs)
/**** Filling the Data var with my dataset data *****/
mat := mat64.NewDense(row,nbAttrs,Data)
inst := base.InstancesFromMat64(row,nbAttrs,mat)
//Selecting the class attribute for our instance
attrs := inst.AllAttributes()
inst.AddClassAttribute(attrs[4])
//Initialise a new KNN classifier
cls := knn.NewKnnClassifier("manhattan","linear",3)
//Training-tessting split
trainData, testData := base.InstancesTrainTestSplit(inst,0.7)
/******* Continue the Model creation ******/
}
I'll be glad if my answer helps someone.
Thanks a lot @mkopriva for your help!
base.FixedDataGrid is an interface, so what you need to do is to implement that interface, that is, implement all of its methods, on the type you want to use as FixedDataGrid.
Since you want to use []dataStruct, a slice of dataStructs, which is an unnamed type, as a FixedDataGrid, you will have to declare a new type, because you can add methods only to named types. For example:
type dataStructList []dataStruct
Now, if you take a look at the documentation, you can see that the FixedDataGrid interface declares two methods RowString and Size but also embeds another interface, the base.DataGrid interface, which means you need to implement the methods declared by DataGrid as well. So, given your new dataStructList type, you can do something like this:
func (l dataStructList) RowString(int) string { /* ... */ }
func (l dataStructList) Size() (int, int) { /* ... */ }
func (l dataStructList) GetAttribute(base.Attribute) (base.AttributeSpec, error) { /* ... */ }
func (l dataStructList) AllAttributes() []base.Attribute { /* ... */ }
func (l dataStructList) AddClassAttribute(base.Attribute) error { /* ... */ }
func (l dataStructList) RemoveClassAttribute(base.Attribute) error { /* ... */ }
func (l dataStructList) AllClassAttributes() []base.Attribute { /* ... */ }
func (l dataStructList) Get(base.AttributeSpec, int) []byte { /* ... */ }
func (l dataStructList) MapOverRows([]base.AttributeSpec, func([][]byte, int) (bool, error)) error { /* ... */ }
After you've implemented the /* ... */ parts you can then start using dataStructList as a FixedDataGrid, so something like this:
var results []dataStruct
err := col.Find(nil).All(&results)
fdg := dataStructList(results) // you can use fdg as FixedDataGrid
Or
var results dataStructList // you can use results as FixedDataGrid
err := col.Find(nil).All(&results)
Update:
After you've implemented all of those methods on the dataStructList all you need is the type of the results variable inside your function:
func CreateModel(c echo.Context) error {
fmt.Println("====> Entry CreateModel function")
//var results []dataStruct
var Success bool = false
Db := db.MgoDb{}
Db.Init()
defer Db.Close()
col := Db.C(db.TrainingDataCollection)
var results dataStructList // <--- use the type that implements the interface
if err := col.Find(nil).All(&results); err != nil { // <-- pass a pointer to results
fmt.Println("ERROR WHILE GETTING THE TRAINING DATA")
} else {
//fmt.Println("Results All: ", results)
Success = true
}
fmt.Println("=============", results)
//Initialises a new KNN classifier
cls := knn.NewKnnClassifier("euclidean", "linear", 2)
//Do a training-test split
trainData, testData := base.InstancesTrainTestSplit(results, 0.55) // <-- this works because results is of type dataStructList, which implements the base.FixedDataGrid interface
cls.Fit(trainData)
//Calculates the Euclidean distance and returns the most popular label
predictions, err := cls.Predict(testData)
if err != nil {
panic(err)
}
fmt.Println(predictions)
// Prints precision/recall metrics
confusionMat, err := evaluation.GetConfusionMatrix(testData, predictions)
if err != nil {
panic(fmt.Sprintf("Unable to get confusion matrix: %s", err.Error()))
}
fmt.Println(evaluation.GetSummary(confusionMat))
return c.JSON(http.StatusOK, Success)
}